How would you react if you learned that, apart from its superior mathematical and computational abilities, your computer could also measure your emotional state, determining whether you’re happy, anxious, or relaxed? Though it might sound like the plot of a novel set in some dystopian future, that technology is the foundation of Affectiva, a startup that spun out of a research project at M.I.T.’s Media Lab.
Since its founding, Affectiva has quickly, and quietly, ushered in an era in which businesses increasingly use sophisticated machine learning to better understand consumer tastes and preferences. While the notion of an emotionally intelligent computer is, at least on some level, unnerving, marketers prize these kinds of insights, which can lead to better-targeted advertisements, products, and programming. What's more, the technology could even pave the way for breakthroughs in autism research, even though Affectiva is not directly pursuing that field of study.
“We have made our technology available as a software developer kit (SDK) for both iOS and Android, so that developers can emotion-enable their apps and digital experience, making these more authentic, interactive, and engaging,” Affectiva co-founder Rana el Kaliouby tells Free Enterprise. “We’ve had a number of developers who are in the autism space reach out to us, and they’re using it to develop apps that are targeted at this population.”
The result of years of research by el Kaliouby, the company's program—one that can be deployed on any camera-equipped device—observes and analyzes a user's emotional response to a stimulus, typically a video. Affectiva also provides emotion analytics and insights to help businesses understand consumer emotional engagement.
From this visual data, Affectiva’s computer vision algorithms are able to isolate specific areas of the face and delineate among a broad range of emotional expressions. Drawing on that data, Affectiva’s software, Affdex, then generates a detailed report illustrating how a person’s emotional response evolved while engaging with digital content or a digital experience. Skeptical? You can demo Affectiva’s technology on their website.
This kind of breakthrough in machine learning would not have been possible without el Kaliouby, who arrived at M.I.T. by way of the University of Cambridge, where she earned her Ph.D. in computer science. For el Kaliouby, who had always been fascinated by the human face, the notion that a machine could read a person's emotions was particularly compelling.
When she was working toward her doctorate, el Kaliouby says, little emphasis was placed on emotional intelligence and how people express their emotions; most research focused instead on cognitive science and cognitive computing. Yet that has changed over the past 15 years, she points out.
“I think there’s a real understanding today that our emotions, our emotional intelligence, and how we express our emotions plays a very important part of who we are, of how we connect with other people online and digitally,” she says.
While at Cambridge, el Kaliouby began building a machine that could read emotions based on facial expressions. Born of this project was the idea that would eventually become Affectiva. “I was presenting, and someone in the audience mentioned autism,” she says.
“I didn’t even know how to actually spell the word autism; I had not heard of it. This was the early 2000s. I was intrigued, so I followed up. I got very interested in autism, and the challenges and problems that people on the autism spectrum face. I thought that a lot of the technologies we were talking about and trying to develop like emotional intelligence could be packaged as an assistive technology for people who are autistic or struggle with communication in general.”
That work led el Kaliouby to M.I.T., where she built more sophisticated algorithms that could recognize all kinds of facial expressions, regardless of a subject's age, ethnicity, or sex. Though her work at the Media Lab was done with the understanding that it could benefit the autism community, it ultimately evolved into its current iteration at the urging of businesses that partner with the M.I.T. Media Lab.
“A big chunk of our funding at M.I.T. Media Lab came from industry, so twice a year we would invite our business sponsors—we had Unilever, Procter & Gamble, Samsung, Microsoft, Google, and other Fortune 100 companies that basically sponsored the lab to get access to the new ideas and trends and research coming out of it—to visit us,” she says. “We showed them our autism research and, for a couple years, they would consistently say that there was a lot of commercial interest in this technology.”
In 2009, el Kaliouby took their advice and co-founded Affectiva with M.I.T. professor Dr. Rosalind Picard to address the growing demand for emotion technology and emotion analytics. Since then, the company has increased its database of emotion metrics, amassing more than 2.7 million facial videos representing a total of more than 11 billion emotion data points. It has also acquired an impressive array of customers—more than 1,400 brands—who rely on Affdex to better understand how consumers respond to their advertisements.
Affdex is especially useful in media and advertising because the company's emotion analytics illustrate the moment-by-moment progression of a person's emotional response. It then summarizes the patterns it has detected into an overall emotional score, one tied to a confidence level. (“John looks like he's smiling at a 90% confidence score.”)
“It’s going to take all these metrics and map them into an emotional state—things like enjoyment, surprise, disgust, as well as net positivity, how positive or negative you are—and it’s going to do that for your video,” she explains. “It’s also going to do that for every other person who has watched similar content or gone through a similar experience. It then aggregates this data, and it gives you insights.”
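The aggregation el Kaliouby describes—per-frame emotion scores rolled up into summary metrics such as net positivity—can be sketched in a few lines. The emotion labels, score scale, and function name below are invented for illustration; they are not Affdex's actual output format.

```python
from statistics import mean

def summarize_emotion_trace(frames):
    """Aggregate per-frame emotion scores (0-100) into summary metrics.

    `frames` is a list of dicts, one per video frame, e.g.
    {"joy": 80.0, "surprise": 5.0, "disgust": 1.0} -- hypothetical
    scores standing in for whatever a real classifier emits.
    """
    emotions = frames[0].keys()
    # Average each emotion across the whole viewing session.
    summary = {e: mean(f[e] for f in frames) for e in emotions}
    # Net positivity: positive minus negative emotion, per the article's
    # "how positive or negative you are" description.
    summary["net_positivity"] = summary.get("joy", 0.0) - summary.get("disgust", 0.0)
    return summary

# A viewer warming up to an ad over three sampled frames (made-up data).
frames = [
    {"joy": 10.0, "surprise": 2.0, "disgust": 0.5},
    {"joy": 60.0, "surprise": 8.0, "disgust": 0.2},
    {"joy": 85.0, "surprise": 3.0, "disgust": 0.1},
]
print(summarize_emotion_trace(frames))
```

Running the same summary for every viewer of the same content yields the per-audience aggregates the article mentions.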
Drawing on that aggregate data, Affectiva then creates norms that businesses can use to compare their own advertising performance to that of their peers. “For example, if you’re Kit Kat, and you want to know how well you are doing in the U.S., then we can show you your ad compared to all the other ads in your chocolate category, and show you where your ad does drive emotional engagement,” el Kaliouby says.
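Comparing one ad against category norms amounts to a percentile rank over the peer group's scores. A minimal sketch, with entirely hypothetical scores and names (the real norm computation is not described in the article):

```python
from bisect import bisect_left

def percentile_vs_norms(ad_score, category_scores):
    """Return the share (0-100) of category ads this ad outscores."""
    ranked = sorted(category_scores)
    return 100.0 * bisect_left(ranked, ad_score) / len(ranked)

# Hypothetical engagement scores for other ads in the chocolate category.
chocolate_norms = [32, 41, 47, 55, 58, 63, 70, 76]
print(percentile_vs_norms(60, chocolate_norms))  # → 62.5
```

Here an ad scoring 60 outperforms five of the eight peer ads, i.e. it sits at the 62.5th percentile of its category.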
To ensure that Affectiva’s software is as effective and accurate as possible, el Kaliouby and her team have worked to continually expand the database it taps into when analyzing users. Doing so has required the company to take a different approach from the one traditionally employed in academia, where undergraduate psychology students typically act as research subjects. Though effective at a basic level, this method is flawed, critics say, because there’s little variability among this research pool in terms of age and race, among other factors.
“It is absolutely crucial that your dataset is global and cross-cultural,” el Kaliouby stresses. “Our data set has young children, it’s got older people, and it has a mix of ethnicities and genders from 75 countries. And that’s very important from a machine learning perspective.”
The care with which el Kaliouby and her team have assembled and employed their machine learning technology has already won over global business leaders, including Millward Brown, a market research firm. The company, a division of WPP—the world’s largest advertising holding company—was among Affectiva’s first clients.
According to The New Yorker, Millward Brown came on board in 2011 after testing Affdex's accuracy in determining user responses to four ads it had already studied extensively. Among them was a Dove ad that focused on how young girls are targeted by the beauty industry. Using Affdex, the company realized that its initial assessment of the ad—that it made viewers uncomfortable—didn't tell the entire story. Affdex showed a more nuanced interpretation: Though viewers were initially uneasy with the subject matter, “at the moment of resolution this discomfort went away.”
Millward Brown executive Graham Page recounted to the magazine: “The software was telling us something we were potentially not seeing. People often can’t articulate such detail in sixty seconds, and also, when it comes to negative content, they tend to be polite.”
For her part, el Kaliouby continues to refine and hone Affdex’s facial expression database, as well as its analysis algorithms. With interest from industry continuing to grow, it’s likely that advertisements you see in the future—and, in fact, today—have been analyzed by a computer program that is potentially more emotionally intelligent than some human beings.
It’s a machine learning world, and humans are just living in it.