Science Emotion AI

Science Deep Dive Series: Bringing Emotional Intelligence to Technology with our Global Emotion Database

10.12.18

This is the second post of the Science Deep Dive series. In the first post we talked about the complexities of Emotion AI and the basics of how Affectiva brings this emotional intelligence to technology. As with any machine learning problem, we address these Emotion AI complexities by collecting a lot of training data, developing complex algorithms that factor in multiple signals of nuanced emotional states, building the infrastructure to train these algorithms, and assembling a skilled team that understands emotion and its complex nuances well enough to put the first three together and produce models that work.

In this post, we’ll discuss the first step in addressing those complexities: our robust and scalable data strategy. This enables us to acquire large and diverse datasets, and to annotate them using both manual and automated approaches.

Our data strategies focus on four aspects of a high-quality dataset that can be used effectively for building predictive models. First and foremost, the dataset is large. Over the past seven years, we have spent considerable effort developing a massive proprietary dataset that serves as a foundation for all our model training approaches, and its size continues to grow. Second, the data has to be diverse so that models trained on it can generalize well: for example, our data contains a wide range of emotional displays acquired from individuals across different demographic segments. Third, it is representative of the various real-world conditions in which our Emotion AI solutions will be used, so we are aware of the environmental factors and individual behaviors likely to occur in these conditions and our models are optimized to address them. And finally, the dataset is labeled. At Affectiva, we have a large in-house team of expert labelers, who label for what they observe: emotions, age, gender, speech events. However, due to the sheer volume of data that we have, not all of it can be manually labeled—so we use automated techniques such as active learning to quickly identify video segments of interest.
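To make the active learning idea concrete, here is a minimal sketch of one common flavor, uncertainty sampling: segments where the current model is least confident are surfaced first for human labeling. The clip names, the `predict` callable, and the stand-in probabilities are all hypothetical illustrations, not Affectiva's actual pipeline.

```python
import math

def entropy(probs):
    """Shannon entropy of a predicted class distribution (higher = more uncertain)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_for_labeling(segments, predict, k=2):
    """Uncertainty sampling: rank unlabeled segments by the model's
    predictive entropy and return the top-k for human annotation."""
    ranked = sorted(segments, key=lambda s: entropy(predict(s)), reverse=True)
    return ranked[:k]

# Toy stand-in for a trained classifier's class probabilities per video segment.
fake_predictions = {
    "clip_a": [0.98, 0.01, 0.01],  # confident prediction: low labeling value
    "clip_b": [0.40, 0.35, 0.25],  # uncertain: worth a human label
    "clip_c": [0.90, 0.05, 0.05],
    "clip_d": [0.34, 0.33, 0.33],  # near-uniform: most uncertain of all
}

picked = select_for_labeling(list(fake_predictions), fake_predictions.get, k=2)
print(picked)  # the two most ambiguous clips surface first
```

In a real pipeline the model would be retrained after each labeling round, so the "segments of interest" shift as the model improves.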

In order to get the algorithms to analyze and recognize nuanced emotion “in the wild,” we need to gather examples of real, spontaneous emotions. This means we look for real-world data, such as actual instances of people driving. We also need to be robust across demographics around the world, so we want our data to be demographically diverse as well. We are able to get this data through our in-market partners.


Affectiva is also investing in gathering proprietary, market-specific data in automotive. For the automotive industry we are focused on looking inside the car, at drivers and occupants. This is in contrast with the majority of automotive data collection already happening outside the car, for the purpose of building autonomous vehicles. In some cases, we've licensed data that can be really hard to find elsewhere, or scraped public online data sources to augment the data. Additionally, on top of the natural data, we perform in-house annotation and labeling: a highly trained and specialized labeling team reviews this data and gives their judgment.

Speaking to the diversity of our dataset, our data is gathered from around the world, with people represented across age, gender, and ethnicity.

The World’s Largest Emotion Data Repository


Affectiva is the first company to ask this many people, now over seven million faces, to turn on their living room, laptop, or mobile cameras so we can gather video of them reacting spontaneously. The result is an extremely valuable, in-the-wild, proprietary dataset that is growing steadily. Read more about how we collect this data.

The Bottom Line

In order to model this complexity, machine learning is a must, and Affectiva uses a data-driven approach to achieve it. Emotion AI requires a lot of data, as well as complex machine learning algorithms, to really learn and understand the difference between someone smiling because they are experiencing joy and someone smiling because they are frustrated. Next, we’ll discuss our algorithms in more detail, so stay tuned!

 

At the 2018 Emotion AI Summit, Affectiva’s Dr. Taniya Mishra, Director of AI Research & Lead Speech Scientist, and Jay Turcot, Director of Applied AI at Affectiva, presented a workshop on The Science Behind Affectiva’s Emotion AI. Download the recorded workshop session to see the full presentation.

Download Emotion AI Summit 2018 content now
