
How we built the world’s largest emotion data repository


By: Rana el Kaliouby, Co-Founder and Chief Science Officer

Affectiva has recently reached a major milestone: to date, we have analyzed emotion responses from over 2 million facial videos gathered from people watching media in 75+ countries around the world.

As a scientist who has been researching human emotion for over a decade, I find this very, very exciting news!

When we first debuted Affdex, our flagship automated facial coding product, in March 2011, we had no idea our journey would someday lead to the world’s largest emotion repository. In fact, we wondered whether anyone would even opt in to turn their camera on and share their emotion responses. What happened blew us away!

The first indication came with our first big crowdsourcing effort, for Forbes Magazine in 2011, where we obtained 3,000 facial responses to Super Bowl commercials in less than three days. Most people watched the content from their homes, often with no one else in the room. Yet we saw a wide range of emotions expressed on the face – everything from amusement and surprise to skepticism. This validated our hypothesis that people do in fact emote in response to content, and it has since led to a scalable solution for measuring people’s true responses to content anywhere in the world.

Let me explain why this is important.

Reason 1: In the search to understand people’s emotional connection to content, researchers have historically spent millions of dollars setting up experiments, securing lab space, sourcing panelists, and surveying and analyzing data – only to yield marginal insights into engagement.

With Affdex, we can collect emotion responses from any device, anywhere in the world – all you need is a webcam and an internet connection. In remote parts of the world that lack reliable connectivity, we even use a store-and-forward mechanism in central locations. The emotion data Affdex captures and analyzes is natural – people opt in to share their true emotion responses from their homes, their offices … anywhere. If you have ever struggled to capture the true emotion responses of your audience, or the emotional connection people have with your content, products or services – you can now do so in a scientifically valid AND scalable way!
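The store-and-forward idea can be sketched in a few lines: record sessions locally first, then forward the backlog whenever a connection is available. This is a minimal illustration, not Affectiva’s implementation; the class and function names here are assumptions for the example.

```python
from collections import deque

class StoreAndForwardUploader:
    """Hypothetical sketch of store-and-forward: buffer recordings
    locally, then flush them upstream when connectivity returns."""

    def __init__(self, upload_fn, is_online_fn):
        self.queue = deque()            # recordings waiting to be sent
        self.upload_fn = upload_fn      # sends one recording upstream
        self.is_online_fn = is_online_fn  # connectivity check (assumed)

    def record(self, session):
        """Always store locally first; the upload happens later."""
        self.queue.append(session)

    def flush(self):
        """Forward queued recordings for as long as the link is up."""
        sent = 0
        while self.queue and self.is_online_fn():
            self.upload_fn(self.queue.popleft())
            sent += 1
        return sent

# Usage: capture two sessions while "offline", then flush once online.
uploaded = []
u = StoreAndForwardUploader(uploaded.append, is_online_fn=lambda: True)
u.record({"video_id": 1})
u.record({"video_id": 2})
print(u.flush())  # -> 2
```

The key property is that capture never blocks on the network: the viewer’s session is saved regardless, and delivery is simply deferred.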

Reason 2: We use computer vision and machine learning techniques to train computers to do what we humans do naturally – “read” emotions from the face. The resulting Affdex classifiers take face videos as input and produce frame-by-frame emotion metrics as output. Data is critical to this process: the more data (and the more varied the data) you have, the better and more robust your emotion classifiers become. When I started research in this space, the only dataset available for training a computer to recognize emotions consisted of fewer than 1,000 images, mostly of undergrad students who agreed to pose the different emotions in exchange for course credit! Today, with over 2 million face videos analyzed from all over the world, we have ample representation across ethnicities and cultures. About half a million images are labeled and can be used for training and testing. Our experiments show very clearly that accuracy scales with the amount of data.
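The input/output contract described above – a face video in, per-frame emotion metrics out – can be sketched as follows. The classifier here is a stand-in stub (Affdex’s trained models are proprietary), so the function names and the toy “brightness” heuristic are assumptions purely for illustration of the data shapes involved.

```python
from typing import Dict, List

def classify_face(frame_pixels: List[int]) -> Dict[str, float]:
    """Stand-in for a trained classifier: maps one frame of grayscale
    pixels to emotion metrics in [0, 1]. Stubbed for illustration only."""
    brightness = sum(frame_pixels) / (255.0 * len(frame_pixels))
    return {"joy": brightness, "surprise": 1.0 - brightness}

def analyze_video(frames: List[List[int]]) -> List[Dict[str, float]]:
    """Face video in, frame-by-frame emotion metrics out."""
    return [classify_face(f) for f in frames]

# Two toy 4-pixel "frames": an all-dark frame and an all-bright frame.
metrics = analyze_video([[0, 0, 0, 0], [255, 255, 255, 255]])
print(metrics[0]["joy"], metrics[1]["joy"])  # -> 0.0 1.0
```

A real classifier would replace the stub with a model learned from the labeled images mentioned above, but the contract – one metrics dictionary per frame – stays the same.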

Reason 3: As we continued to build our database of emotion responses to media (everything from advertising and movie trailers to TV shows), we also started marrying it with third-party data – everything from location information and demographics to virality and sales-effectiveness data. The results were clear and consistent with research on emotion and decision-making (which has typically been done in a lab and on much smaller populations): emotions drive our opinions, our brand attitudes and, ultimately, our purchase decisions. We have shown that the unfolding of emotion responses over the course of media content – from how you hook the viewer in the first few seconds to how you end the ad – correlates very highly with KPIs such as ad likeability, virality, ad avoidance and sales. We are now able to show, outside of a lab, that emotions really matter.

Finally, this one is more aspirational: as we continue to grow this emotion data repository, I believe we have an opportunity to track and even influence people’s moods and sentiments. Imagine a world where devices automatically assess your mood and, based on the mood fingerprints of thousands of other people, suggest mood-enhancing content, music or lighting that brightens your day. After all, that’s what motivates all of us at Affectiva: the opportunity to explore new frontiers, develop innovative new solutions and build new markets.

As always, I would love to hear your thoughts on this – please reach out to me on Twitter @Kaliouby.