
Emotion Technology Emotion Recognition SDK Emotion AI AI

Emotion Technology Year in Review: Affectiva in 2016

12.29.16

By: Rana el Kaliouby, Co-Founder and CEO

It’s an exciting time to be in the artificial intelligence space. AI is becoming more pervasive and a lot more mainstream. The work we do at Affectiva brings a unique perspective to AI, with our mission of humanizing technology with artificial emotional intelligence.

Beyond the scope of Affectiva, interest in Emotion AI is heating up, with Apple acquiring Emotient earlier this year and other large tech companies investing in the space.

Personally, it was quite an exciting and busy year. I became Affectiva’s CEO and focused on our culture of openness, innovation and flexibility (in fact, we were listed as one of the best-rated places for parents to work). We raised additional capital for the company and positioned Affectiva at the center of artificial intelligence. Throughout the year, we evangelized how Emotion AI is transformative across various industries, speaking at events such as TechCrunch and WebSummit to show audiences how they can apply emotion analytics in their own line of work.

Here are my top 5 highlights of 2016:

1. Defining the Category: Introducing “Emotion AI”

This year, we named our category Emotion AI: the next frontier of artificial intelligence.

In 2010, shortly after we spun Affectiva out of the MIT Media Lab, I had the opportunity to meet with Todd Dagres of Spark Capital. “The best companies,” he said, “are those that define a new category, name it, seed it and lead it.” It’s advice I never forgot.

Affectiva has always been a thought leader within the new and emerging emotion technology space, yet the category was still unnamed.

And so earlier this year, Gabi Zijderveld, Affectiva’s CMO, and I were brainstorming names for this category. Our top contenders were emotion recognition, emotion analytics, and emotion sensing. Though they were true to the category, they did not fully capture the transformative implications of our technology. So we came up with Emotion AI and used #EmotionAI in a tweet that same day.

Emotion AI, short for artificial emotional intelligence, underscores that machines, like humans, need emotional intelligence to be most effective. This is especially true today in a world where technology is becoming more and more conversational, perceptual and relational.

Ever since then, Emotion AI has taken on a life of its own. It has been referenced thousands of times and has become synonymous with this class of technology. My favorite example came months later, when a VC called and said, “We’re looking to invest in the Emotion AI space.” I remember thinking to myself, “Wow, so it’s a thing now!”

2. Expanding the Emotion AI Conversation

We focused on growing awareness for Emotion AI at events, through press and thought leadership.

To position Affectiva front and center in the AI space and to further build our category, we invested in industry events and press outreach throughout the year.

This past year, Affectiva was present at 70 events. I personally traveled the globe to speak at 15 of them, including TechCrunch Disrupt, WebSummit, Collision, O’Reilly AI, DreamForce and Tech in Asia. The highlight was presenting at the Congress of Future Science and Technology Leaders to an audience of 5,000 highly energetic and engaged high school students.


We also continued to get outstanding press coverage, with 60 articles as of today, including coverage in Fast Company, TechCrunch, Huffington Post, Wired, Forbes, PBS Newshour, Ad Age, App Developer Magazine, SD Times, PC Mag, Die Welt, and The Boston Globe. WIRED named me one of the “25 Geniuses Who Are Creating the Future of Business” - such an honor to be named among all these awesome innovators. Forbes listed Affectiva as one of the companies using deep learning to produce actionable results. (You can read these articles and more at affectiva.com/news.)

In addition, representing the Emotion AI space, I was asked to join the World Economic Forum’s Global Future Council on Robotics and Artificial Intelligence. We had our first meeting in November in Dubai, where I joined an amazing group of thought leaders to develop AI recommendations for policy makers and business leaders, to be presented at the World Economic Forum in Davos.

3. Leading in Emotion AI Innovation

From deep learning to on-device emotion sensing, we pushed the boundaries of computer vision and machine learning to recognize a range of facial attributes and emotions.  


We strive to provide comprehensive facial expression and emotion classifiers. This year we added multiple-face tracking in photos as well as video streams. We also added gender, age and ethnicity detectors, 13 new emojis and 6 new facial expression classifiers including nuanced ones such as a cheek raise, lip stretch and jaw drop.

We continue to experiment with deep learning, leveraging the massive dataset of emotions we have and our video labeling infrastructure. As I write this, we have collected and analyzed over 4.8 million face videos from 75 countries. This year we also focused on expanding the various contexts represented in our data to include data of people viewing online content, playing video games, driving cars and conversing with other humans and robots.

4. Making Emotion AI Accessible to All (with our SDK)

From cross-platform support to exciting developer evangelism initiatives, we invested heavily in our Emotion SDK. We saw a more than 500% increase in SDK usage in Q4 versus earlier in the year, and closed the year with over 5,000 SDK downloads.

Whatever platform you’re developing on, we’ve got you covered. Our Emotion SDK provides real-time emotion analytics and runs on-device. The SDK is now available for iOS, Android, Linux, Windows, macOS, Unity and JavaScript.
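As a rough sketch of what that frame-by-frame, on-device flow looks like from a developer’s perspective - note that the detector, frame and callback names below are illustrative stand-ins, not the actual API of our SDK:

```python
# Illustrative sketch of a frame-by-frame, callback-driven emotion
# pipeline. EmotionDetector is a stand-in stub; the real SDK exposes
# its own platform-specific detector and listener classes.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Frame:
    timestamp: float  # seconds since capture started
    pixels: bytes     # raw image data (ignored by this stub)


class EmotionDetector:
    def __init__(self, on_results: Callable[[float, Dict[str, float]], None]):
        self.on_results = on_results

    def process(self, frame: Frame) -> None:
        # A real detector would run facial-expression classifiers on
        # frame.pixels on-device; the stub emits fixed scores just to
        # show how results flow back through the callback.
        scores = {"joy": 0.8, "surprise": 0.1, "anger": 0.0}
        self.on_results(frame.timestamp, scores)


def run_capture_loop(num_frames: int = 3) -> Dict[float, float]:
    """Feed frames to the detector and collect per-frame joy scores."""
    joy_by_time: Dict[float, float] = {}
    detector = EmotionDetector(
        on_results=lambda ts, scores: joy_by_time.__setitem__(ts, scores["joy"])
    )
    for i in range(num_frames):  # stand-in for a 30 fps camera feed
        detector.process(Frame(timestamp=i / 30.0, pixels=b""))
    return joy_by_time
```

The key point the sketch illustrates is that analysis happens per frame, on the device, with results delivered asynchronously through a callback - no video ever needs to leave the user’s hardware.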

We made it a lot easier for developers to get access to our SDK. Any developer or organization making under $1 million in annual revenue can now use our SDK for free, and everyone else can download it for evaluation purposes. We feel very strongly about seeing the many different use cases for emotion technology brought to market in real apps, projects and devices. A lot of this innovation takes place in smaller organizations that are not yet making money. We do not want to stand in their way. We want to make it super easy for them to use our technology - and when they succeed, so does Affectiva.

We’ve also started a “developer evangelism” initiative geared toward growing the number of developers using our SDK and laying the groundwork for a vibrant Emotion AI developer community.

In March of 2016 we organized EmotionLab ’16, Affectiva’s first-ever hackathon, with 60 participants and 10 projects presented. We also hosted the first-ever online Emotion AI Developer Day with 363 registrants from around the world (an amazing number, more than five times our goal). Over 85% of attendees said they would attend another Affectiva developer event - stay tuned for similar events in 2017.

We supported six hackathons:

  • Hack Roboy in Germany, where the winning team used our SDK to build an emotionally responsive robot. This underscores a growing trend of social robotics and conversational interfaces. People are looking for “companion” robots that they can engage with and that can understand them better.
  • At the Neuro Do.ai hackathon, the winning team reParrot used Affectiva’s SDK to help people addicted to painkillers stay in recovery.
  • At the NBCUniversal hackathon in Miami, we saw that measuring and tracking emotion is becoming desirable for television programming - for example, how you feel while you watch an episode of your favorite sitcom can give you and the content creators insights into your reactions.
  • Other successful hackathons included Box Hack Day, Do.ai, the “Reality, Virtually” hackathon at MIT, and Global Game Jam, all of which not only gave us great feedback for improving our SDK, but also reaffirmed our intention to have an even bigger presence at hackathons in 2017.

5. Applying Emotion AI Across Multiple Industries

This year we invested in seeding Emotion AI in several verticals such as gaming, social robotics and education.

Our vision for Emotion AI is one that cuts across many markets, humanizing the way in which we interact with technology while enabling businesses to get emotion insights in a way never possible before. This year we invested in seeding Emotion AI in several verticals:

  • In advertising research, we continue to grow our footprint with global partners such as Millward Brown and Unruly, through which over a third of the Global Fortune 100 and over 1,400 brands use our technology to optimize their advertising content and media placement.
  • In gaming, we announced our Unity SDK and partnered with game studios to build emotion-aware games. For example, Flying Mollusk developed Nevermind, a psychological thriller that adapts gameplay based on the player’s emotional state.
  • Through iMotions and their human behavioral research platform, our technology is in use by many research labs including Texas A&M, GSK, Deloitte, McGill University, NYU, and Mindshare.
  • In social robotics, the healthcare companion robot Mabu uses our tech to most effectively engage with patients and encourage adherence to treatment plans.
  • In education, learning apps like Little Dragon are on a mission to make learning more enjoyable and effective by analyzing streams of facial expression data to infer student engagement, then adjusting the app to keep kids engaged.
  • We partnered with Giphy to analyze and encode emotions in all their content – making it possible to search for GIFs by emotion, and even use your own emotions to search online with our emotion recognition technology.
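The education example above hints at a simple control loop: smooth a noisy stream of per-frame engagement scores, then adjust difficulty when the smoothed level drifts too low or too high. Here is a minimal sketch of that idea - the smoothing constant, thresholds and function names are all illustrative assumptions, not how any particular app implements it:

```python
from typing import List


def smooth(scores: List[float], alpha: float = 0.3) -> float:
    """Exponential moving average over noisy per-frame engagement scores."""
    level = scores[0]
    for s in scores[1:]:
        level = alpha * s + (1 - alpha) * level
    return level


def adjust_difficulty(scores: List[float], current_level: int,
                      low: float = 0.4, high: float = 0.8) -> int:
    """Ease off when the student seems disengaged; raise the challenge
    when engagement is sustained and high. Thresholds are illustrative."""
    engagement = smooth(scores)
    if engagement < low:
        return max(1, current_level - 1)  # disengaged: make it easier
    if engagement > high:
        return current_level + 1          # highly engaged: make it harder
    return current_level
```

Smoothing matters here: reacting to a single frame would make the app twitchy, while the moving average only responds to sustained shifts in engagement.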

Looking Ahead: 2017

As exciting as all of these developments were, the momentum is already carrying into 2017. On the science and product front, we continue to invest in deep learning and to expand our portfolio of emotion classifiers. We will continue to invest in our SDK, making it easier than ever to add emotion awareness to any application. And lastly, we will continue to evangelize the value of Emotion AI, which we believe will fundamentally change how we interact with technology.

Coming Soon: Where You Can Find Affectiva

A number of us will be at CES in January - please holler if you are going to be there and would like to meet up. In addition, I am speaking at StartMIT on January 9th and will be at CB Insights’ Innovation Summit in Santa Barbara January 10-11.

We are very excited to be moving into a new office in downtown Boston in January. To celebrate, we will roll out a series of events open to the public around all things Emotion AI, innovation, entrepreneurship, development, and much more. Stay tuned for the official calendar, and we hope to see you at our new space!

Final thoughts:

I just finished reading Shoe Dog, Phil Knight's fascinating memoir of how he started Nike, and was struck by this line: "With thanks for taking a chance on me."

So as we end 2016, I want to thank you for taking a chance on Affectiva, for believing in our mission, for championing it, for being close and trusted partners and clients and for being avid followers. Our team at Affectiva is grateful for that support. Together, we are building a bigger and better future for Emotion AI, and have some new developments in store for the industry in 2017. We hope you’ll join us for the ride.

Best,

Rana

Download our Emotion SDK to emotion-enable your project.
