By: Rana el Kaliouby, Co-Founder and Chief Strategy and Science Officer; Gabi Zijderveld, VP Marketing and Product Strategy

Picture this: you’re at home surrounded by your “super” smart devices – intelligent TV, connected sound system, smart LED lighting, smart watch, smartphone, and laptop.

You are in the middle of a serious conversation, and every other sentence is interrupted: the TV buzzes to let you know it’s found a show you might like, and then your fitness tracker tells you that you only have 200 steps left to meet your daily goal. Your mother calls on your watch, the fridge tells you you’re out of milk and your laptop buzzes to let you know you have a meeting in 10 minutes.

Illustration: a person sits in the middle of a living room, visibly overwhelmed, surrounded by an array of smart devices.

You know emotional intelligence when you see it. It’s when your five-year-old decides it’s the perfect time to ask for that cookie because you’re in the middle of a conference call and probably not paying much attention to her. Or when your mom knows something is wrong from your “hello” alone. Or when you walk into your manager’s office and, judging by the angry face, decide that it’s not the right time to ask for that raise. Individuals with higher EQ (Emotional Quotient), a measure of emotional intelligence, tend to be more likable and more persuasive, and generally lead more successful lives (and live longer, too!).

In today’s hyper-connected world of smart devices and appliances, our technology has lots of IQ but no EQ: plenty of cognitive intelligence, but no emotional intelligence. What if technology could understand our emotions? What if our devices knew when to shut up and let us be, and when to tread gently and support us, the same way an emotionally intelligent friend would? At Affectiva, we envision a whole new world of devices – the Internet of Things – that can detect our emotional state and respond to it.

The number of connected devices on us and around us is growing exponentially. Gartner forecasts that 4.9 billion connected things will be in use in 2015, up 30 percent from 2014, and will reach 25 billion by 2020. These connected things can be anything from wearable devices and cars, to your TV, fridge, mirror and home robots. To deliver value and positively change our lives, these devices need to be context-aware. In fact, many of these devices are designed to bring about positive behavior change as they persuade or incent you to do things differently, better, faster or more efficiently. To be most effective these devices need to be perceptual and in tune with our emotions.

What does this look like? Imagine your fridge working with you to eat healthier, or your wearable and TV teaming up to get you off the couch! Your bathroom mirror senses that you’re a bit stressed and works with your lighting to adjust it while turning on the right mood-enhancing music. A mood-aware Internet of Things will not only bring these kinds of improvements to the “smart home”; it also has the potential to transform major industries.

In healthcare, and telemedicine specifically, mood-aware wearables could help monitor mental health and wellbeing, perhaps even letting people with depression track their emotional state and share that data with their doctor if they choose to. Cars would not only sense anger and help manage road rage; the car of the future would be an environment in which automotive technology, drivers and passengers engage and take action based on mood. And online education would become much more interesting and effective – and would lower dropout rates – if students could turn on their emotion sensor, so that educational content adapts when they’re distracted, bored or highly engaged.


How would this work? We are not going to interact with those mood-aware devices using text commands. We will expect to interact with these devices the same way we interact with each other: by using the multitude of ways we express ourselves through gestures, voice and facial expressions. (See Forrester Research: Artificial Intelligence Can Finally Unleash Your Business Applications’ Creativity.)

Technically, we have all the pieces to make this a reality. Imagine that devices have an “emotion chip”, much like the GPS location chip that is now part of many consumer devices. An emotion chip would have an optical sensor, and perhaps other sensors as well, that can read your emotions: your facial expressions, your tone of voice, your physiology. The chip would passively collect data about your emotional state (with your consent, of course!) and use machine learning, on the device or in the cloud, to make real-time inferences about your emotions – for example, when a device knows you’re stressed, it can modify its behavior accordingly. Connected devices can also use the cloud to make sense of all that data, so they can learn your baseline and notice when you deviate from it.
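The baseline-and-deviation idea in that last sentence can be sketched with simple rolling statistics. The class below is a hypothetical illustration of the concept, not Affectiva’s implementation: it keeps a window of recent emotion scores and flags any new reading that strays far from the learned baseline.

```python
from collections import deque
from statistics import mean, stdev

class BaselineTracker:
    """Tracks a rolling baseline of one emotion score and flags deviations."""

    def __init__(self, window=100, threshold=2.0):
        self.scores = deque(maxlen=window)  # recent readings = the "baseline"
        self.threshold = threshold          # how many std-devs counts as unusual

    def update(self, score):
        """Add a new score; return True if it deviates from the baseline."""
        deviates = False
        if len(self.scores) >= 10:  # need some history before judging
            mu, sigma = mean(self.scores), stdev(self.scores)
            if sigma > 0 and abs(score - mu) > self.threshold * sigma:
                deviates = True
        self.scores.append(score)
        return deviates

tracker = BaselineTracker()
for s in [0.2, 0.25, 0.22, 0.21, 0.24, 0.23, 0.2, 0.22, 0.25, 0.21]:
    tracker.update(s)           # build a calm baseline
print(tracker.update(0.9))      # a sudden stress spike -> True
```

A real device would feed this from its sensors and sync the baseline through the cloud, but the core logic – learn what is normal for you, react to what is not – is this simple.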

To enable this world of mood-aware appliances and devices, we developed a suite of APIs and Software Development Kits (SDKs) on multiple platforms, including iOS, Android and Microsoft Windows. These allow developers and makers to emotion-enable their devices, digital experiences and applications. Our mantra in building these SDKs was to make it simple – radically simple – to add emotion sensing to your technology.

So what does emotion-enable really mean? Two things. First, real-time interaction: adding emotion sensing to a digital experience lets you measure a user’s or a crowd’s emotions and respond in real time. The response can range from changing the lights in your environment or playing a different playlist, to personalizing a learning experience, adapting a game, or even customizing a business process. Second, analytics: with longitudinal data on individuals’ emotional responses, digital experiences and applications gain insights that help them make better decisions faster.
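The real-time side of this can be sketched as a tiny sense-decide-act loop. The emotion labels, confidence threshold and responses below are invented for illustration; they are not the Affdex SDK’s actual API or output format.

```python
# Hypothetical per-emotion responses a mood-aware device might take.
RESPONSES = {
    "stressed": "dim the lights and queue a calming playlist",
    "bored": "adapt the lesson or game to re-engage",
    "engaged": "leave the experience as-is",
}

def dominant_emotion(scores, min_confidence=0.5):
    """Return the highest-scoring emotion if it clears a confidence bar."""
    emotion, score = max(scores.items(), key=lambda kv: kv[1])
    return emotion if score >= min_confidence else None

def respond(emotion):
    """Map the detected emotion to a device action; default to doing nothing."""
    return RESPONSES.get(emotion, "no action")

# One simulated frame of per-emotion scores from a sensor:
frame_scores = {"stressed": 0.8, "bored": 0.1, "engaged": 0.3}
print(respond(dominant_emotion(frame_scores)))  # -> dim the lights and queue a calming playlist
```

In a real application this loop would run per camera frame or audio sample, with the analytics side simply logging each `(timestamp, scores)` pair for later analysis.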

With the community of developers and creators growing every day, we are particularly interested in what can be built with our SDKs. Recently our mobile visionary and iOS guru Boisy Pitre and analyst Ben Anding emotion-enabled an Arduino (see the video here). Wild Blue Technologies created a Smile Sampler for The Hershey Company, ooVoo built the Flip Dot Wall, and TrueCar L.E.D. (Light Emotion Data) gives concertgoers a unique sensory experience. These are fun and unique examples of what our SDKs can do, and from them it is easy to imagine how our SDKs could emotion-enable the Internet of Things.

At Affectiva we believe the next wave in computing will be the emotionally intelligent and mood-aware Internet of Things. The sky is the limit when it comes to the possibilities and we can’t wait. We are providing developers with the tools to power their digital experiences with our emotion-sensing technology. 

What emotion-aware experience would YOU build?


Affectiva is focused on advancing its technology in key verticals: automotive, market research, social robotics, and, through our partner iMotions, human behavioral research. Affectiva is no longer making its SDK and Emotion as a Service offerings directly available to new developers. Academic researchers should contact iMotions, which has our technology integrated into its platform.

