By: Boisy G. Pitre, Mobile Visionary
At Affectiva, we are constantly pushing the boundaries of Emotion AI — a term that we have coined to describe the intersection of human emotions and artificial intelligence.
Our scientists and engineers accomplish this by collecting an ever-increasing amount of data. That data improves our science, which in turn yields better machine learning algorithms. These improved algorithms are then deployed at scale through our emotion SDKs and APIs, which generate additional data, completing a virtuous cycle.
More data also gives us more ways to examine the subtleties of human emotions and expressions. This ongoing work leads us not only to improve our algorithms, but also to identify new ways to measure changes in the face as people emote. To that end, we are rolling out a new release of our SDKs for all of our supported platforms (iOS, Android, Windows, Linux, OS X, and Unity), which provides both improved classification of existing facial expressions and support for new ones.
But first, let’s look at an exciting new platform for our SDK.
Emotions in a Browser?
Our SDKs are easy to integrate into your existing applications, but we challenged ourselves to deliver an even more seamless integration experience. We asked ourselves if we could use the most common tool available on computers, the web browser, to demonstrate Emotion AI technology while letting developers set aside development environments, compilers, and tool chains.
If you’re running a supported browser, you can try the demo right now! Just click here to launch the demo (your computer will need a webcam or other camera, of course)!
Another fun demo that you can try is our YouTube emotion tracker, which will track your emotions as you watch any content from YouTube.
We’re excited about this new offering. There is no easier or quicker way to get started with our emotion-sensing technology!
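To make the in-browser results concrete, here is a small sketch of how a per-frame callback might summarize them. The result shape (a `faces` array carrying an `expressions` map of 0–100 scores) and the function name are our own illustrative assumptions, not the SDK’s documented API:

```javascript
// Hypothetical per-frame result from the in-browser detector.
// Field names here are illustrative assumptions, not the SDK's documented API.
function summarizeFrame(faces) {
  if (!faces || faces.length === 0) return null; // no face in this frame

  const face = faces[0];

  // Pick the strongest expression score for a quick readout.
  let best = null;
  for (const [name, score] of Object.entries(face.expressions)) {
    if (best === null || score > best.score) best = { name, score };
  }
  return best;
}

// Example frame with made-up scores:
const sample = [{ expressions: { smile: 82.1, browRaise: 10.5, cheekRaise: 44.0 } }];
console.log(summarizeFrame(sample)); // logs the strongest expression for the frame
```

In a real page, a handler like this would run inside whatever image-results callback the browser SDK exposes, once per processed camera frame.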
But wait, there’s more!
Wouldn’t it be great if a machine could estimate both your age and your ethnicity? Starting with SDK 3.1, ours can do just that. Our age classifier places a person within one of the following age bands:
- Under 18
- 18 – 24
- 25 – 34
- 35 – 44
- 45 – 54
- 55 – 64
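For illustration, the bands above can be mirrored in a small helper. The SDK’s classifier reports a band directly; this function, and its name, are ours, and ages above the last listed band simply fall through to `null`:

```javascript
// Map a numeric age to the band labels listed above.
// Helper name and numeric input are ours, for illustration only;
// the SDK's age classifier reports a band label directly.
function ageBand(age) {
  if (age < 18) return "Under 18";
  if (age <= 24) return "18 – 24";
  if (age <= 34) return "25 – 34";
  if (age <= 44) return "35 – 44";
  if (age <= 54) return "45 – 54";
  if (age <= 64) return "55 – 64";
  return null; // above the bands listed here
}

console.log(ageBand(30)); // "25 – 34"
```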
Our new ethnicity classifier recognizes the following five values: Caucasian, Black African, South Asian, East Asian, and Hispanic. We are releasing both the age and ethnicity features as beta, and we are looking for your feedback to make them better.
We have also expanded our collection of detected facial expressions. In addition to the 15 facial expressions we already detect, we now detect 6 more:
- Cheek Raise – Lifting of the cheeks, often accompanied by “crow’s feet” wrinkles at the eye corners
- Chin Raise – The chin boss and the lower lip pushed upwards
- Dimpler – The lip corners tightened and pulled inwards
- Eye Widen – The upper lid raised sufficiently to expose the entire iris
- Jaw Drop – The jaw pulled downwards
- Lid Tighten – The eye aperture narrowed and the eyelids tightened
These new expressions let you mine the rich canvas of the human face for even more hints and cues about how people are emoting. You can view images illustrating these facial expressions here.
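As a sketch, a frame handler could check which of these six new channels clear a confidence threshold. The camelCase key names and the 0–100 score range are assumptions on our part, not the SDK’s documented field names:

```javascript
// The six new expression channels described above, as keys we might
// expect on a face result (camelCase names are our assumption).
const NEW_EXPRESSIONS = [
  "cheekRaise", "chinRaise", "dimpler",
  "eyeWiden", "jawDrop", "lidTighten",
];

// Return the new expressions whose score (assumed 0-100) clears a threshold.
function activeNewExpressions(expressions, threshold = 50) {
  return NEW_EXPRESSIONS.filter((name) => (expressions[name] || 0) >= threshold);
}

// Example with made-up scores: only cheekRaise clears the default threshold.
const scores = { cheekRaise: 75, jawDrop: 20, smile: 90 };
console.log(activeNewExpressions(scores)); // ["cheekRaise"]
```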
It’s Easier Than Ever!