Emotion AI Media Analytics Eye Tracking

How to Combine Emotion AI and Eye Tracking for Next Level Creative Insights



At Affectiva, we love to show people how our Emotion AI technology can measure how audiences are responding to advertising, entertainment content, and stimuli in qualitative research.  Our facial coding solutions can be seamlessly integrated into surveys and market research platforms, allowing our tech to serve as an extra layer of analysis for understanding the emotional side of consumer behavior. 

Collaborating with our research partner and Smart Eye sister company iMotions and their web-based eye tracking solution, we combined our neuroscience tools to showcase our capabilities and surface meaningful insights.  Working together, we can give our research partners demonstrable evidence of how people are emotionally engaging with content and which visual elements in a scene draw viewer interest. And all of these measures can be found within our Affectiva Media Analytics portal.


In an initial case study to test out our technologies, we used a trailer for Beast, a suspense thriller released this summer featuring Idris Elba. Participants first read our consent form to opt in to the survey and activate their cameras, then completed a brief eye tracking pre-calibration video.  They then watched the trailer and finished with a post-calibration eye tracking video before completing a short series of survey questions about the trailer.

The goal of the pre- and post-calibration exercise is for the eye tracking algorithm to determine the position of each participant’s eyes as they look at various parts of the screen.  These positions are used to assess on-screen gaze for the entire study.  It is therefore imperative to use the recommended calibration blocks and instruct participants to look at the dots for the entirety of their presentation.  Learn more about iMotions’ web-based eye tracking here.
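To build intuition for what calibration accomplishes, here is a minimal sketch (not Affectiva or iMotions code) of the underlying idea: the known screen positions of the calibration dots, paired with the raw eye positions recorded while the participant looked at each dot, let us fit a mapping from eye features to screen coordinates. A simple least-squares affine fit stands in for the actual, more sophisticated webcam calibration; all data values are invented for illustration.

```python
import numpy as np

def fit_affine_calibration(eye_xy, screen_xy):
    """Fit screen = [eye_x, eye_y, 1] @ coeffs by least squares."""
    eye_xy = np.asarray(eye_xy, dtype=float)
    screen_xy = np.asarray(screen_xy, dtype=float)
    ones = np.ones((len(eye_xy), 1))
    design = np.hstack([eye_xy, ones])            # (n, 3) design matrix
    coeffs, *_ = np.linalg.lstsq(design, screen_xy, rcond=None)
    return coeffs                                 # (3, 2) affine parameters

def apply_calibration(coeffs, eye_xy):
    """Map raw eye positions to estimated on-screen gaze points."""
    eye_xy = np.atleast_2d(np.asarray(eye_xy, dtype=float))
    ones = np.ones((len(eye_xy), 1))
    return np.hstack([eye_xy, ones]) @ coeffs

# Calibration dots at the screen corners and center (pixels, 1920x1080):
dots = [(0, 0), (1920, 0), (0, 1080), (1920, 1080), (960, 540)]
# Hypothetical raw eye positions recorded while looking at each dot:
raw = [(0.21, 0.30), (0.79, 0.31), (0.20, 0.68), (0.80, 0.69), (0.50, 0.50)]

coeffs = fit_affine_calibration(raw, dots)
gaze = apply_calibration(coeffs, (0.50, 0.50))   # estimated gaze near center
```

This is also why careful calibration matters: if a participant looks away from the dots, the fitted mapping is wrong and every gaze estimate for the rest of the session inherits that error.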

Learnings from Emotion AI

One of the best starting points when looking at Emotion AI data is to view our summary measures of Valence (net positivity) and Expressiveness (overall engagement) (below).  As this is a suspense thriller, it is unsurprising to see moments of negative valence; that is a desired response for the genre and contributes to the emotional journey viewers take when watching the trailer.  Additionally, we can see that overall expressiveness was high: viewers were engaged throughout, particularly at the action moments where the Beast enters the scene. 
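As a rough illustration of what these summary tracelines represent, one could aggregate per-viewer expression scores frame by frame: net positivity as positive minus negative expression, and overall engagement as total expression intensity. This is a hypothetical sketch with random stand-in data, not Affectiva's actual metric definitions, which come from its facial coding models.

```python
import numpy as np

rng = np.random.default_rng(0)
n_viewers, n_frames = 30, 100

# Stand-in per-frame expression scores in [0, 1] for each viewer:
smile = rng.random((n_viewers, n_frames))        # a positive expression
brow_furrow = rng.random((n_viewers, n_frames))  # a negative expression

# One value per frame, averaged across the viewer panel:
valence = (smile - brow_furrow).mean(axis=0)         # net positivity trace
expressiveness = (smile + brow_furrow).mean(axis=0)  # overall engagement trace
```

Plotted over the trailer's timeline, traces like these are what reveal the dips into negative valence during tense scenes alongside sustained engagement.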

The content starts off with a positive and sentimental tone as the main character, Dr. Nate Samuels (played by Idris Elba), sets up a safari adventure trip to connect with his daughters after the passing of his wife, in hopes of creating happy memories.  But as a potential threat emerges, Valence (green) decreases, and viewers are suddenly plunged into a state of tension and ambiguity as the Beast attacks villages and people. This unsettling mood keeps viewers in sustained engagement (blue) as they try to unravel the story.

How can eye tracking enhance insights?

By using eye tracking, we were able to add heatmap functionality to our portal system.  Eye tracking heatmaps are visualizations that show the distribution of gaze points across viewers.  This allows the researcher to pinpoint exactly where on the screen viewers fixate and how their gaze patterns change as they view content.  To learn more about eye tracking from iMotions, check out their Pocket Guide for best practices.
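Conceptually, a gaze heatmap is a density map: each gaze sample contributes a smooth "blob" to a grid, so regions many viewers fixated on glow hottest. The sketch below (illustrative only, with invented gaze samples, and not how the portal's heatmaps are actually computed) accumulates a Gaussian kernel around each point and normalizes the result for color mapping.

```python
import numpy as np

def gaze_heatmap(gaze_points, width, height, sigma=40, grid_step=8):
    """Accumulate a Gaussian kernel around each (x, y) gaze sample."""
    xs = np.arange(0, width, grid_step)
    ys = np.arange(0, height, grid_step)
    gx, gy = np.meshgrid(xs, ys)
    heat = np.zeros_like(gx, dtype=float)
    for x, y in gaze_points:
        heat += np.exp(-((gx - x) ** 2 + (gy - y) ** 2) / (2 * sigma ** 2))
    return heat / heat.max()  # normalize to [0, 1] for color mapping

# Hypothetical gaze samples, mostly clustered on one point of interest:
points = [(950, 530), (970, 545), (940, 560), (1500, 200)]
heat = gaze_heatmap(points, width=1920, height=1080)
```

Overlaying such a grid on the video frame, with hot colors for high density, yields the frame-by-frame heatmaps described here.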

When looking at an aggregated trace of Brow Furrow (above), which may indicate focus and tension, we can see how in the first part, where the story narrative begins to pick up, viewers disperse their attention away from the actors, trying to decode visual cues from the surroundings for hints about the Beast. In the second part, however, while gaze scatters across the screen during the Tension Build scene, we witness some peaks in confusion (above normative levels). As trailer montages cut quickly between scenes, they generate not only suspense and anticipation but some confusion too.

During the Final scene, when the family is in the jeep being attacked by the Beast, we can see viewers’ attention shift from the characters to trying to locate the Beast as they hear an impending attack. This results in genuine Surprise (Brow Raise) expressed in the moment, which carries into the final branding moment of the trailer and supports better memorability (below).

Within the Affectiva portal, you will be able to locate instances like these and more within our dashboard; just as you would get the moment-by-moment Emotion AI tracelines, you can also capture the heatmaps frame-by-frame.  

How can Emotion AI and eye tracking help you?

Combining Emotion AI and eye tracking technologies will now allow you to bring together the emotional and attentional components of your research findings.  These two methodologies can be used together not just for entertainment content testing, but also for advertisements in the form of video and static content, with brand assets, scenes, or points of interest for which you want to capture gaze and emotional response. 

As always, Affectiva’s technology does not require any kind of hardware or software installation: all we need is participant consent and a working webcam, from which our Emotion AI analyzes facial expression data. And with iMotions’ web-based eye tracking solution, only a webcam is needed to capture the gaze data.  The resulting heatmaps are available within Affectiva’s Media Analytics portal, without needing to sign up for an iMotions license.  This makes our solutions scalable for remote research and accessible, with all the data in one streamlined self-service system. 

Interested in learning more about Affectiva’s Emotion AI and iMotions’ eye tracking?  Contact us to schedule a demo!

