Emotion AI Media Analytics Qual

Segmenting Viewers with Both Qualitative & Emotion AI Insights



Creative development in the entertainment industry often relies on qualitative researchers using their observations and intuition to understand participants' feelings. However, sometimes participants won't, or simply can't, articulate the emotion they're experiencing. That's where Affectiva's Emotion AI technology comes in! Emotion AI can support the qualitative researcher through easily administered technology that builds confidence in the user experience and adds scientifically validated insights, all while complementing and enhancing the researcher's expertise.

Affectiva recently collaborated with Vidlet to assist a movie studio with a trailer pre-test. The goal was to capture differentiated viewer responses, in a natural environment and “in their own words,” while also eliciting the unfiltered and underlying emotional response.

On a recent podcast episode, Affectiva's Director of Marketing, Ashley McManus, was joined by Alex Duckett, Account Director of Media Analytics at Affectiva, and Paige Guge, Design Researcher at Vidlet, to discuss the ins and outs of the case study.

For those who are unfamiliar with Vidlet, can you explain briefly what your offerings are?

Paige: Vidlet is an insights company where we use the power of mobile video to learn how customers think, feel and behave in their natural environment. We do this by using three main types of studies. The first and most common is called an empathy study, which aims to identify a customer’s needs, pain points and delights. The second type of study, known as concept testing, allows for early customer input on designs, from low fidelity mock-ups to refined prototypes. The third type, a diary study, provides continued interaction with participants over a longer period of time, allowing patterns to be identified. 

Vidlet uses the cell phones we carry with us every day to delve into people's lives, seeing them in their natural environment and better understanding what they think and how they feel.

Alex and Paige, you both came together to present a talk for Qual360 — can you give us a high-level description of the background of your presentation and what you were looking to solve for?  

Paige: The goal of this project was to take a piece of content — in this case, a trailer for an upcoming movie — and understand in a holistic way how it was going to land in the market with target consumers. The client was interested in reactions to the trailer: viewers' overall thoughts on the film in general, but also the trailer's editing. They wanted robust, quantitative data to get a full scope of understanding of how the film would resonate with viewers. It's a unique trailer that doesn't fit neatly into any genre, so the client wasn't sure exactly how to describe it and wanted to see from every angle how people would react.

Alex: I think from the Affectiva perspective, our agenda was driven by trying to push methodologies in this space and disrupt the way we understand creative content, such as a movie trailer. Many people will be familiar with the dial test that often gets used in this area — someone's watching a film or a film trailer and turning a dial to the right or left depending upon how interested they are in the content they’re watching. 

While dial testing can be insightful, using Emotion AI technology provides a much deeper understanding of viewers' in-the-moment facial and emotional responses as they watch different types of content. Emotion AI opens that door a bit further, providing insight into how someone is really feeling in the moment, because by the time your brain has had a chance to think and turn a dial one way or the other, social norms and cultural biases have already had an impact.

Let’s talk methodology — can you dive deeper on what this process looked like, and how Affectiva and Vidlet worked together to get these results?

Alex: It was really a team effort — Affectiva provided the building blocks for the insights we were generating, and then Paige and the team at Vidlet brought together multiple data sources to really make sense of it. Thinking about Affectiva Emotion AI data specifically, we were focused on attaining a real-life understanding of viewers consuming content in their own natural environment.

We shared a web link for the viewer to open on their own device — computer, tablet, mobile, whatever it may be — and the viewer would watch the content as they normally would in their home environment. Leveraging the device's camera, our software then captured the viewer's moment-by-moment reactions based on their facial expressions, providing a really rich picture of exactly which moments evoked strong emotion and engagement as they watched the content.

Affectiva's large data set, diverse in terms of demographics and cultural groups, provided a full, contextual understanding of how viewers were feeling during the trailer experience. With our unbiased emotional response data, we were able to feed those signals into the wider project for Paige to interpret.

Paige: After the viewers watched the content, we immediately asked them to open their Vidlet app and record themselves talking about what they saw and how they experienced that content. So it was kind of a two-fold process — we'd already collected their moment-by-moment emotional reactions to the content, and now we were getting, in their own words, how the content made them feel and the "why" behind it. We also asked participants to answer a few open-ended survey questions, which were incorporated into our analysis as well.

Can you give a couple of examples or summarize what their thoughts were around the trailer? 

Paige: When analyzing the data, it was interesting to see that there were some demographic groups that this trailer absolutely landed with, and others that it didn’t resonate with at all. The client for this project wasn't sure how the trailer would be received by different age groups, and was especially interested in determining how young is “too young” for this content. Through running this study, we learned that the plot was completely lost on children 13 and younger, and in some cases the children were scared and confused throughout. (See graph below)

[Graph: emotional responses of viewers aged 13 and under while watching the trailer]

Looking at the analysis of the 1,300 boys aged 13 and under, it was so validating to see the boys' recounting of their experience substantiated by Affectiva's Emotion AI technology. In contrast, the mid-to-late teens and early-20s groups were really intrigued by the trailer, with some wishing it shared more of the plot. Again, it was great to see the emotional peaks in the data echoing their enthusiasm and interest as described "in their own words."

Understanding both ends of the spectrum — the viewers the trailer resonated with as well as those it didn't reach — is super valuable information for the client. These insights allow the movie studio to make the necessary adjustments to ensure they're reaching their key demographic.

What are some of the limits of collecting demographic responses? How can we create an emotional segmentation and why do we even need a new approach to understanding trailer and creative content?

Alex: What we discovered through this whole process was that focusing on demographic criteria alone leaves you with a really simplistic and potentially misleading picture. Certainly, looking at our data, you could see that there were distinct and different types of emotional response, but you couldn't just neatly segment them into a particular age range, for example. There was much more to it than that; the picture was far more layered. It was especially interesting to then see Vidlet's findings layered in as they started to pull apart the different threads that separated and united people in terms of their responses.

Paige: My take would be that it is just way too simplistic to say that demographic criteria define a person's response. Looking at the group we call the intrigued viewers, or "opportunity group": as they watched the trailer, they started out interested, but then there was a moment along the way where they disengaged.

Using Affectiva’s Emotion AI analysis allows us to go back and see the exact moment where the viewer emotionally disengaged while watching. Pairing those metrics with viewer commentary and understanding exactly what about the trailer is losing people provides rich context that you wouldn’t get from looking at demography alone.
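As a rough illustration of the idea Paige describes — scanning a moment-by-moment engagement signal to find where a viewer drops off — here is a minimal sketch. The data, function name, and thresholds are all hypothetical; this is not Affectiva's actual API or method, just one simple way such a time series could be flagged for a sustained dip.

```python
# Hypothetical sketch: locate the moment a viewer disengages from a
# per-second engagement time series (values 0-100, as an emotion AI
# platform might report). Thresholds and names are illustrative only.

def find_disengagement(scores, threshold=50, window=3):
    """Return the first second where engagement stays below `threshold`
    for `window` consecutive seconds, or None if it never does."""
    run = 0
    for t, score in enumerate(scores):
        run = run + 1 if score < threshold else 0
        if run == window:
            return t - window + 1  # the second the sustained drop began
    return None

# Synthetic viewer: starts out engaged, then steadily drops off.
engagement = [80, 82, 85, 78, 75, 74, 70, 66, 60, 55,
              52, 48, 44, 40, 38, 35, 33, 30, 28, 25]

moment = find_disengagement(engagement)
print(moment)  # second at which engagement first stayed below 50
```

An analyst could then jump to that timestamp in both the trailer and the viewer's Vidlet commentary to understand the "why" behind the drop.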

The Bottom Line

While qualitative researchers are experts at identifying participants' often unspoken emotions, Affectiva's Emotion AI technology can support the researcher's own perceptions of emotional subtleties and shared feelings. The cutting-edge technology can highlight crucial moments when a respondent may be feeling something different from what they are saying, and pinpoint moments that might not have come up in conversation and would otherwise have been missed. Using multiple robust measures together provides a deeper, more nuanced understanding that adds a layer of scientific validity to qualitative research.

Interested in seeing how you can strengthen your qualitative research by integrating Affectiva's Emotion AI technology? Contact us to learn more!


