Amplify advertising effect using facial emotion recognition

  • by Magnus Linde
  • 5 min


Market research expert Magnus Linde uses an example study to explain why and how testing ad concepts for emotional impact can improve the success of commercials.

Memorable advertising evokes emotions

If the first principle of advertising is to get seen, then the second is that the ad must be felt. Successful advertising depends on getting both right. But while the first is intuitively easy to understand – what is unseen cannot affect anyone – the second may be less obvious. In business, we are trained to think rationally and to set emotions aside. Or as the neuroanatomist Jill Bolte Taylor puts it, “We live in a world where we are taught from the start that we are thinking creatures that feel. The truth is, we are feeling creatures that think.” Breakthroughs in brain science over recent decades have shown that emotions drive all human action. Making customers feel is thus key to gaining their attention, committing the ad to memory, and ultimately encouraging them to buy our products. Ad testing with traditional market research, which relies on conscious and rational reporting, therefore often fails to capture the whole picture. And that gap between data and reality may translate into costly failures. Luckily, there are methods to address this problem and bridge the gap.

Why you should leverage eye tracking and facial coding in ad testing

Thanks to technological advances, it is now possible to implicitly measure human responses to advertising such as TV commercials and product placements, social media video ads, or YouTube pre-rolls. Facial coding and webcam eye tracking are two easy-to-use methods for measuring those two crucial principles of advertising. Used in combination, they turn the otherwise elusive powers of attention and emotion into unambiguous data. They are also quick and cost-effective to run. Here, let me show you what we found out about a couple of Tobii Dynavox video ads in less than 48 hours, while spending just a few hundred dollars.

An example: Evidence-based social media ad testing

Nowhere is the war for attention more intense than in social media. So, to make sure the two video ads for Tobii Dynavox would hit the mark, we tested them with Sticky by Tobii. Participants saw just one of the videos in the test, and while they watched it, we tracked their attention and emotions via their webcam. Afterwards, we also asked how they liked the ads.

Listen to a Story - Google Assistant now in Snap Core First

Play and Control Music - Google Assistant now in Snap Core First

When the results came back, answers to the survey questions indicated an overwhelmingly positive response to the ads. Almost all respondents (over 90%) said their overall impression was ‘very’ or ‘somewhat positive’. Likeability is a key indicator of advertising effect, so this told us one important thing: the ad concept worked once we got people to stop and watch it.

However, it is one thing for people to like a video after viewing it to the end in a testing environment, where we have already secured their attention. But is it engaging enough to catch and hold people’s attention in the wild? That is why we also tested the ads with implicit research methods.

Turning to the emotional data, we saw that sadness appeared to be the dominant emotion. And while that might raise concerns in many other cases, here it was exactly what we were looking for. In the context of assistive technology, sadness indicates that the videos manage to evoke compassion and move the viewers enough to earn their attention.
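The kind of analysis described above can be sketched in a few lines. This is a minimal illustration, not the Sticky by Tobii pipeline: it assumes a hypothetical facial-coding output where each video frame carries one probability score per basic emotion, and finds which emotion dominates on average across the session.

```python
from statistics import mean

# Hypothetical per-frame facial-coding scores (format assumed for
# illustration only): one probability per basic emotion, per frame.
frames = [
    {"joy": 0.05, "sadness": 0.60, "surprise": 0.10, "neutral": 0.25},
    {"joy": 0.10, "sadness": 0.55, "surprise": 0.05, "neutral": 0.30},
    {"joy": 0.40, "sadness": 0.30, "surprise": 0.10, "neutral": 0.20},
]

def dominant_emotion(frames):
    """Average each emotion's score over all frames; return the highest."""
    emotions = frames[0].keys()
    averages = {e: mean(f[e] for f in frames) for e in emotions}
    return max(averages, key=averages.get)

print(dominant_emotion(frames))  # prints "sadness" for this toy sample
```

In a real study the same aggregation would run per participant and per video, so a dominant-sadness reading like ours reflects the whole panel rather than a single viewer.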

Emotional recognition - sadness using Sticky by Tobii Pro

Drilling deeper into the data, we could also see peaks of joy at the moments when the video shows the happiness that results from using the assistive technology. And eye tracking showed that almost all participants looked at the product name at the same moment joy peaked, leaving viewers with a positive brand connection.
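Combining the two data streams like this amounts to locating the joy peak on the emotion timeline and then testing whether each participant's gaze at that moment falls inside the area of interest (AOI) covering the product name. A minimal sketch, with all data shapes and values assumed for illustration:

```python
# Hypothetical joy time series: (timestamp in seconds, joy score).
joy_series = [(0.0, 0.10), (2.0, 0.15), (4.0, 0.70), (6.0, 0.30)]

# Hypothetical AOI: bounding box of the product name on screen, in pixels.
aoi = {"x": 100, "y": 50, "w": 300, "h": 80}

# Hypothetical gaze samples: participant -> list of (timestamp, x, y).
gaze_at = {
    "p1": [(4.0, 180, 90)],
    "p2": [(4.0, 600, 400)],
}

def joy_peak_time(series):
    """Return the timestamp at which joy is highest."""
    return max(series, key=lambda point: point[1])[0]

def in_aoi(x, y, aoi):
    """True if the gaze point (x, y) falls inside the bounding box."""
    return (aoi["x"] <= x <= aoi["x"] + aoi["w"]
            and aoi["y"] <= y <= aoi["y"] + aoi["h"])

peak = joy_peak_time(joy_series)
hits = sum(
    any(abs(t - peak) < 0.5 and in_aoi(x, y, aoi) for t, x, y in samples)
    for samples in gaze_at.values()
)
print(f"Joy peaks at {peak}s; {hits}/{len(gaze_at)} participants on product name")
```

The 0.5-second tolerance around the peak is an arbitrary choice for the sketch; a real analysis would match the sampling rates of the emotion and gaze streams.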

Emotional recognition - joy using Sticky by Tobii Pro

Overall, the data tells us we have a winning concept with a high probability of succeeding in the marketplace. We also got clues as to how to edit later versions for potentially even higher engagement. We know from earlier tests that minor editing can make a huge difference. A beer brand commercial, for example, got an earlier and stronger joy response and over 40% higher overall emotional engagement simply by trimming a 30-second spot down to 20 seconds (Sticky by Tobii research, 2017).

Emotion recognition and eye tracking add perspective

Summing up the findings reminds me of the old parable in which blind men describe an elephant by each feeling one part of the animal’s body: while each man’s description is correct for the part he touches, together they describe entirely different animals.

Similarly, in research, relying on a single data source risks missing the bigger picture. In our test, for example, survey responses alone would have left us in the dark about whether the ads were engaging enough to earn attention in the first place. Adding implicit research provided that crucial validation unambiguously.

In conclusion, ad testing with webcam eye tracking and facial coding enables all campaign stakeholders – from creative teams to media buyers – to get a broader, richer, and more accurate view of how their advertising performs. 

Written by

  • Magnus Linde

    Market Research Specialist and co-founder of Implicit Academy

    Magnus Linde is an experienced market researcher and analyst with 20+ years of experience. He also co-founded Implicit Academy, a network dedicated to spreading awareness and knowledge of behavioral economics, neuromarketing, and implicit market research, with a special focus on eye tracking, facial coding, EEG, and IAT.
