
AI Case Study

MARS Marketing improved the accuracy of purchase-intent prediction in marketing research by 8% using facial analysis during ad viewing

MARS Marketing conducted a marketing research study that tracked customers' facial emotions while they watched ads. The aim was to determine whether emotional state could be used to predict likelihood to buy products, and in particular whether it does so better than traditional marketing research techniques. They found that emotional state complemented research subjects' self-reported purchase intent, boosting predictive power.

Industry

Professional Services

Consultancy And Business Services

Project Overview

"The primary goal of the partnership between MARS Marketing and Affectiva was to explore the role that emotions in
advertising play in driving sales. MARS selected a subset of ads from their database that could be quantified as good, average, and bad with high confidence. Most were entertaining but had performed quite differently in the market: some performed poorly, others did very well. The ads were all aired between 2001 and 2012. In total, 115 ads were tested, from 4 different regions (France, Germany, UK and US) and four product categories (pet care, instant foods, chocolate and chewing gum). To systematically explore how emotions in advertising drive sales, our teams designed a large-scale study to collect facial emotion responses to ads for which we also had sales data.

Facial coding technology provides an objective measure of emotion that is unobtrusive, cost-effective and scalable. The universality of facial expressions, the ubiquity of webcams and recent developments in computer vision make automatic facial
coding an objective, passive and scalable measure of emotion. Moreover, unlike selfreport, facial coding enables the remote measurement of subtle and nuanced expressions of emotion on a moment-by-moment basis. Affectiva’s automated facial coding technology captures facial action units and identifies the main dimensions of emotion (valence and arousal/intensity) and a range of discrete emotion states, e.g., enjoyment, surprise, disgust/dislike, confusion and skepticism. The technology is made
available as a cloud-based platform, allowing for seamless integration with online surveys as well as offline venue-based studies. Affectiva’s facial coding platform is currently used on an ongoing basis to pre-test ad copies globally for major brands
including Unilever, Coca Cola and others."
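
To make the moment-by-moment measurement described above more concrete, below is a minimal sketch of how frame-level facial-coding output (valence, arousal, a smile/enjoyment score per frame) might be summarised into per-viewing features. The metric names and the choice of summary statistics are assumptions for illustration, not Affectiva's published API or the study's actual feature set.

```python
# Hypothetical sketch: summarising a moment-by-moment emotion trace into
# per-viewing features. Metric names and summary choices are assumptions.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Frame:
    timestamp: float   # seconds into the ad
    valence: float     # -1 (negative) .. +1 (positive)
    arousal: float     # 0 (calm) .. 1 (intense)
    smile: float       # 0 .. 1 probability of a smile / enjoyment

def summarise_viewing(frames: list[Frame]) -> dict[str, float]:
    """Collapse a frame-by-frame emotion trace into summary features."""
    last_quarter = frames[int(len(frames) * 0.75):]  # weight on the ad's ending
    return {
        "valence_mean": mean(f.valence for f in frames),
        "valence_peak": max(f.valence for f in frames),
        "arousal_mean": mean(f.arousal for f in frames),
        "smile_peak": max(f.smile for f in frames),
        "valence_end": mean(f.valence for f in last_quarter),
    }
```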

Reported Results

The research found that "the automatically coded spontaneous facial actions were more telling of the short-term sales effectiveness than the self-reported purchase intent or likability of the brand. Facial emotion responses could complement traditional self-report methods; in this case study, boosting the discriminative power of self-reported measures by 8%."
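
The comparison behind that finding can be illustrated with a short sketch: train the same classifier on self-report features alone and on self-report plus facial features, then compare cross-validated accuracy. The data and feature names below are synthetic placeholders; the point is the structure of the comparison, not the numbers it prints.

```python
# Illustrative comparison only (not the study's pipeline): self-report
# features alone vs. self-report combined with facial features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_ads = 115                                  # number of ads in the study
y = rng.integers(0, 2, size=n_ads)           # placeholder good/bad sales labels

X_self_report = rng.normal(size=(n_ads, 2))  # e.g. purchase intent, brand likability
X_facial = rng.normal(size=(n_ads, 5))       # e.g. summarised valence/arousal features
X_combined = np.hstack([X_self_report, X_facial])

svm = SVC(kernel="rbf")
acc_self = cross_val_score(svm, X_self_report, y, cv=5).mean()
acc_combined = cross_val_score(svm, X_combined, y, cv=5).mean()
print(f"self-report only: {acc_self:.2f}, self-report + facial: {acc_combined:.2f}")
```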

Technology

"To quantify a person’s emotional response, algorithms process each frame of the face video and locate the main features on the face (e.g., mouth, eyebrows). The movement, shape and texture composition of these regions are used to identify facial action units such as an eyebrow raise or smirk. Machine learning classifiers then map facial texture and movement to emotional states. The average accuracy when tested on spontaneous images is over 90%; the smile detector is the most accurate detector with accuracy of 97%. Support Vector Machines (SVM) are a class of statistical classification models that have been used effectively in many disciplines where prediction is a objective (Wang 2005). Therefore, SVMs were chosen to perform classification of ads in
this preliminary analysis. Cost and gamma parameters, two inputs to SVMs that must be chosen, were varied in the validation process performed on the training data. We use a start of the art classifier that learns the profile of a good versus bad ad. We compare the performance of our models with a baseline that represents chance (i.e., the flip of a coin to decide whether the ad would perform well or badly in sales.)".
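
A minimal sketch of the setup described above, assuming scikit-learn as the tooling: an RBF-kernel SVM whose cost (C) and gamma parameters are tuned by cross-validation on the training data, compared against a chance-level baseline. The feature matrix and labels are placeholders standing in for the per-ad facial-response features and good/bad sales outcomes.

```python
# Sketch of the described classification: SVM with C and gamma tuned by
# cross-validation, versus a coin-flip baseline. X and y are placeholders.
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(115, 8))       # placeholder facial-emotion features per ad
y = rng.integers(0, 2, size=115)    # placeholder good (1) / bad (0) sales labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=1
)

# Vary cost (C) and gamma during validation on the training data.
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X_train, y_train)

# Chance baseline: a random guess at whether the ad sells well or badly.
chance = DummyClassifier(strategy="uniform", random_state=1).fit(X_train, y_train)

print("SVM accuracy:   ", search.score(X_test, y_test))
print("Chance baseline:", chance.score(X_test, y_test))
```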

Function

Marketing

Marketing Research Planning

Background

"Like many advertisers, MARS is faced with the challenge of
assessing which ads are more likely to be effective from a sales point of view. It is critical to understand why, for instance, only one of two humorous ads that MARS had produced in the Chocolate category succeeded in boosting sales. What are the emotion profiles of the ad that led to sales success? Beyond predicting sales effectiveness, MARS also sought insights into what really
makes an ad work in order to identify “best practices” that can inform the design of future advertising to increase ad effectiveness."

Benefits

Data

"To train the emotion classifiers, our framework leverages a growing repository of realworld facial data, which at the time of writing exceeds over 300 million facial frames (sourced from 500,000+ facial videos) of real-world emotion data that represents a multitude of age ranges, ethnicities and cultures. Each ad was viewed by ~100 respondents; over 11,000 responses were
captured making this to date the largest existing dataset of facial emotion responses linked with sales data."
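
Since each ad is watched by roughly 100 respondents, the respondent-level responses have to be rolled up into one feature row per ad before they can be linked with sales data. A small, hypothetical sketch of that aggregation step is below; the column names are illustrative only.

```python
# Hypothetical sketch: aggregating respondent-level facial responses into one
# ad-level row. Column names ("ad_id", "valence_mean", "smile_peak") are
# illustrative, not the study's actual schema.
import pandas as pd

responses = pd.DataFrame({
    "ad_id":        ["ad_001", "ad_001", "ad_002", "ad_002"],
    "valence_mean": [0.12, 0.30, -0.05, 0.02],
    "smile_peak":   [0.80, 0.95, 0.40, 0.55],
})

# Average each facial metric across the respondents who watched the same ad.
ad_level = responses.groupby("ad_id").mean(numeric_only=True)
print(ad_level)
```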
