Smart Eye extends its Affectiva Emotion AI technology with new Conversational Engagement and Conversational Valence metrics that use facial expression analysis to provide deeper insight into consumer responses in online qualitative research studies.


BOSTON–(BUSINESS WIRE)–Smart Eye today announced new capabilities in its category-defining Affectiva Emotion AI that provide deeper insights for online qualitative research than were previously available. This release adds Conversational Engagement and Conversational Valence metrics that use facial expression analysis to understand the emotional states and reactions of participants speaking in online qualitative research studies, such as focus groups and verbatim video feedback.

The new Conversational Engagement and Conversational Valence metrics augment the “human touch” of study moderators, who can now gain additional insight more quickly and effectively during online studies. This has become critically important during the pandemic, when research studies that were previously conducted in person moved online.

Affectiva’s Emotion AI technology is used by 70 percent of the world’s largest advertisers and 28 percent of Fortune Global 500 companies to understand viewers’ emotional reactions to content and experiences, maximizing brand ROI. With the help of Emotion AI, clients can test the unbiased and unfiltered emotional responses that consumers have to brand content, such as video ads and longer TV programming. Affectiva’s technology provides validated, exclusive measures that give confidence in market performance and clear guidance on the emotional role of a brand. The device-agnostic system works across mobile, tablet, desktop and physical environments, and works with optical sensors, standard webcams, and near-infrared and RGB cameras. Today’s announcement expands the capabilities and uses of the company’s market-leading Affectiva Emotion AI.


Conversational Metrics: Bringing the “human touch” and deep analysis back to a virtual research environment

While Affectiva’s Emotion AI for Media Analytics has traditionally analyzed people watching content and expressing emotional responses to that content, today’s announcement brings an interactive element to the technology. In a world where testing has moved online, and is largely expected to stay there, it’s important for focus group moderators to be able to pick up on the subtle discrepancies between what is said and what is felt in a virtual environment.

The Conversational Engagement and Conversational Valence metrics have been developed to support the research community whose business it is to talk directly to users of services. Built on deep learning, the new metrics account for the distortions in facial expression produced when people speak, allowing more accurate inference of responses based on those expressions. An adaptive version of the metrics is particularly suited to focus group discussion videos with multiple participants: using a speech detector, it applies the new conversational metrics only to those sequences and individuals who are speaking, so the optimal measures are deployed whether people are speaking or reacting to the speech of others, as demonstrated in this video.
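For illustration only, the sketch below shows how such speech-gated analysis might be structured in practice. It is a minimal, hypothetical example: the class and function names are stand-ins invented for this sketch and are not the Affectiva SDK API, and the placeholder models simply return fixed scores to show the control flow.

    # Hypothetical sketch of speech-gated conversational metrics (not the Affectiva SDK).
    from dataclasses import dataclass
    from typing import Callable, List, Tuple

    @dataclass
    class FaceObservation:
        participant_id: str
        timestamp: float      # seconds into the session video
        is_speaking: bool     # output of a speech detector (e.g. lip motion or audio VAD)

    @dataclass
    class MetricResult:
        participant_id: str
        timestamp: float
        engagement: float     # 0..1
        valence: float        # -1..1
        source: str           # which model produced the scores

    def analyze_session(
        observations: List[FaceObservation],
        conversational_model: Callable[[FaceObservation], Tuple[float, float]],
        viewing_model: Callable[[FaceObservation], Tuple[float, float]],
    ) -> List[MetricResult]:
        """Apply conversational metrics only while a participant is speaking;
        fall back to standard viewing metrics while they are listening."""
        results = []
        for obs in observations:
            if obs.is_speaking:
                engagement, valence = conversational_model(obs)
                source = "conversational"
            else:
                engagement, valence = viewing_model(obs)
                source = "viewing"
            results.append(MetricResult(obs.participant_id, obs.timestamp,
                                        engagement, valence, source))
        return results

    if __name__ == "__main__":
        # Placeholder models returning fixed scores, just to demonstrate the gating logic.
        demo = analyze_session(
            [FaceObservation("p1", 0.0, True), FaceObservation("p2", 0.0, False)],
            conversational_model=lambda obs: (0.8, 0.3),
            viewing_model=lambda obs: (0.5, 0.1),
        )
        for result in demo:
            print(result)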

Smart Eye customers that augment their qualitative research with this Affectiva technology can validate the “gut feel” their moderators have when conducting research with participants. Conversational engagement and valence measurements can be used as an efficient tool to analyze study data and quickly identify emotional moments. These metrics can also provide more compelling evidence in debriefs of the emotional power of the topics tested.

These metrics can be applied to most areas of qualitative research, and no other facial expression analysis tool yet offers this capability. While the focus remains on facial expressions, these metrics add a unique additional dimension designed to be used alongside respondents’ verbal answers for a more complete picture of their responses.

“Affectiva has been helping market researchers for over a decade now understand how consumers emotionally respond to all sorts of content,” said Dr. Rana el Kaliouby, Co-Founder and former CEO of Affectiva, now Deputy CEO of Smart Eye. “The addition of conversational engagement and valence to our Emotion AI provides an even more robust way to detect viewer engagement and excitement when they are discussing ideas and concepts, rather than simply viewing content. I am excited that we are now able to bring this unique capability to our customers who have eagerly anticipated it.”

“Access to this kind of analysis has been a game changer for our moderators,” commented Sarah Gorman, Director at Two Ears One Mouth and a Smart Eye customer. “It’s exciting not only to see the next iteration of this innovative Emotion AI technology, but to collaborate with a partner that understands and prioritizes working with us to solve for the inevitable new challenges we face today and in the future.”

Conversational Engagement and Conversational Valence will be made available to Affectiva Emotion AI customers this September in both the Affectiva SDK and the core media analytics product.

Additionally, Graham Page, Global Managing Director of Affectiva Media Analytics, will present on this in more detail at the Neuromarketing Web Forum on September 29 in Berlin. In his session, he will share how in-the-moment emotional reactions to content have helped businesses understand how consumers’ emotional responses to campaigns are changing in the face of a changing world. See more about his session here.


For More Information

For more information on Affectiva’s media analytics offering, visit: https://go.affectiva.com/affdex-for-market-research.


About Smart Eye and Affectiva

Smart Eye is the global leader in Human Insight AI, technology that understands, supports and predicts human behavior in complex environments. We are bridging the gap between humans and machines for a safe and sustainable future. Our multimodal software and hardware solutions provide unprecedented human insight in automotive and behavioral research—supported also by Affectiva and iMotions, companies we acquired in 2021.

In behavioral research, Smart Eye offers the world’s most advanced eye tracking systems for analyzing human behavior. Providing unparalleled performance in complex environments, our carefully crafted instruments enable deeper insights into human behavior and human-machine interaction in automotive, aviation, assistive technology, media & marketing, behavioral science and many more fields. Today, our industry-leading technology is used by NASA, Airbus, Boeing, Toyota, Daimler, Audi, GM, Harvard University and hundreds of research organizations and universities around the world.

Our Affectiva Media Analytics division provides the world’s largest brands and market researchers with a deeper understanding of how consumers and audiences engage with content, products, and services. Our industry-leading Emotion AI is used by 70 percent of the world’s largest advertisers and 28 percent of the Fortune Global 500 companies. iMotions provides the world’s leading biosensor software platform, enabling real-time synchronization of data streams from multiple sensors for advanced research and training in academic and commercial sectors.

In automotive, Smart Eye offers road-ready Driver Monitoring Systems and next-level Interior Sensing solutions. Our technology has been selected by 14 of the world’s leading car manufacturers, including BMW and Geely, for 94 car models. Smart Eye also provides complete hardware and software solutions for fleet and small-volume OEMs. As the preferred partner to the automotive industry, Smart Eye is leading the way toward safer, more sustainable transportation and mobility experiences, enhancing wellness, comfort, and entertainment.

Smart Eye was founded in 1999, is publicly traded and headquartered in Sweden with offices in the US, UK, Germany, Denmark, Egypt, Singapore, China and Japan. Learn more at www.smarteye.ai.


Contacts

Hailey Melamut
Vice President of PR, Walker Sands
Email: hailey.melamut@walkersands.com


See original release on Business Wire here