Overview

Artificial emotional intelligence, or Emotion AI, is also known as emotion recognition or emotion detection technology. In market research, it is commonly referred to as facial coding.

Humans use many non-verbal cues, such as facial expressions, gestures, body language and tone of voice, to communicate their emotions. The human face provides a rich canvas for emotion: we are innately programmed to express and communicate emotion through facial expressions. Our Emotion AI unobtrusively measures unfiltered and unbiased facial expressions of emotion, using any optical sensor or just a standard webcam.

Our technology first identifies a human face, in real time or in an image or video. Computer vision algorithms identify key landmarks on the face – for example, the corners of your eyebrows, the tip of your nose, and the corners of your mouth. Machine learning algorithms then analyze pixels in those regions to classify facial expressions. Combinations of these facial expressions are then mapped to emotions.
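
To make the flow of those stages concrete, here is a minimal sketch of a detect → landmark → classify → map pipeline. It is not our production implementation: it uses OpenCV's stock face detector, and the expression classifier and the expression-to-emotion mapping are toy stand-ins for the trained models described above.

```python
# Illustrative sketch of the detect -> landmark -> classify -> map stages.
# Not the production pipeline: classify_expressions and EMOTION_MAP below
# are toy stand-ins for trained models.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Toy mapping from facial expressions to the emotions they contribute to.
EMOTION_MAP = {"smile": "joy", "brow_furrow": "anger", "brow_raise": "surprise"}

def classify_expressions(face_pixels):
    """Stand-in for a trained model that scores expressions from pixels
    around facial landmarks (eyebrow corners, nose tip, mouth corners)."""
    return {"smile": 0.9, "brow_furrow": 0.1, "brow_raise": 0.2}

def analyze_frame(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Step 1: detect faces in the frame.
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        # Steps 2-3: a real system locates landmarks and classifies the
        # pixels in those regions into facial expressions.
        expressions = classify_expressions(gray[y:y + h, x:x + w])
        # Step 4: map combinations of expression scores onto emotion scores.
        emotions = {}
        for expr, score in expressions.items():
            emotion = EMOTION_MAP[expr]
            emotions[emotion] = max(emotions.get(emotion, 0.0), score)
        results.append({"face": (x, y, w, h), "emotions": emotions})
    return results
```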

Like the face, speech contains strong signals for human emotions. Our science team is now working on technology for analyzing a speaker's tone of voice, and we plan to make it available in our products soon.

Metrics

In our products, we measure 7 emotion metrics: anger, contempt, disgust, fear, joy, sadness and surprise. In addition, we provide 20 facial expression metrics. In our SDK and API, we also provide emojis, gender, age, ethnicity and a number of other metrics. Learn more about our metrics here.
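
As a rough illustration of how these metrics might be consumed per frame, consider the sketch below. The field names and score scale are hypothetical, chosen only for the example; they are not our actual SDK or API response schema.

```python
# Hypothetical per-frame result; field names and 0..1 score scale are
# illustrative only, not the actual SDK/API response schema.
frame_result = {
    "emotions": {"anger": 0.02, "contempt": 0.01, "disgust": 0.00,
                 "fear": 0.03, "joy": 0.91, "sadness": 0.01, "surprise": 0.12},
    "expressions": {"smile": 0.95, "brow_raise": 0.30},
    "appearance": {"gender": "female", "age_range": "25-34", "glasses": False},
}

# Pick the dominant emotion for the frame.
dominant = max(frame_result["emotions"], key=frame_result["emotions"].get)
print(dominant, frame_result["emotions"][dominant])  # joy 0.91
```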

Data and accuracy

Our algorithms are trained using our emotion data repository, which has now grown to more than 5 million faces analyzed in 75 countries, and we continuously test them to provide the most reliable and accurate emotion metrics. Using deep learning approaches, we can now quickly tune our algorithms for high performance and accuracy, and our key emotions achieve accuracy in the high 90s. Our test set, comprised of hundreds of thousands of facial frames, is sampled from the emotion data repository. This data represents real-world, spontaneous facial expressions made under challenging conditions, such as varying lighting, different head movements, and variances in facial features due to ethnicity, age, gender, facial hair and glasses. You can find more information on how we measure our accuracy here.
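
The basic idea behind such testing can be sketched simply: score each labeled test frame and compare the predictions to human-coded ground truth, per emotion. The snippet below is a simplified illustration of per-emotion accuracy, not our actual evaluation protocol.

```python
# Simplified per-emotion accuracy over labeled test frames. A real
# evaluation uses far larger sets and richer metrics; this only
# illustrates the idea of comparing predictions to coded ground truth.
def per_emotion_accuracy(predictions, ground_truth, emotions):
    """predictions / ground_truth: lists of dicts mapping emotion -> bool."""
    accuracy = {}
    for emotion in emotions:
        correct = sum(p[emotion] == g[emotion]
                      for p, g in zip(predictions, ground_truth))
        accuracy[emotion] = correct / len(ground_truth)
    return accuracy

emotions = ["anger", "contempt", "disgust", "fear", "joy", "sadness", "surprise"]
preds = [{"joy": True, **{e: False for e in emotions if e != "joy"}}]
truth = [{"joy": True, **{e: False for e in emotions if e != "joy"}}]
print(per_emotion_accuracy(preds, truth, emotions))
```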

How to get it

Our emotion recognition technology is available in several products, from an easy-to-use SDK and API for developers to robust solutions for market research and advertising.