Our emotion-sensing and analytics technology began with groundbreaking research at MIT's Media Lab. Our industry-leading, patented science continuously pushes the boundaries of innovation in artificial intelligence. Using state-of-the-art computer vision and deep learning, we develop face and emotion algorithms (“classifiers”) that are trained and tested for broad coverage of nuanced emotional expressions, achieving high accuracy.
Our science platform is unparalleled in the field of emotion-aware AI, as it leverages massive amounts of spontaneous and natural data that we have gathered. Our emotion data repository consists of more than 3.9 million faces analyzed from over 75 countries, amounting to over 40 billion emotion data points. This data fuels the training and testing of our classifiers, and provides unique insights and analytics on which we build robust and relevant norms and benchmarks.
Affectiva has the largest patent portfolio of any start-up in this field, with 7 patents granted and over 30 pending patent applications.
Affdex delivers discrete and continuous emotion metrics — measured moment-by-moment from a single face or multiple faces simultaneously, in a video or still image. Accuracy and scale are achieved by highly precise emotion classifiers, trained using deep learning and massive amounts of data. This level of refinement yields robust and trustworthy emotion and other face-related metrics that fuel our norms and predictive analytics.
We measure 7 emotions (anger, sadness, disgust, joy, surprise, fear and contempt), as well as valence and engagement. Metrics also include 15 facial expressions, gender, eyeglasses, head pose and lighting conditions, and we map facial expressions to emojis.
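As a rough illustration of working with per-frame metrics like those listed above, here is a minimal Python sketch. The JSON field names and score ranges are hypothetical, not the actual Affdex schema:

```python
import json

# Hypothetical per-frame metrics payload; field names and the 0-100
# score convention are illustrative, not the real Affdex output format.
frame = json.loads("""
{
  "emotions": {"anger": 1.2, "sadness": 0.4, "disgust": 0.1,
               "joy": 87.3, "surprise": 5.6, "fear": 0.2, "contempt": 0.9},
  "valence": 62.0,
  "engagement": 74.5,
  "expressions": {"smile": 91.0, "browRaise": 12.5},
  "appearance": {"glasses": false}
}
""")

# Pick the dominant emotion for this frame by highest score.
dominant = max(frame["emotions"], key=frame["emotions"].get)
print(dominant, frame["emotions"][dominant])  # joy 87.3
```

A real integration would receive such metrics frame by frame from the SDK or cloud API rather than from a static JSON string.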
Our technology is built on rigorous science that is tried and tested in real-world applications. Affdex classifiers are trained on naturally occurring data — not data posed under optimal lab conditions. This enables our algorithms to accurately analyze faces “in the wild”, under conditions typified by low lighting and awkward head angles.
With a robust infrastructure built for global scale, Affdex easily meets demand, whether you build digital experiences and apps that run on-device, or integrate emotion insights and analytics into your data framework. Affdex can run in the cloud or on multiple platforms, including iOS, Android and Windows.
Emotions are the primary drivers of attention, perception, memory, human behavior and decision making. Our analytics turn emotion metrics into actionable business insights. Using our massive emotion data repository, we build norms and benchmarks that provide insight into how humans express emotions across cultures, geographies, genders and ages, and how they respond to different types of content and digital experiences. Our emotion analytics are cross-culturally validated, confirming that while key facial expressions are universal, their magnitude varies by region.
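One simple way a metric can be benchmarked against a norm, sketched here with made-up numbers, is a percentile rank against a normative sample of previously tested content:

```python
from bisect import bisect_left

def percentile_rank(norm_sample, value):
    """Percentage of the normative sample scoring strictly below `value`."""
    sorted_sample = sorted(norm_sample)
    return 100.0 * bisect_left(sorted_sample, value) / len(sorted_sample)

# Hypothetical norm: mean joy scores for 10 previously tested ads.
joy_norm = [12.0, 18.5, 22.1, 25.0, 31.4, 35.9, 41.2, 48.7, 55.3, 63.8]

# A new ad with a mean joy score of 50.0 outperforms 80% of the norm.
print(percentile_rank(joy_norm, 50.0))  # 80.0
```

In practice a norm would be segmented by region, content category and demographic before a comparison like this is meaningful.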
In media and advertising, our analytics can predict whether content is likeable and memorable, and whether it drives consumer behavior such as purchasing, sharing or watching a video to completion. Brands and market researchers use our norms to optimize brand messaging, content effectiveness and media placement. Our entertainment clients use our analytics to uncover viewer engagement with movie trailers and television pilots.
Emotion analytics can also augment the consumer experience in healthcare, automotive, robotics, video communication, corporate training and HR. In education, emotion analytics can be an early indicator of student engagement, driving better learning outcomes. In gaming, we can augment data analytics with real time player emotion metrics that trigger changes in game play, or help predict when a player will quit or purchase.
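The gaming scenario above — real-time emotion metrics triggering changes in gameplay — could, in principle, be wired up as a simple threshold rule on a streaming metric. This is a hypothetical sketch, not the SDK's actual callback interface:

```python
from collections import deque

def frustration_trigger(anger_stream, window=3, threshold=60.0):
    """Yield True when mean anger over the last `window` frames exceeds
    `threshold` — e.g., a cue for the game to ease its difficulty."""
    recent = deque(maxlen=window)
    for score in anger_stream:
        recent.append(score)
        yield len(recent) == window and sum(recent) / window > threshold

# Hypothetical per-frame anger scores (0-100) from an emotion classifier.
anger = [10.0, 20.0, 55.0, 70.0, 80.0, 75.0, 30.0]
print(list(frustration_trigger(anger)))
```

A production system would typically smooth the metric more aggressively and debounce the trigger so a single noisy frame cannot flip the game state.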
Over 1,400 brands, including 1/3 of the Global Fortune 100, use Affectiva's technology to understand consumer emotional engagement with digital content.