Watching people in conversation, you can usually tell when they lose interest or get confused or excited. But computers have a hard time telling a smile from a smirk, confusion from anger.
After examining thousands of faces from around the world, a Waltham start-up has figured out how to interpret these subtle shifts in expressions in real time. By reading tiny movements in the corners of the mouth, nose, and eyes and the shifting angles of the eyebrows, Affectiva Inc. can digitally decode facial messages — more effectively, it says, than others have done before.
The company’s facial-coding product, Affdex, has been used by advertisers, including Coca-Cola and Unilever, to gauge consumers’ reactions to commercials — one-fourteenth of a second at a time.
On Friday, Affectiva released a beta version of Affdex for mobile devices, so companies will be able to get the same kind of feedback from cameras on consumers’ smartphones and tablets — with their permission, of course.
Affectiva hopes the new platform will help broaden its customer base to include companies in the mobile gaming, online education, and health-tracking businesses, and virtually any other field where an app developer wants feedback.
“We understand that emotion is important in a whole bunch of applications,” said Rana el Kaliouby, who helped develop the basis for the Affdex technology while she was a research scientist at the Media Lab at the Massachusetts Institute of Technology.
Although most people do not have trouble interpreting facial expressions, computers struggle to distinguish between an individual’s features and his or her expressions, Kaliouby said.
At MIT and in early work at Affectiva, she developed an algorithm for identifying expressions. In a genuine smile, for example, the mouth curves up equally on both sides, while in a smirk it is uneven. Disgust and confusion both involve a furrowed brow, but disgust adds a wrinkled nose.
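The article does not detail the algorithm itself; as a rough illustration of the kind of geometric rules Kaliouby describes, the following Python sketch classifies a handful of expressions from facial-landmark measurements. It assumes a separate landmark detector supplies the coordinates, and every name and threshold is hypothetical.

# A minimal sketch, not Affectiva's actual code: classifying a few expressions
# from 2-D facial-landmark measurements supplied by some landmark detector.
# All names and thresholds are illustrative only.

from dataclasses import dataclass

@dataclass
class FaceMeasurements:
    left_corner: tuple[float, float]   # (x, y) of left mouth corner, in pixels
    right_corner: tuple[float, float]  # (x, y) of right mouth corner, in pixels
    mouth_center: tuple[float, float]  # (x, y) of mouth center, in pixels
    brow_gap: float                    # brow-to-eye distance; small = furrowed
    nose_wrinkle: float                # wrinkle intensity, 0.0 (none) to 1.0

def classify(m: FaceMeasurements) -> str:
    # Lift of each mouth corner above the mouth center (image y grows downward).
    # A genuine smile raises both corners roughly equally; a smirk lifts one side.
    left_lift = m.mouth_center[1] - m.left_corner[1]
    right_lift = m.mouth_center[1] - m.right_corner[1]
    if left_lift > 2 and right_lift > 2:
        asymmetry = abs(left_lift - right_lift) / max(left_lift, right_lift)
        return "smile" if asymmetry < 0.3 else "smirk"

    # A furrowed brow could mean confusion or disgust; a wrinkled nose
    # tips the call toward disgust.
    if m.brow_gap < 10:
        return "disgust" if m.nose_wrinkle > 0.5 else "confusion"

    return "neutral"

if __name__ == "__main__":
    face = FaceMeasurements(left_corner=(40, 97), right_corner=(80, 92),
                            mouth_center=(60, 100), brow_gap=18, nose_wrinkle=0.1)
    print(classify(face))  # only one corner is clearly lifted -> "smirk"

In a real system the measurements would be extracted frame by frame from video, so a sequence of such labels, rather than a single snapshot, is what tracks a viewer's reaction over time.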
By digitally distinguishing between expressions, Affectiva’s customers — so far, mainly marketing firms — can analyze test subjects’ reactions as they watch ads. Understanding when these subjects lose interest, get confused, or are truly engaged helps companies make more effective advertisements and sell more products.