How a viewer reacts in a particular interaction can be determined by the mental states that the viewer exhibits: loud shouting and an angry face convey a very different message from a normal speaking voice and a smiling face. An individual's mental states can be inferred by capturing data from the individual, such as facial data and skin electrical activity. Because mental states change over time, data must be captured over time to track the current mental state. In a mobile environment, however, continuous data capture is not always possible; instead, mental states must be inferred from data that can be captured only sporadically. Here, mental state data from an individual is collected sporadically, while other data, such as skin conductivity, may be collected continuously. A web request is sent to a web service to analyze the viewer's mental state data. Mental state analysis is interpolated between analyses of the collected data, and additional mental state data is imputed. Other faces captured inadvertently along with the viewer's face are filtered out in determining whether the viewer is looking at a camera. Contextual data is determined based on motion data, and an output is rendered based on the analysis of the mental state data.
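The patent does not disclose a specific interpolation or imputation algorithm. As an illustrative sketch only, the idea of imputing mental state values between sporadic captures can be shown with simple linear interpolation; the function name, affect score, and timestamps below are hypothetical:

```python
import numpy as np

def impute_mental_state(sample_times, scores, query_times):
    """Impute affect scores at query_times by linearly interpolating
    between sporadically captured samples.

    sample_times: timestamps (seconds) at which mental state data was captured
    scores: affect scores (e.g., smile intensity in [0, 1]) at those timestamps
    query_times: timestamps at which imputed scores are wanted
    """
    return np.interp(query_times, sample_times, scores)

# Sporadic captures at t = 0, 10, and 30 seconds (hypothetical data)
t = [0.0, 10.0, 30.0]
smile = [0.2, 0.8, 0.4]

# Impute scores for the gaps between captures
imputed = impute_mental_state(t, smile, [5.0, 20.0])
print(imputed)  # [0.5 0.6]
```

A real system would likely combine such interpolation with the continuously collected signals (e.g., skin conductivity) to constrain the imputed values, but that fusion step is beyond this sketch.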

Read the full patent here.

Title: Sporadic Collection of Mobile Affect Data

Patent Number: US 9,204,836 B2

Application Number: 14/064,136

Filing Date: Oct. 26, 2013

Issue Date: Dec. 8, 2015

Inventors: Daniel Bender; Rana el Kaliouby; Evan Kodra; Oliver Ernst Nowak; and Richard Scott Sandowsky

Related Application Data: Continuation-in-part of application No. 13/153,745, filed on Jun. 6, 2011.
Provisional application No. 61/719,383, filed on Oct. 27, 2012, provisional application No. 61/747,651,