Artificial intelligence already pervades 21st-century life, from Siri’s directions to Netflix’s suggestions for what to watch next. But how much emotional intelligence is inside computers, cell phones, and video game consoles? In the past, the answer has been “none” — even the most complex deep learning machine is still a machine. That’s changing, though, thanks in part to Nevermind, a video game that can sense players’ emotions and adjust the experience to fit.
The psychological thriller, which debuted last year, isn’t your average first-person game. Instead of being handed a gun and told to kill enemies, players take on the persona of a Neuroprober, a physician who can enter the minds of trauma victims. As they explore these troubled psyches, Neuroprobers must solve logic puzzles and recover memory fragments to help their patients get better.
The world of the game becomes darker and more twisted as players exhibit more stress — and when they calm down, the game does, too. With the help of optional sensors like Garmin heart rate monitors, the Apple Watch, and the Tobii EyeX controller, Nevermind responds to biofeedback such as heart rate and eye movements. Now, the latest updates expand the game’s capabilities by incorporating the emotion recognition software Affdex, which uses data from players’ webcams to track their emotional responses and further alter what happens on screen.