How emotion-tracking A.I. will change computing as we know it

With the exception of the occasional “Are you happy to continue with installation?”-style pop-up, computers classically haven’t cared much about how we feel.

That’s all set to change with the arrival of affective computing: the development of systems and devices able to recognize, interpret, and appropriately respond to human emotions. With modern artificial intelligence breakthroughs having given us machines with significant IQ, a burgeoning group of researchers and well-funded startups now wants to match this with EQ, the term used to describe a person’s ability to recognize the emotions of those around them.

Read the full article, originally published in Digital Trends, in which Neurodata Lab’s founder and CEO George Pliev comments on the state of affective computing, shares his view of its future, and discusses the challenges the industry faces.

“From a technological point of view, the task of accurate emotion recognition was made possible by the active development of artificial neural networks,” George Pliev, founder and CEO of emotion-tracking company Neurodata Lab, told Digital Trends. “Thanks to them, machines learned to understand emotions on faces turned sideways and in low-light conditions. We got cheaper technology: even our smartphones have built-in neural network technologies [today]. This allows [us] to integrate emotion recognition into service robots, to detect emotions using a simple webcam, or to quickly process data in the cloud.”
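As a rough illustration of the webcam pipeline Pliev alludes to, here is a minimal Python sketch: it detects a face with OpenCV’s bundled Haar cascade and passes the crop to a pretrained expression classifier. The model file, its 48×48 grayscale input shape, and the seven-class label set are assumptions for illustration only, not Neurodata Lab’s actual system.

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Hypothetical pretrained CNN: 48x48 grayscale face in, seven emotion scores out.
# The file name and class list are placeholders, not a real released model.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]
model = load_model("emotion_cnn.h5")

# OpenCV ships this Haar cascade for frontal-face detection.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # a simple webcam, as Pliev notes, is enough
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        # Crop, resize, and normalize the face before classification.
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48)) / 255.0
        probs = model.predict(face.reshape(1, 48, 48, 1), verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("emotion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press "q" to quit
        break
cap.release()
cv2.destroyAllWindows()
```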

“[…] some emotions are still tough nuts for detection systems,” Neurodata Lab’s George Pliev said. “We have recently tested the most well-known emotion recognition algorithms and found that happiness, sadness, and surprise were the easiest emotional facial expressions to detect, while fear, disgust, anger, and a neutral state were the most difficult for A.I. Complex cognitive states, hidden, mixed and fake emotions, among other things, would require analysis and understanding of the context, and this hasn’t yet been achieved.”
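Pliev’s point about uneven per-emotion accuracy can be checked for any classifier by computing per-class metrics on a labeled test set. Below is a minimal scikit-learn sketch; the label arrays are illustrative placeholders standing in for real annotations and model predictions.

```python
from sklearn.metrics import classification_report, confusion_matrix

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

# Placeholder data: in practice, y_true comes from an annotated test set
# and y_pred from the emotion classifier under evaluation.
y_true = ["happiness", "fear", "sadness", "neutral", "surprise", "anger", "disgust"]
y_pred = ["happiness", "surprise", "sadness", "anger", "surprise", "neutral", "fear"]

# Per-class precision and recall make the easy/hard split visible at a glance;
# the confusion matrix shows which emotions get mistaken for which.
print(classification_report(y_true, y_pred, labels=EMOTIONS, zero_division=0))
print(confusion_matrix(y_true, y_pred, labels=EMOTIONS))
```

In tests like the one Pliev describes, the recall rows for happiness, sadness, and surprise would sit well above those for fear, disgust, anger, and the neutral state.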