This is how computers analyze emotions

Originally published in Tages Anzeiger (in German) on December 5, 2018. For videos and diagrams, please visit the original source.

It is hard to overestimate the importance of emotions. We humans perceive even the most delicate movements in the facial expressions and gestures of the person we are talking to, and we hear many nuances in their voice. Computers cannot do this as well as sensitive people can. But they can handle huge amounts of data. «This opens up completely new opportunities for emotion analysis,» says Olga Perepelkina, head of research at the young company Neurodata Lab.

The company, based in Moscow with a research department in Florida, recently opened a branch in Root in the canton of Lucerne, Switzerland. For baz.ch/Newsnet, Neurodata Lab reviewed 805 speeches that federal councilors delivered to the National Council and the Council of States during the current year – between 72 and 274 appearances per person.

Example image, main emotion: disgust. Image: PD

Almost in real time, the software analyzes emotions with self-learning algorithms and neural networks. «We rely on a multichannel approach,» says Perepelkina: the voice, facial expressions, gestures and body movements of the people on film are all analyzed. «The results are much more accurate than with the simple single-channel analyses that are common today.» The system is also designed to learn quickly: «The more training data we have, the better the results.» With a new algorithm, Neurodata Lab also analyzes the brightness of individual pixels to measure a person's pulse without any additional wearable devices.
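Neurodata Lab has not published the details of either technique, but the two ideas can be illustrated. As a minimal sketch in Python, the first example shows one common way to combine channels, a weighted «late fusion» of per-channel emotion scores; the channel names, weights and probabilities here are invented for the illustration and are not the company's actual model.

```python
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise", "neutral"]

def fuse_channels(channel_probs: dict[str, np.ndarray],
                  weights: dict[str, float]) -> str:
    """Weighted late fusion: average the per-channel emotion probability
    vectors and return the most likely emotion overall."""
    combined = sum(weights[name] * probs for name, probs in channel_probs.items())
    combined /= sum(weights.values())
    return EMOTIONS[int(np.argmax(combined))]

# Hypothetical per-channel classifier outputs (probabilities over EMOTIONS).
probs = {
    "face":  np.array([0.05, 0.60, 0.05, 0.05, 0.15, 0.05, 0.05]),
    "voice": np.array([0.10, 0.40, 0.10, 0.05, 0.20, 0.05, 0.10]),
    "body":  np.array([0.10, 0.30, 0.10, 0.10, 0.20, 0.10, 0.10]),
}
print(fuse_channels(probs, weights={"face": 0.5, "voice": 0.3, "body": 0.2}))
# -> "disgust"
```

The camera-based pulse measurement rests on a principle known in the research literature as remote photoplethysmography: the brightness of skin pixels fluctuates minutely with each heartbeat, so the dominant frequency of that fluctuation is the pulse. The sketch below assumes a pre-extracted brightness series, one value per video frame, and again stands in for an undisclosed algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_pulse(brightness: np.ndarray, fps: float) -> float:
    """Estimate heart rate (beats per minute) from a 1-D series of mean
    skin-pixel brightness values, one sample per video frame."""
    # Remove the constant lighting level so only the pulse ripple remains.
    signal = brightness - np.mean(brightness)
    # Band-pass to the plausible heart-rate range, 0.7-4 Hz (42-240 BPM).
    low, high = 0.7 / (fps / 2), 4.0 / (fps / 2)
    b, a = butter(3, [low, high], btype="band")
    filtered = filtfilt(b, a, signal)
    # The dominant frequency of the filtered signal is the pulse.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    return freqs[np.argmax(spectrum)] * 60.0  # Hz -> beats per minute

# Synthetic test: 10 s of 30 fps video with a faint 1.2 Hz (72 BPM) ripple.
t = np.arange(300) / 30.0
brightness = 120 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(300)
print(round(estimate_pulse(brightness, fps=30.0)))  # ~72
```

As the article notes below, such an approach only works if the video is good enough for the tiny brightness ripple to survive compression and noise.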

The system speaks only English

Perepelkina does not hide the fact that such analyses can have many sources of error. For pulse analysis, for example, the video material must be of very good quality. Emotion analysis faces sociocultural pitfalls: so far, the system has been trained only on English. Subtle, culturally shaped differences in communication and in individual behavior can also lead to inaccuracies. In addition, facial expressions and gestures are not interpreted in the same way by all people.

Example image, main emotion: sadness. Image: PD

Computer-aided emotion analysis is much more than an app, Olga Perepelkina points out. «There are already many possible applications today, and there will be many more in the future.» Robots, for example, could learn to better understand the meaning of what their counterparts communicate and to respond more naturally and «humanly». Emotional «translators» for people could emerge in this area. The car of the future could analyze whether the driver is fine – and, if necessary, apply the emergency brake. It is conceivable that employers could use the technology in job interviews. The advertising industry is already experimenting with it: the new possibilities make it easier to document people's reactions to a product.

Example image, main emotion: joy. Image: PD