Originally published on Ponedelnikmag on January 16, 2019.
Imagine a robot that can get your ironic joke and smile back — this is the future that the developers of AI-based emotion recognition technology are bringing closer every day. Among them is Neurodata Lab, which recently tasked a machine with deciphering the mysterious smile of the Mona Lisa. How did the algorithm perform, and have machines learned to distinguish fake emotions? Read our interview with Neurodata Lab’s Chief Research Officer Olga Perepelkina.
Olga Perepelkina graduated from the Clinical Psychology Department at Moscow State University and then worked in the laboratory of neurocomputer interfaces at the university’s Faculty of Biology. She has worked at Neurodata Lab since the company was founded, where, together with other researchers and developers, she has created algorithms for the automatic recognition of emotions and social behavior.
— Olga, what is Neurodata Lab working on now?
— We are engaged in the research and creation of Emotion AI-based solutions, so we have two main directions of work: R&D and business.
We’re currently working on several projects for banking, robotics, retail, and digital HR. All of them are directly linked to automatic emotion recognition, applied to the specifics of each field. For banking, it is customer satisfaction assessment during client-employee interactions, both in person and over the phone. For robotics, it means building emotion recognition software into a robot, along with the ability to change its communication strategy depending on the emotional state of the person it is talking to. For retail, it is about measuring product satisfaction, while for HR it is a detailed emotional and behavioral analysis of candidates who record video interviews.
By analyzing the face, voice, body movements, and physiological signals, we can learn how people assess the quality of service, whether they are interacting with a person or a robot. This opens the door to introducing the technology in a variety of industries.
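The idea of combining several signal channels can be sketched as a simple late-fusion step. Everything below is illustrative: the modality names, scores, and weights are hypothetical, not Neurodata Lab’s actual pipeline.

```python
# Hypothetical sketch of late fusion across modalities: each channel
# (face, voice, body) yields its own emotion probabilities, which are
# combined by a weighted average. Numbers and weights are made up.

def fuse_modalities(scores, weights):
    """Weighted average of per-modality emotion probability vectors."""
    emotions = next(iter(scores.values())).keys()
    total = sum(weights[m] for m in scores)
    return {
        e: sum(weights[m] * scores[m][e] for m in scores) / total
        for e in emotions
    }

modality_scores = {
    "face":  {"happy": 0.70, "neutral": 0.20, "sad": 0.10},
    "voice": {"happy": 0.50, "neutral": 0.40, "sad": 0.10},
    "body":  {"happy": 0.30, "neutral": 0.60, "sad": 0.10},
}
weights = {"face": 0.5, "voice": 0.3, "body": 0.2}

print(fuse_modalities(modality_scores, weights))
# → {'happy': 0.56, 'neutral': 0.34, 'sad': 0.1}
```

Weighting lets a stronger channel (here, the face) dominate while weaker channels still nudge the estimate, which is one common way multimodal systems reconcile conflicting cues.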
At the same time, our main goal is to learn how to work with complex emotions: to understand how they manifest themselves and how to track them, taking context into account and analyzing both internal and external cues.
— Are all emotions “clear” to the machines?
— Most systems recognize a so-called “basic” set of emotions: happiness, sadness, anger, disgust, fear, surprise, and a neutral state. These emotions are usually intensely manifested, through a happy smile or an enraged voice, which makes them easier to recognize automatically.
Complex emotions aren’t that easy. The simplest way to teach a machine to recognize them is to represent a complex emotion as a combination of simpler ones. A vivid example of a complex emotion is the mysterious smile of the Mona Lisa. We ran the painting through our algorithm and got interesting results: Gioconda’s face was 36.6% happy and 4.4% sad, but mostly neutral.
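The decomposition approach can be sketched as reading a mixture off a basic-emotion probability vector. The happiness and sadness numbers mirror the Mona Lisa example above; the neutral share is assumed as the remainder, and the classifier that produces the vector is not shown.

```python
# Illustrative sketch: a "complex" emotion expressed as a mixture of
# basic-emotion probabilities. The classifier is assumed; the neutral
# value is a hypothetical remainder, not a published result.

basic_probs = {
    "neutral": 0.59,      # assumed remainder
    "happiness": 0.366,   # from the Mona Lisa example
    "sadness": 0.044,     # from the Mona Lisa example
}

def describe(probs):
    """Render a probability vector as a human-readable mixture."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    return ", ".join(f"{name}: {p:.1%}" for name, p in ranked)

print(describe(basic_probs))
# → neutral: 59.0%, happiness: 36.6%, sadness: 4.4%
```

A readout like this is why the algorithm’s answer is a blend (“mostly neutral, somewhat happy, slightly sad”) rather than a single label.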
There are also hidden emotions that people experience but try not to show. A person can learn to control their facial expressions and voice to some extent, but other non-verbal manifestations are harder to suppress, not to mention physiological signals such as the pulse or rapid breathing. It is difficult to completely hide your feelings, and the systems are learning to recognize the subtlest manifestations.
Finally, there are so-called fake emotions, when one expression is substituted for another: a fake surprise, for example. In these situations, people may give themselves away by an atypical expression of emotion, but at the current stage of technological development, this is hard to detect.
— This work is being done not just in Russia but in other countries as well. Is there anyone interesting to keep in mind?
— These technologies have existed for about a decade in the U.S. If we talk about something unusual, there is a very interesting case in the gaming industry: emotion recognition was incorporated into the horror game Nevermind, where the plot adjusts to the player’s state of mind. The player’s levels of stress and fear are automatically detected from video and a heart rate monitor. Another game, Bring to Light, is based on similar technologies: the intensity of events changes for each user.
It is no secret that posts on social networks can also be analyzed. In October 2018, U.S. researchers analyzed the posts of 700 Facebook users and found that an algorithm could accurately predict depression three months before the official diagnosis.
On the other hand, there is the imitation of emotions. A company called Soul Machines creates avatars that can express a wide range of emotional states in a very naturalistic way, including in response to the emotions of the interlocutor.
— Let’s talk about how your tech can be applied in everyday life.
— It has great potential for application in business, medicine, and everyday life. Such applications help detect stress, warn about excessive nervous tension at work, and tell you when you need to rest. There are already bracelets that alert caregivers and family members if the wearer has an epileptic seizure, as well as apps that help people with autism spectrum disorders better understand their interlocutor’s emotions.
In summary, we can say that emotional technologies will be in demand both in entertainment, for example in the gaming and AR/VR industries, and in more serious areas, such as security systems and digital medicine.
If you look at the modern world of gadgets, you will see a surge in the popularity of devices that collect information about human activity. The most obvious trend is making human-computer interaction more human-like. Siri or Alisa are distant prototypes of how people of the future will talk to devices. Smart assistants will understand their users better, and users in turn will be able to solve a larger set of problems with their help. There will be robots able to take care of elderly people and children. Some business processes will become automated, including those that require advanced communication skills: customer service centers, for example.
— In which industries is the technology most in demand?
— As we’ve already mentioned, the main customers of Neurodata Lab’s solutions are banks, robotics companies, and platforms for video interviews. Our technologies are indispensable in call centers: they help gauge how satisfied or dissatisfied a person is with the service and, if necessary, can switch them to a live employee. In stores, banks, and insurance companies, where high quality of service is also important, emotional analytics will allow managers to collect depersonalized customer statistics in real time and then train employees on that basis.
The demand comes primarily from businesses that are actively introducing service robotics. It is important for companies to understand how the robot should perform, what its functions and role should be, how they correlate with the needs of customers, and how it can be improved. A robot consultant in a bank will most often be asked about particular services, and tracking the reaction to the information it provides will help determine whether that information was useful. Another important area for modern robotics is personal robots and robot assistants, which must have a high level of emotional intelligence. Robots serving people with disabilities and the elderly can become not only mechanical assistants but also social workers. And, of course, emotion recognition and expression in such robots should be at a very high level.
— What difficulties do you face in your work? For example, with human resources, or with material and technical resources?
— We need a strong technical team, and we are talking not just about coders but about people who have the knowledge and skills to create high-precision automated systems that have no analogs in open access. Finding specialists of this caliber is a difficult task familiar to every HR specialist. The market is saturated with offers, and there are few specialists in this field… which makes it a good opportunity for those who are only now choosing a profession or a niche for development!