AT CES, A.I. ROBOTS CAN READ YOUR EMOTIONS

Originally published in GritDaily on January 12, 2019

Technology to help predict human emotions has been evolving in recent years, and it only continues to grow more accurate. During my time at CES 2019 in Las Vegas, I was able to get up close and personal with one of these bots. Specifically, one that interacted with me, danced, and held a somewhat engaging conversation.

MEET THE EMOTIBOT

Born from a collaboration between Neurodata Lab, an emotion-A.I. software developer, and Promobot, a commercial robotics manufacturer, the 'emotibot' was unveiled for the first time at CES 2019, exploring the concept of multi-modal emotion detection. Specifically, the companies unveiled two Emotion A.I.-based solutions: one focused on customer experience and tracking (Emotion AI for CX Management), and another for connected devices (Emotion AI for Robotics and IoT).

'Multi-modal emotion detection' means that the system analyzes not only facial expressions, but also body gestures, voice, eye movement and even heart rate to determine someone's emotional state. The A.I. recognizes more than twenty emotional states and behavioral patterns, such as happiness, sadness, disgust, surprise, disengagement, shame and more.

The robot can also simultaneously recognize the emotions of several people it communicates with and react accordingly, measuring how satisfied those individuals are with the interaction, all in real time.
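
To make the idea of multi-modal detection concrete, here is a minimal, purely illustrative sketch (not Neurodata Lab's actual code): each channel, such as face, voice, or gesture, produces its own scores over emotion labels, and the system fuses them into a single estimate. The channel names, weights, and scores below are hypothetical.

```python
# Illustrative sketch only -- not Neurodata Lab's implementation.
# Each channel scores the same set of emotion labels; a weighted average
# fuses them into one overall emotional-state estimate.

EMOTIONS = ["happiness", "sadness", "disgust", "surprise", "disengagement", "shame"]

# Hypothetical per-channel scores for one person at one moment in time.
channel_scores = {
    "face":    {"happiness": 0.7, "surprise": 0.2, "disengagement": 0.1},
    "voice":   {"happiness": 0.5, "sadness": 0.3, "disengagement": 0.2},
    "gesture": {"happiness": 0.6, "surprise": 0.3, "shame": 0.1},
}

# Hypothetical channel weights; a real system would learn these from data.
weights = {"face": 0.5, "voice": 0.3, "gesture": 0.2}

def fuse(channel_scores, weights):
    """Weighted average of per-channel scores into one fused estimate."""
    fused = {emotion: 0.0 for emotion in EMOTIONS}
    for channel, scores in channel_scores.items():
        for emotion, score in scores.items():
            fused[emotion] += weights[channel] * score
    top = max(fused, key=fused.get)
    return top, fused

top_emotion, all_scores = fuse(channel_scores, weights)
print(top_emotion)  # -> "happiness"
```

In practice, fusing several weak signals this way is what lets a system stay accurate when any single channel (say, a partially occluded face) is unreliable.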

BUT, WHY DO WE WANT ROBOTS READING OUR EMOTIONS? 

As robots become more prevalent in a variety of customer support settings, it is ever more important that they understand the nuances of human communication and response. We will soon see robots performing a variety of roles: information guides in malls and airports, concierges in smart environments, help desk consultants, restaurant servers, and even adding some brains to the retail and banking sectors.

These robots will need to understand customers in order to respond in ways that don't negatively impact the customer experience. A multi-modal approach that reads multiple channels of expression will lead to better human-robot interactions and even create new opportunities for customer experience, enhancing overall brand affinity. Imagine interacting with a fun mascot robot at your local stadium or a concert.

EMOTIONAL A.I. GOING BEYOND ROBOTICS

Neurodata Lab supplies software that can go into many different types of products beyond its robots. Its EaaS (Emotion as a Service) can be leveraged for multiple use cases, including:

  1. Automotive – The cars of the future will be able to detect whether the driver is in an agitated emotional state that could require precautionary measures, such as engaging an emergency brake.
  2. Advertising – With product purchases so closely tied to emotion, the advertising industry has been using emotion AI to gauge audience reactions to commercials and ads, and in focus groups to understand the true response to products.
  3. Human Resources – As the workforce moves to a more decentralized model, with more people working from home and interviews happening virtually, emotion AI will help HR professionals gain valuable insight into candidate personalities beyond verbal responses.

For more information about these bots, watch Neurodata Lab advisor Steve Ardire and CMO Olga Serdiukova give their CES booth presentation.