Emotions are natural and essential for humans. Emotional expression is largely non-verbal: some manifestations are explicit and can be detected through facial expressions, body language, or changes in tone of voice; others are implicit, such as psychophysiological manifestations, and require subtler recognition methods.
Social Signal Processing is the next step in understanding how humans communicate emotions. In human-to-human interaction, people use a broad range of non-verbal signals, including posture, gestures, and interpersonal distance, which are interpreted in light of both psychological and social phenomena.
How does Emotion Recognition work?
Our emotion recognition technology uses computer vision and machine learning techniques to detect more than 20 scales of affective and cognitive states, as well as social behavior patterns, through a highly accurate multimodal approach with tracking.
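As a rough illustration of what a multimodal approach can mean in practice, the sketch below combines per-modality emotion estimates by weighted late fusion. All names, weights, and probability values here are hypothetical and purely illustrative; in a real system each modality's distribution would come from a separate face, voice, or posture model.

```python
import numpy as np

# Illustrative emotion classes; real systems may use 20+ scales.
EMOTIONS = ["angry", "happy", "neutral", "sad"]

def late_fusion(modality_probs, weights):
    """Combine per-modality class probabilities by weighted averaging."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                      # normalise the modality weights
    probs = np.asarray(modality_probs)   # shape: (n_modalities, n_classes)
    fused = w @ probs                    # weighted average per class
    return fused / fused.sum()           # renormalise to a distribution

# Hypothetical per-modality probability estimates for one moment in time.
face  = [0.10, 0.70, 0.15, 0.05]
voice = [0.05, 0.55, 0.30, 0.10]
pose  = [0.20, 0.40, 0.30, 0.10]

fused = late_fusion([face, voice, pose], weights=[0.5, 0.3, 0.2])
print(EMOTIONS[int(np.argmax(fused))])  # "happy" for these example inputs
```

Late fusion is only one of several fusion strategies; early fusion (concatenating features before classification) or learned attention over modalities are common alternatives.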
Multimodality means using data from several different channels to analyse emotions in natural data. The multimodal cues we use to analyse emotional states and social signals: