Multimodal Emotion Classification Unit
  • Affective & Cognitive States
    Emotions are natural and essential for humans, and their expression is largely non-verbal. Some manifestations are explicit and can be detected in facial expressions, body language, or changes in the tone of voice; others are implicit, such as psychophysiological responses, and require subtler recognition methods.
    • Emotions

    • Mental effort

    • Engagement

    • Self-confidence

  • Social Signals
    Social Signal Processing is the next step in understanding how humans communicate emotions. In human-to-human interaction, people use a broad range of non-verbal signals, such as posture, gestures, and interpersonal distance, which are interpreted in light of both psychological and social phenomena.
    • Social interactions

    • Interpersonal attitudes

    • Communication preferences

  • How Emotion Recognition Works
    Our emotion recognition technology uses computer vision and machine learning to detect more than 20 scales of affective and cognitive states, as well as social behaviour patterns, through a highly accurate multimodal tracking approach.
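    One common way such a multimodal system combines its per-channel predictions is late fusion: each modality's classifier outputs probabilities over the same set of emotion classes, and a weighted average produces the final estimate. The sketch below illustrates that idea only; the class labels, modality names, and weights are illustrative assumptions, not the actual implementation.

    ```python
    # Late-fusion sketch: weighted average of per-modality probability
    # vectors over a shared set of emotion classes. All names and
    # weights here are hypothetical.

    EMOTIONS = ["happy", "sad", "angry", "neutral"]

    def fuse(modality_probs, weights):
        """Combine per-modality probability vectors into one,
        weighting each modality and renormalising to sum to 1."""
        fused = [0.0] * len(EMOTIONS)
        total_weight = sum(weights[m] for m in modality_probs)
        for modality, probs in modality_probs.items():
            w = weights[modality] / total_weight
            for i, p in enumerate(probs):
                fused[i] += w * p
        return fused

    # Example: face and voice classifiers agree on the top class
    # but with different confidence.
    probs = {
        "face":  [0.70, 0.10, 0.10, 0.10],
        "voice": [0.40, 0.30, 0.10, 0.20],
    }
    weights = {"face": 0.6, "voice": 0.4}
    fused = fuse(probs, weights)
    top_emotion = EMOTIONS[fused.index(max(fused))]
    ```

    Weighting the face channel more heavily reflects a typical design choice when one modality is known to be more reliable; in practice such weights would be learned from validation data.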
  • Multimodal Tracking
    Multimodality means using data from several different channels for emotion analysis in Natural Data Processing. The multimodal cues we use to analyse emotional states and social signals:
    • Facial expressions

    • Vocal specifics

    • Gestures

    • Body posture

    • Interpersonal distance

    • Eye movement

    • Heart rate

    • Respiration rate
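    The cues above can be thought of as channels that are measured per frame and bundled into one feature record before classification. The sketch below shows one plausible shape for such a record; the field names, units, and values are illustrative assumptions, not the product's actual data model.

    ```python
    # Sketch: bundling the tracked multimodal cues into a single
    # per-frame record, then flattening it into a feature vector for
    # a downstream classifier. Field names and units are hypothetical.
    from dataclasses import dataclass, astuple

    @dataclass
    class MultimodalFrame:
        face_valence: float          # facial-expression valence, [-1, 1]
        voice_pitch_hz: float        # fundamental frequency of the voice
        gesture_energy: float        # amount of hand/arm motion
        posture_openness: float      # open vs. closed body posture, [0, 1]
        interpersonal_dist_m: float  # distance to the interlocutor
        gaze_fixation_s: float       # current eye-fixation duration
        heart_rate_bpm: float        # beats per minute
        respiration_rate_bpm: float  # breaths per minute

    def to_feature_vector(frame: MultimodalFrame) -> list:
        """Flatten a frame record into the ordered feature vector
        a classifier would consume."""
        return list(astuple(frame))

    frame = MultimodalFrame(0.4, 210.0, 0.2, 0.7, 1.2, 0.3, 78.0, 16.0)
    vec = to_feature_vector(frame)  # 8 features, one per tracked cue
    ```

    Keeping all channels in one record makes it easy to handle missing modalities (e.g. heart rate unavailable) explicitly rather than silently dropping features.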
