Publications
  • 03/19/2018
    Classification of affective and social behaviors in public interaction for affective computing and social signal processing

    There are numerous models for affective state classification and social behavior description. Despite proving their reliability, some of these classifications turn out to be redundant, while others prove insufficient for…

  • 10/16/2018
    Multimodal Approach to Engagement and Disengagement Detection with Highly Imbalanced In-the-Wild Data (for ICMI 2018)

    In this paper, we describe different approaches to building engagement/disengagement models that work with highly imbalanced multimodal data from natural conversations.

  • 07/19/2018
    Recognition of mixed facial emotion has correlates in eye movement parameters (for ESCAN 2018)

    The aim of this study was to investigate the specificity of eye movement parameters during a mixed facial emotion recognition task.

  • 09/02/2018
    Automatic detection of multi-speaker fragments with high time resolution (for Interspeech 2018)

    The proposed method demonstrates highly accurate results and may be used for speech segmentation, speaker tracking, content analysis such as conflict detection, and other practical purposes.

  • 08/25/2018
    RAMAS: Russian Multimodal Corpus of Dyadic Interaction for Affective Computing (for SPECOM 2018)

    RAMAS is an open database that provides the research community with multimodal data on the interrelation of faces, speech, gestures, and physiology. Such material is useful for a variety of investigations and automatic affective systems…