Emotionally charged data to train
and test algorithms
A global online video-annotation platform for multimodal emotion and behavior data collection and analysis.
  • What is it?
    • A set of tools for annotating large numbers of English-language video fragments extracted from publicly available content
    • 30,000 registered users from 35+ countries, and growing
    • Strict annotator selection: an organized, motivated, and scalable pool of versatile, multi-skilled experts
  • How does it work?
    • A detailed system of emotion scales covering more than 20 affective states and social signals for annotating emotionally charged content
    • A flexible learning model with tutorials to guide annotators
    • A transparent payment system tied to actual results
  • What do you get?
    • Ready-to-use, high-quality data for emotion recognition systems
The annotation process
  • 01
    You upload the audiovisual fragments you need annotated and indicate the emotion scales relevant to your goal.
  • 02
    The fragments are then assigned to annotators as tasks.
  • 03
    For each fragment, annotators select the emotions, social behavior patterns, and mental states of the person shown.
  • 04
    Once the markup for all fragments of all videos in a task is complete, a quality check is performed.
  • 05
    If the markup is approved, the annotators are paid.
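
The five-step process above can be sketched in code. This is a minimal illustrative model only; all names (Fragment, assign_tasks, quality_check, and so on) are hypothetical and do not reflect the platform's actual API or payment terms.

```python
from dataclasses import dataclass, field

@dataclass
class Fragment:
    """One audiovisual fragment with the scales the customer chose (step 01)."""
    fragment_id: str
    scales: list            # emotion scales to annotate, e.g. ["emotion", "social_signal"]
    labels: dict = field(default_factory=dict)

def assign_tasks(fragments, annotators):
    """Step 02: distribute fragments to annotators (round-robin here)."""
    tasks = {a: [] for a in annotators}
    for i, frag in enumerate(fragments):
        tasks[annotators[i % len(annotators)]].append(frag)
    return tasks

def annotate(frag, choices):
    """Step 03: the annotator records one label for each requested scale."""
    for scale in frag.scales:
        if scale in choices:
            frag.labels[scale] = choices[scale]

def quality_check(fragments):
    """Step 04: approve only if every fragment is fully labeled."""
    return all(set(f.labels) == set(f.scales) for f in fragments)

def payout(approved, rate_per_fragment, n_fragments):
    """Step 05: annotators are paid only for approved work."""
    return rate_per_fragment * n_fragments if approved else 0.0
```

For example, two fragments annotated on a single "emotion" scale pass the quality check and trigger payment; leaving a scale unlabeled fails the check and pays nothing.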