OMG-EMPATHY PREDICTION CHALLENGE

To create a general affective model that can modulate the learning of different cognitive tasks, such as intrinsic motivation, creativity, dialog processing, grounded learning, and human-level communication, emotion perception alone cannot be the pivotal focus. Perception must be integrated with intrinsic concepts of emotional understanding, such as a dynamic, evolving mood and an affective memory, in order to model the necessary complexity of an interaction and to make an agent's social behavior adaptable.

For this challenge, we designed and recorded the One-Minute Gradual Empathy dataset (OMG-Empathy), which contains multi-modal recordings of pairs of individuals discussing predefined topics. In each interaction, one participant, the actor, shares a personal story while the other, the listener, reacts to it emotionally. We annotated each interaction based on the listener's own assessment of how they felt while the interaction was taking place. We also collected annotations from a third individual, an independent observer, who was not involved in the interaction but watched the video recording and continuously reported how they felt on an arousal/valence scale. The challenge comprises two tracks: one to predict the listener's empathy and one to predict the observer's empathy.
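As an illustration only, the sketch below shows how a predicted continuous valence trace could be compared against an annotated one using the Concordance Correlation Coefficient (CCC), a common agreement measure for continuous affect prediction. The official evaluation protocol, metric, and annotation file format are defined on the challenge webpage; the per-frame arrays and their names here are hypothetical.

import numpy as np

def ccc(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Concordance Correlation Coefficient between two continuous traces."""
    mean_true, mean_pred = y_true.mean(), y_pred.mean()
    var_true, var_pred = y_true.var(), y_pred.var()
    covariance = np.mean((y_true - mean_true) * (y_pred - mean_pred))
    return 2 * covariance / (var_true + var_pred + (mean_true - mean_pred) ** 2)

if __name__ == "__main__":
    # Toy example: compare a hypothetical per-frame valence annotation with a
    # noisy, attenuated prediction of it.
    rng = np.random.default_rng(0)
    valence_true = np.sin(np.linspace(0, 6 * np.pi, 600))
    valence_pred = 0.8 * valence_true + 0.1 * rng.normal(size=600)
    print(f"CCC: {ccc(valence_true, valence_pred):.3f}")

A CCC of 1 indicates perfect agreement in both correlation and scale, while values near 0 indicate little agreement; unlike Pearson correlation, it also penalizes systematic offsets between the two traces.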

Organizers and contact information

Dr. Pablo Barros – University of Hamburg, barros@informatik.uni-hamburg.de

Nikhil Churamani – University of Hamburg, 5churama@informatik.uni-hamburg.de

Prof. Angelica Lim – Simon Fraser University, angelica@sfu.ca

Prof. Stefan Wermter – University of Hamburg, wermter@informatik.uni-hamburg.de

WEBPAGE: https://www2.informatik.uni-hamburg.de/wtm/omgchallenges/omg_empathy.html