A Multimodal Database for Affect Recognition and Implicit Tagging

Mohammad Soleymani, Jeroen Lichtenauer, Thierry Pun, Maja Pantic

    Research output: Contribution to journal › Article › Academic › peer-review

    582 Citations (Scopus)

    Abstract

    MAHNOB-HCI is a multimodal database recorded in response to affective stimuli, created for emotion recognition and implicit tagging research. A multimodal setup was arranged for the synchronized recording of face videos, audio signals, eye gaze data, and peripheral/central nervous system physiological signals. Twenty-seven participants of both genders and from different cultural backgrounds took part in two experiments. In the first experiment, they watched 20 emotional videos and self-reported their felt emotions using arousal, valence, dominance, and predictability scales as well as emotional keywords. In the second experiment, short videos and images were shown once without any tag and then with a correct or incorrect tag; participants reported their agreement or disagreement with the displayed tags. The recorded videos and bodily responses were segmented and stored in a database, which is made available to the academic community via a web-based system. The collected data were analyzed, and single-modality and modality-fusion results are reported for both the emotion recognition and implicit tagging experiments. These results demonstrate the potential uses of the recorded modalities and the effectiveness of the emotion elicitation protocol.
    Original language: Undefined
    Pages (from-to): 42-55
    Number of pages: 14
    Journal: IEEE Transactions on Affective Computing
    Volume: 3
    Issue number: 1
    DOIs
    Publication status: Published - Jan 2012

    Keywords

    • HMI-MI: MULTIMODAL INTERACTIONS
    • Physiological Signals
    • Affective Computing
    • Implicit Tagging
    • Emotion Recognition
    • EEG
    • Pattern Classification
    • Eye Gaze
    • Facial expressions
