Speech-based recognition of self-reported and observed emotion in a dimensional space

    Research output: Contribution to journal › Article › Academic › peer-review

    31 Citations (Scopus)
    46 Downloads (Pure)

    Abstract

    The differences between self-reported and observed emotion have only marginally been investigated in the context of speech-based automatic emotion recognition. We address this issue by comparing self-reported emotion ratings to ratings from outside observers, and by examining how the differences between these two types of ratings affect the development and performance of automatic emotion recognizers trained on them. A dimensional approach to emotion modeling is adopted: the ratings are based on continuous arousal and valence scales. We describe the TNO-Gaming Corpus, which contains spontaneous vocal and facial expressions elicited via a multiplayer videogame and includes emotion annotations obtained both via self-report and via outside observers. Comparisons show discrepancies between self-reported and observed emotion ratings, and these discrepancies are reflected in the performance of the resulting emotion recognizers. Using Support Vector Regression in combination with acoustic and textual features, we develop recognizers of arousal and valence that predict points in a 2-dimensional arousal-valence space. The results show that self-reported emotion is much harder to recognize than observed emotion, and that averaging ratings from multiple observers improves performance.
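    The pipeline described in the abstract can be sketched as follows. This is a minimal illustration with synthetic data, not the paper's actual corpus or feature set: scikit-learn's `SVR` stands in for the Support Vector Regression used, the feature matrix is random noise standing in for acoustic/textual features, and the observer ratings are simulated. It shows the two ingredients named above: averaging ratings from multiple observers to form the regression target, and fitting an SVR per emotion dimension (only arousal is shown; valence is analogous).

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)

    # Synthetic stand-in for per-utterance features (e.g. pitch/energy statistics).
    n_samples, n_features, n_observers = 200, 12, 3
    X = rng.normal(size=(n_samples, n_features))

    # Hypothetical "true" arousal driven by the first two features; each observer's
    # rating is this signal plus individual annotation noise.
    true_arousal = X[:, 0] + 0.5 * X[:, 1]
    observer_ratings = np.stack(
        [true_arousal + rng.normal(scale=0.3, size=n_samples) for _ in range(n_observers)]
    )

    # Averaging across observers reduces annotation noise in the target.
    y_arousal = observer_ratings.mean(axis=0)

    X_train, X_test, y_train, y_test = train_test_split(X, y_arousal, random_state=0)
    scaler = StandardScaler().fit(X_train)

    # One SVR per dimension of the arousal-valence space.
    model = SVR(kernel="rbf", C=1.0, epsilon=0.1)
    model.fit(scaler.transform(X_train), y_train)

    score = model.score(scaler.transform(X_test), y_test)  # R^2 on held-out data
    print(f"R^2 on held-out synthetic data: {score:.2f}")
    ```

    In this toy setup the averaged target is less noisy than any single observer's ratings, which mirrors the abstract's finding that averaging multiple observers' ratings improves recognizer performance.
    
    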
    Original language: Undefined
    Pages (from-to): 1049-1063
    Number of pages: 15
    Journal: Speech Communication
    Volume: 54
    Issue number: 9
    DOIs
    Publication status: Published - Nov 2012

    Keywords

    • EC Grant Agreement nr.: FP7/231287
    • Emotion perception
    • Emotional speech
    • Emotion annotation
    • Emotion database
    • Automatic emotion recognition
    • Video games
    • Support Vector Regression
    • Affective Computing
    • Audiovisual database
    • Emotion elicitation
