Machines outperform lay persons in recognizing emotions elicited by autobiographical recollection

J.H. Janssen, P. Tacken, G.J. de Vries, Egon van den Broek, J.H.D.M. Westerink, P. Haselager, W.A. IJsselsteijn

    Research output: Contribution to journal › Article › Academic › peer-review

    27 Citations (Scopus)

    Abstract

    Over the last decade, an increasing number of studies have focused on automated recognition of human emotions by machines. However, the performance reported in machine emotion recognition studies is difficult to interpret because benchmarks have not been established. In order to provide such a benchmark, we compared machine with human emotion recognition. We gathered facial expressions, speech, and physiological signals from 17 individuals expressing 5 different emotional states. Support vector machines achieved an 82% recognition accuracy based on physiological and facial features. In experiments with 75 humans on the same data, a maximum recognition accuracy of 62.8% was obtained. As machines outperformed humans, automated emotion recognition might be ready to be tested in more practical applications.
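    The classification setup described in the abstract (a support vector machine trained on multimodal feature vectors, evaluated by recognition accuracy over 5 emotion classes) can be illustrated with a minimal sketch. This is not the authors' code or data: the feature dimensionality, class means, and scikit-learn pipeline below are illustrative assumptions.

    ```python
    # Illustrative sketch only: an SVM classifying synthetic "physiological +
    # facial" feature vectors into 5 emotion classes, scored by cross-validated
    # recognition accuracy (the metric reported in the article).
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_per_class, n_features, n_classes = 40, 12, 5  # assumed sizes, not the study's

    # Synthetic data: each emotion class is drawn around its own mean vector.
    X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
                   for c in range(n_classes)])
    y = np.repeat(np.arange(n_classes), n_per_class)

    # RBF-kernel SVM, 5-fold cross-validated accuracy.
    clf = SVC(kernel="rbf", C=1.0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"mean CV recognition accuracy: {scores.mean():.2f}")
    ```

    On real data, the features would be extracted from facial video, speech, and physiological signals rather than sampled from Gaussians, and accuracy would be compared against the human baseline as in the study.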
    Original language: Undefined
    Pages (from-to): 479-517
    Number of pages: 39
    Journal: Human-computer interaction
    Volume: 28
    Issue number: 6
    DOIs: https://doi.org/10.1080/07370024.2012.755421
    Publication status: Published - 29 Jul 2013

    Keywords

    • EWI-21692
    • IR-87031
    • HMI-CI: Computational Intelligence
    • HMI-SLT: Speech and Language Technology
    • HMI-MI: MULTIMODAL INTERACTIONS
    • Vision
    • Speech
    • Physiology
    • machine
    • autobiographical recollection
    • (Human)
    • BioSignals
    • Affect
    • Machine Learning
    • Audio
    • Multimodal
    • Facial expressions
    • Emotion
    • METIS-297587
    • HMI-HF: Human Factors

    Cite this

    Janssen, J. H., Tacken, P., de Vries, G. J., van den Broek, E., Westerink, J. H. D. M., Haselager, P., & IJsselsteijn, W. A. (2013). Machines outperform lay persons in recognizing emotions elicited by autobiographical recollection. Human-computer interaction, 28(6), 479-517. https://doi.org/10.1080/07370024.2012.755421