Over the last decade, an increasing number of studies have focused on automated recognition of human emotions by machines. However, the performance reported in machine emotion recognition studies is difficult to interpret because no benchmarks have been established. In order to provide such a benchmark, we compared machine with human emotion recognition. We gathered facial expressions, speech, and physiological signals from 17 individuals expressing 5 different emotional states. Support vector machines achieved an 82% recognition accuracy based on physiological and facial features. In experiments with 75 humans on the same data, a maximum recognition accuracy of 62.8% was obtained. As machines outperformed humans, automated emotion recognition might be ready to be tested in more practical applications.
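The classification setup described above (a support vector machine trained on combined physiological and facial features, evaluated for recognition accuracy across 5 emotion classes) can be sketched as follows. This is a minimal illustration using synthetic stand-in data, not the study's actual features, dimensions, or hyperparameters:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical stand-in data: each row is a feature vector combining
# physiological and facial measurements; dimensions are illustrative.
rng = np.random.default_rng(0)
n_per_class, n_features, n_classes = 40, 20, 5  # 5 emotional states
X = np.vstack([
    rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
    for c in range(n_classes)
])
y = np.repeat(np.arange(n_classes), n_per_class)

# SVM on standardized features, with cross-validated recognition accuracy.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean recognition accuracy: {scores.mean():.2f}")
```

In a real multimodal setting, cross-validation would typically be split by participant rather than by sample, so that accuracy reflects generalization to unseen individuals.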
- HMI-CI: Computational Intelligence
- HMI-SLT: Speech and Language Technology
- HMI-MI: Multimodal Interactions
- Autobiographical recollection
- Machine learning
- Facial expressions
- HMI-HF: Human Factors