Recently, the field of automatic recognition of users' affective states has gained a great deal of attention. Automatic, implicit recognition of affective states has many applications, ranging from personalized content recommendation to automatic tutoring systems. In this work, we present promising results from our research on the classification of emotions induced by watching music videos. We show robust correlations between users' self-assessments of arousal and valence and the frequency powers of their EEG activity. We present methods for single-trial classification using both EEG and peripheral physiological signals. For EEG, an average (maximum) classification rate of 55.7% (67.0%) for arousal and 58.8% (76.0%) for valence was obtained. For peripheral physiological signals, the results were 58.9% (85.5%) for arousal and 54.2% (78.5%) for valence.
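The abstract's single-trial setup, classifying arousal or valence from EEG frequency-band powers, can be sketched as below. The sampling rate, the band definitions, the synthetic trials, and the nearest-mean classifier are illustrative assumptions for this sketch, not the authors' actual pipeline.

```python
import numpy as np

FS = 128  # assumed sampling rate in Hz

def band_power(signal, fs, low, high):
    """Mean FFT power of `signal` in the [low, high) Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def features(trial, fs=FS):
    """Theta, alpha, and beta band powers as a 3-dim feature vector."""
    return np.array([band_power(trial, fs, lo, hi)
                     for lo, hi in [(4, 8), (8, 13), (13, 30)]])

def nearest_mean_classify(train_X, train_y, test_x):
    """Assign `test_x` to the class whose mean feature vector is closest."""
    classes = np.unique(train_y)
    means = [train_X[train_y == c].mean(axis=0) for c in classes]
    dists = [np.linalg.norm(test_x - m) for m in means]
    return classes[int(np.argmin(dists))]

# Synthetic demo: "high arousal" trials carry more beta-band energy.
rng = np.random.default_rng(0)
t = np.arange(FS * 4) / FS  # 4-second trials

def make_trial(beta_amp):
    return (np.sin(2 * np.pi * 10 * t)               # alpha component
            + beta_amp * np.sin(2 * np.pi * 20 * t)  # beta component
            + 0.1 * rng.standard_normal(len(t)))     # noise

X = np.array([features(make_trial(a)) for a in [0.2] * 5 + [1.5] * 5])
y = np.array([0] * 5 + [1] * 5)  # 0 = low arousal, 1 = high arousal
pred = nearest_mean_classify(X, y, features(make_trial(1.4)))
print(pred)  # the beta-heavy test trial lands in the high-arousal class
```

In practice the published rates above would come from a stronger classifier and cross-validation over many subjects and trials; this sketch only shows the band-power feature idea.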
| Name | Lecture Notes in Computer Science |
|------|-----------------------------------|
| Conference | Proceedings 2010 International Conference on Brain Informatics (BI 2010), Toronto |
| Period | 4/09/10 → … |
- Affective Computing
- Emotion induction
- HMI-MI: MULTIMODAL INTERACTIONS
- Physiological Signals