Abstract
Affective states such as moods and emotions are an integral part of human nature: they shape our thoughts, govern our behavior, and influence our interpersonal relationships. Recent decades have seen a growing interest in the automatic detection of such states from voice, facial expression, and physiological signals, primarily with the goal of enhancing human-computer interaction with an affective component. With the advent of brain-computer interface research, the idea of affective brain-computer interfaces (aBCI), enabling affect detection from brain signals, arose. In this article, we set out to survey the field of neurophysiology-based affect detection. We outline possible applications of aBCI within a general taxonomy of brain-computer interface approaches and introduce the core concepts of affect and their neurophysiological fundamentals. We show that there is a growing body of literature evidencing the capabilities, but also the limitations and challenges, of affect detection from neurophysiological activity.
| Original language | English |
|---|---|
| Pages (from-to) | 66-84 |
| Number of pages | 19 |
| Journal | Brain-Computer Interfaces |
| Volume | 1 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 24 May 2014 |
Keywords
- fNIRS
- EEG
- Emotions
- HMI-MI: MULTIMODAL INTERACTIONS
- Affect
- Brain-Computer Interfaces
- Moods