Ubiquitous emotion-aware computing

Egon van den Broek

    Research output: Contribution to journal › Article › Academic › peer-review

    30 Citations (Scopus)
    75 Downloads (Pure)

    Abstract

    Emotions are a crucial element for personal and ubiquitous computing. What to sense and how to sense it, however, remain a challenge. This study explores the rare combination of speech, electrocardiogram, and a revised Self-Assessment Mannequin to assess people’s emotions. Forty people watched 30 International Affective Picture System pictures in either an office or a living-room environment. Additionally, their personality traits neuroticism and extroversion and demographic information (i.e., gender, nationality, and level of education) were recorded. The resulting data were analyzed using both basic emotion categories and the valence–arousal model, which enabled a comparison between the two representations. The combination of heart rate variability and three speech measures (i.e., variability of the fundamental frequency of pitch (F0), intensity, and energy) explained 90% (p < .001) of the participants’ experienced valence–arousal, with 88% for valence and 99% for arousal (ps < .001). The six basic emotions could also be discriminated (p < .001), although the explained variance was much lower: 18–20%. Environment (or context), the personality trait neuroticism, and gender proved useful when a nuanced assessment of people’s emotions was needed. Taken together, this study provides a significant leap toward robust, generic, and ubiquitous emotion-aware computing.
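    The four features named in the abstract (heart rate variability plus the variability of F0, intensity, and energy) could be sketched roughly as below. The abstract does not specify the exact estimators used, so SDNN (the sample standard deviation of RR intervals) for heart rate variability and plain sample standard deviations for the speech measures are assumptions made here for illustration; the toy signal values are invented.

    ```python
    import numpy as np

    def hrv_sdnn(rr_intervals_ms):
        """Heart rate variability as SDNN: the sample standard
        deviation of successive RR (beat-to-beat) intervals in ms."""
        return float(np.std(rr_intervals_ms, ddof=1))

    def speech_variability(f0_hz, intensity_db, energy):
        """Variability of the three speech measures as sample
        standard deviations over one utterance or window."""
        return {
            "f0_var": float(np.std(f0_hz, ddof=1)),
            "intensity_var": float(np.std(intensity_db, ddof=1)),
            "energy_var": float(np.std(energy, ddof=1)),
        }

    # Toy signals (hypothetical values, for illustration only)
    rr = [812.0, 790.0, 845.0, 801.0, 830.0]   # RR intervals, ms
    f0 = [118.0, 122.5, 130.1, 125.4]          # fundamental frequency, Hz
    intensity = [62.1, 65.3, 60.8, 63.9]       # intensity, dB
    energy = [0.42, 0.55, 0.38, 0.47]          # short-time energy, arbitrary units

    features = {"hrv_sdnn": hrv_sdnn(rr), **speech_variability(f0, intensity, energy)}
    ```

    In a setup like the one described, such per-window feature vectors would then feed a regression on the valence–arousal ratings or a classifier over the six basic emotion categories.
    
    
    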
    Original language: Undefined
    Pages (from-to): 53-67
    Number of pages: 15
    Journal: Personal and Ubiquitous Computing
    Volume: 17
    Issue number: 1
    DOIs
    Publication status: Published - Jan 2013

    Keywords

    • Emotion
    • Features
    • Heart rate variability
    • Speech
    • Personality
    • Ubiquitous computing
    • Unobtrusive sensing
