Ubiquitous Emotion Recognition with Multimodal Mobile Interfaces

Shaun J. Canavan, Marvin Andujar, Lijun Yin, Anton Nijholt, Elizabeth Schotter

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    Abstract

    In 1997, Rosalind Picard introduced the fundamental concepts of affect recognition. Since then, multimodal interfaces such as brain-computer interfaces (BCIs), RGB and depth cameras, and physiological wearables have been used to study human emotion through facial and physiological data. Much of the work in this field focuses on a single modality for recognizing emotion, yet there is a wealth of additional information available when multimodal data are incorporated. With this in mind, the aim of this workshop is to examine current and future research activities and trends in ubiquitous emotion recognition through the fusion of data from various multimodal, mobile devices.
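
    The abstract centers on fusing data from multiple devices for emotion recognition. As a purely illustrative sketch (not taken from the paper), the Python snippet below shows one common way such fusion can be done, a weighted late fusion of per-modality class probabilities; the emotion labels, weights, and scores are all hypothetical.

        # Illustrative only: minimal weighted late fusion of emotion
        # probabilities from two hypothetical modality-specific models.
        EMOTIONS = ["anger", "happiness", "sadness", "surprise"]

        def late_fusion(face_probs, physio_probs, w_face=0.6, w_physio=0.4):
            """Weighted average of class probabilities from two modalities."""
            fused = [w_face * f + w_physio * p
                     for f, p in zip(face_probs, physio_probs)]
            total = sum(fused)
            return [x / total for x in fused]  # renormalize to sum to 1

        face_probs = [0.10, 0.70, 0.10, 0.10]    # e.g. softmax of an RGB face model
        physio_probs = [0.25, 0.40, 0.20, 0.15]  # e.g. output of a wearable-signal model

        fused = late_fusion(face_probs, physio_probs)
        print(EMOTIONS[max(range(len(fused)), key=fused.__getitem__)])  # -> happiness

    In practice, modality weights would typically be learned or validated on held-out data rather than fixed by hand; the fixed values here are only meant to make the fusion step concrete.
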
    Original language: English
    Title of host publication: UbiComp/ISWC 2018 - Adjunct Proceedings of the 2018 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2018 ACM International Symposium on Wearable Computers
    Place of Publication: New York
    Publisher: Association for Computing Machinery (ACM)
    Pages: 937-941
    Number of pages: 5
    ISBN (Electronic): 978-1-4503-5966-5
    DOIs: https://doi.org/10.1145/3267305.3274139
    Publication status: Published - 8 Oct 2018
    Event: 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers - Singapore, Singapore
    Duration: 8 Oct 2018 - 12 Oct 2018
    http://ubicomp.org/ubicomp2018/

    Conference

    Conference: 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers
    Abbreviated title: UbiComp 2018
    Country: Singapore
    City: Singapore
    Period: 8/10/18 - 12/10/18
    Internet address: http://ubicomp.org/ubicomp2018/

  • Cite this

    Canavan, S. J., Andujar, M., Yin, L., Nijholt, A., & Schotter, E. (2018). Ubiquitous Emotion Recognition with Multimodal Mobile Interfaces. In UbiComp/ISWC 2018 - Adjunct Proceedings of the 2018 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2018 ACM International Symposium on Wearable Computers (pp. 937-941). New York: Association for Computing Machinery (ACM). https://doi.org/10.1145/3267305.3274139