A Multi-layer Hybrid Framework for Dimensional Emotion Classification

Mihalis A. Nicolaou, Hatice Gunes, Maja Pantic

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

22 Citations (Scopus)

Abstract

This paper investigates dimensional emotion prediction and classification from naturalistic facial expressions. As with many pattern recognition problems, dimensional emotion classification requires generating multi-dimensional outputs. To date, classification for the valence and arousal dimensions has been done separately, assuming that they are independent. However, various psychological findings suggest that these dimensions are correlated. We therefore propose a novel, multi-layer hybrid framework for emotion classification that is able to model inter-dimensional correlations. Firstly, we derive a novel geometric feature set based on the (a)symmetric spatio-temporal characteristics of facial expressions. Subsequently, we use the proposed feature set to train a multi-layer hybrid framework composed of a temporal regression layer for predicting emotion dimensions, a graphical model layer for modeling valence-arousal correlations, and a final classification and fusion layer exploiting informative statistics extracted from the lower layers. This framework (i) introduces the Auto-Regressive Coupled HMM (ACHMM), a graphical model specifically tailored not only to accommodate inter-dimensional correlations but also to exploit the internal dynamics of the actual observations, and (ii) replaces the commonly used Maximum Likelihood principle with a more robust final classification and fusion layer. Subject-independent experimental validation, performed on a naturalistic set of facial expressions, demonstrates the effectiveness of the derived feature set, and the robustness and flexibility of the proposed framework.
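
To make the layered design described in the abstract easier to picture, the following minimal Python/NumPy sketch mirrors its structure under simplifying assumptions. It is not the authors' implementation: closed-form ridge regression stands in for the temporal regression layer, a simple joint moving-average smoother stands in for the ACHMM graphical-model layer, and a sign-threshold rule stands in for the final classification and fusion layer. All function names, parameters, and data are hypothetical.

# Minimal sketch of a three-layer valence/arousal pipeline (illustrative only).
import numpy as np

def fit_ridge(X, y, lam=1.0):
    # Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def layer1_regression(X_train, Y_train, X_test, lam=1.0):
    # Layer 1: one regressor per dimension (column 0: valence, column 1: arousal)
    preds = []
    for dim in range(Y_train.shape[1]):
        w = fit_ridge(X_train, Y_train[:, dim], lam)
        preds.append(X_test @ w)
    return np.stack(preds, axis=1)                # shape: (frames, 2)

def layer2_coupling(preds, window=5):
    # Layer 2 (stand-in for the ACHMM): refine each frame using the recent
    # history of *both* dimensions, mimicking valence-arousal coupling.
    refined = np.copy(preds)
    for t in range(preds.shape[0]):
        lo = max(0, t - window)
        refined[t] = 0.5 * preds[t] + 0.5 * preds[lo:t + 1].mean(axis=0)
    return refined

def layer3_classify(refined):
    # Layer 3: map refined continuous predictions to discrete classes
    # (here simply the sign: 1 = positive, 0 = negative).
    return (refined >= 0).astype(int)

# Toy usage with random data standing in for geometric facial features.
rng = np.random.default_rng(0)
X_train, X_test = rng.normal(size=(200, 10)), rng.normal(size=(50, 10))
Y_train = rng.uniform(-1, 1, size=(200, 2))       # continuous valence/arousal labels
labels = layer3_classify(layer2_coupling(layer1_regression(X_train, Y_train, X_test)))
print(labels[:5])
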
Original language: Undefined
Title of host publication: MM '11 : Proceedings of the 19th ACM International Conference on Multimedia
Place of Publication: New York
Publisher: Association for Computing Machinery (ACM)
Pages: 933-936
Number of pages: 4
ISBN (Print): 978-1-4503-0616-4
DOI: https://doi.org/10.1145/2072298.2071906
Publication status: Published - 28 Nov 2011
Event: 19th ACM Multimedia Conference, MM 2011 - Scottsdale, United States
Duration: 28 Nov 2011 - 1 Dec 2011
Conference number: 19
Internet address: http://www.acmmm11.org/

Publication series

Publisher: ACM

Conference

Conference: 19th ACM Multimedia Conference, MM 2011
Abbreviated title: MM
Country: United States
City: Scottsdale
Period: 28/11/11 - 1/12/11
Internet address: http://www.acmmm11.org/

Keywords

  • METIS-285011
  • Experimentation
  • Classifier design and evaluation
  • IR-79394
  • EWI-21297
  • Feature evaluation and selection
  • EC Grant Agreement nr.: ERC/203143
  • HMI-MI: MULTIMODAL INTERACTIONS
  • Human Factors

Cite this

Nicolaou, M. A., Gunes, H., & Pantic, M. (2011). A Multi-layer Hybrid Framework for Dimensional Emotion Classification. In MM '11 : Proceedings of the 19th ACM International Conference on Multimedia (pp. 933-936). New York: Association for Computing Machinery (ACM). https://doi.org/10.1145/2072298.2071906
Nicolaou, Mihalis A. ; Gunes, Hatice ; Pantic, Maja. / A Multi-layer Hybrid Framework for Dimensional Emotion Classification. MM '11 : Proceedings of the 19th ACM International Conference on Multimedia. New York : Association for Computing Machinery (ACM), 2011. pp. 933-936
@inproceedings{385251c16ffb467183dbb101c353cc05,
title = "A Multi-layer Hybrid Framework for Dimensional Emotion Classification",
abstract = "This paper investigates dimensional emotion prediction and classification from naturalistic facial expressions. Similarly to many pattern recognition problems, dimensional emotion classification requires generating multi-dimensional outputs. To date, classification for valence and arousal dimensions has been done separately, assuming that they are independent. However, various psychological findings suggest that these dimensions are correlated. We therefore propose a novel, multi-layer hybrid framework for emotion classification that is able to model inter-dimensional correlations. Firstly, we derive a novel geometric feature set based on the (a)symmetric spatio-temporal characteristics of facial expressions. Subsequently, we use the proposed feature set to train a multi-layer hybrid framework composed of a tem- poral regression layer for predicting emotion dimensions, a graphical model layer for modeling valence-arousal correlations, and a final classification and fusion layer exploiting informative statistics extracted from the lower layers. This framework (i) introduces the Auto-Regressive Coupled HMM (ACHMM), a graphical model specifically tailored to accommodate not only inter-dimensional correlations but also to exploit the internal dynamics of the actual observations, and (ii) replaces the commonly used Maximum Likelihood principle with a more robust final classification and fusion layer. Subject-independent experimental validation, performed on a naturalistic set of facial expressions, demonstrates the effectiveness of the derived feature set, and the robustness and flexibility of the proposed framework.",
keywords = "METIS-285011, Experimentation, Classifier design and evaluation, IR-79394, EWI-21297, Feature evaluation and selection, EC Grant Agreement nr.: ERC/203143, HMI-MI: MULTIMODAL INTERACTIONS, Human Factors",
author = "Nicolaou, {Mihalis A.} and Hatice Gunes and Maja Pantic",
note = "eemcs-eprint-21297",
year = "2011",
month = "11",
day = "28",
doi = "10.1145/2072298.2071906",
language = "Undefined",
isbn = "978-1-4503-0616-4",
publisher = "Association for Computing Machinery (ACM)",
pages = "933--936",
booktitle = "MM '11 : Proceedings of the 19th ACM International Conference on Multimedia",
address = "United States",

}

Nicolaou, MA, Gunes, H & Pantic, M 2011, A Multi-layer Hybrid Framework for Dimensional Emotion Classification. in MM '11 : Proceedings of the 19th ACM International Conference on Multimedia. Association for Computing Machinery (ACM), New York, pp. 933-936, 19th ACM Multimedia Conference, MM 2011, Scottsdale, United States, 28/11/11. https://doi.org/10.1145/2072298.2071906

A Multi-layer Hybrid Framework for Dimensional Emotion Classification. / Nicolaou, Mihalis A.; Gunes, Hatice; Pantic, Maja.

MM '11 : Proceedings of the 19th ACM International Conference on Multimedia. New York : Association for Computing Machinery (ACM), 2011. p. 933-936.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

TY - GEN

T1 - A Multi-layer Hybrid Framework for Dimensional Emotion Classification

AU - Nicolaou, Mihalis A.

AU - Gunes, Hatice

AU - Pantic, Maja

N1 - eemcs-eprint-21297

PY - 2011/11/28

Y1 - 2011/11/28

N2 - This paper investigates dimensional emotion prediction and classification from naturalistic facial expressions. As with many pattern recognition problems, dimensional emotion classification requires generating multi-dimensional outputs. To date, classification for the valence and arousal dimensions has been done separately, assuming that they are independent. However, various psychological findings suggest that these dimensions are correlated. We therefore propose a novel, multi-layer hybrid framework for emotion classification that is able to model inter-dimensional correlations. Firstly, we derive a novel geometric feature set based on the (a)symmetric spatio-temporal characteristics of facial expressions. Subsequently, we use the proposed feature set to train a multi-layer hybrid framework composed of a temporal regression layer for predicting emotion dimensions, a graphical model layer for modeling valence-arousal correlations, and a final classification and fusion layer exploiting informative statistics extracted from the lower layers. This framework (i) introduces the Auto-Regressive Coupled HMM (ACHMM), a graphical model specifically tailored not only to accommodate inter-dimensional correlations but also to exploit the internal dynamics of the actual observations, and (ii) replaces the commonly used Maximum Likelihood principle with a more robust final classification and fusion layer. Subject-independent experimental validation, performed on a naturalistic set of facial expressions, demonstrates the effectiveness of the derived feature set, and the robustness and flexibility of the proposed framework.

AB - This paper investigates dimensional emotion prediction and classification from naturalistic facial expressions. As with many pattern recognition problems, dimensional emotion classification requires generating multi-dimensional outputs. To date, classification for the valence and arousal dimensions has been done separately, assuming that they are independent. However, various psychological findings suggest that these dimensions are correlated. We therefore propose a novel, multi-layer hybrid framework for emotion classification that is able to model inter-dimensional correlations. Firstly, we derive a novel geometric feature set based on the (a)symmetric spatio-temporal characteristics of facial expressions. Subsequently, we use the proposed feature set to train a multi-layer hybrid framework composed of a temporal regression layer for predicting emotion dimensions, a graphical model layer for modeling valence-arousal correlations, and a final classification and fusion layer exploiting informative statistics extracted from the lower layers. This framework (i) introduces the Auto-Regressive Coupled HMM (ACHMM), a graphical model specifically tailored not only to accommodate inter-dimensional correlations but also to exploit the internal dynamics of the actual observations, and (ii) replaces the commonly used Maximum Likelihood principle with a more robust final classification and fusion layer. Subject-independent experimental validation, performed on a naturalistic set of facial expressions, demonstrates the effectiveness of the derived feature set, and the robustness and flexibility of the proposed framework.

KW - METIS-285011

KW - Experimentation

KW - Classifier design and evaluation

KW - IR-79394

KW - EWI-21297

KW - Feature evaluation and selection

KW - EC Grant Agreement nr.: ERC/203143

KW - HMI-MI: MULTIMODAL INTERACTIONS

KW - Human Factors

U2 - 10.1145/2072298.2071906

DO - 10.1145/2072298.2071906

M3 - Conference contribution

SN - 978-1-4503-0616-4

SP - 933

EP - 936

BT - MM '11 : Proceedings of the 19th ACM International Conference on Multimedia

PB - Association for Computing Machinery (ACM)

CY - New York

ER -

Nicolaou MA, Gunes H, Pantic M. A Multi-layer Hybrid Framework for Dimensional Emotion Classification. In MM '11 : Proceedings of the 19th ACM International Conference on Multimedia. New York: Association for Computing Machinery (ACM). 2011. p. 933-936 https://doi.org/10.1145/2072298.2071906