A novel dataset for real-life evaluation of facial expression recognition methodologies

Muhammad Hameed Siddiqi, Maqbool Ali, Muhammad Idris, Oresti Banos Legran, Sungyoung Lee, Hyunseung Choo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

1 Citation (Scopus)
2 Downloads (Pure)

Abstract

One limitation seen among most previous methods is that they were evaluated under settings that are far from real-life scenarios. The reason is that existing facial expression recognition (FER) datasets are mostly pose-based and assume a predefined setup: the expressions are recorded using a fixed camera deployment with a constant background and static ambient settings. In a real-life scenario, FER systems are expected to deal with changing ambient conditions, dynamic backgrounds, varying camera angles, different face sizes, and other human-related variations. Accordingly, in this work, three FER datasets are collected over a period of six months, keeping in view the limitations of existing datasets. These datasets are collected from YouTube, real-world talk shows, and real-world interviews. The most widely used FER methodologies are implemented and evaluated on these datasets to analyze their performance in real-life situations.
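
As a rough illustration of what evaluating widely used FER methodologies on such video-derived data typically involves, the following is a minimal sketch of a conventional pipeline: Haar-cascade face detection, uniform-LBP histogram features, and an SVM classifier. This generic pipeline is an assumption made for illustration only, not the authors' implementation, and the frames/labels input format below is a hypothetical placeholder.

# Minimal sketch of a conventional FER evaluation pipeline (face detection,
# LBP histogram features, SVM classifier). Illustrative only; NOT the
# authors' exact methodology. The frames/labels layout is hypothetical.
import cv2
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Haar cascade shipped with OpenCV for frontal-face detection.
FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def lbp_histogram(gray_face, radius=1, n_points=8):
    """Uniform LBP histogram of a cropped grayscale face image."""
    lbp = local_binary_pattern(gray_face, n_points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=n_points + 2,
                           range=(0, n_points + 2), density=True)
    return hist

def extract_face_feature(frame_bgr):
    """Detect the largest face in a frame and return its LBP feature, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep largest detection
    face = cv2.resize(gray[y:y + h, x:x + w], (96, 96))
    return lbp_histogram(face)

def evaluate(frames, labels):
    """frames: list of BGR images; labels: expression labels (hypothetical input)."""
    feats, kept = [], []
    for frame, label in zip(frames, labels):
        f = extract_face_feature(frame)
        if f is not None:  # skip frames where no face is detected
            feats.append(f)
            kept.append(label)
    X_train, X_test, y_train, y_test = train_test_split(
        np.array(feats), np.array(kept), test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf").fit(X_train, y_train)
    return accuracy_score(y_test, clf.predict(X_test))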
Original language: Undefined
Title of host publication: 29th Canadian Conference on Artificial Intelligence, CCAI 2016
Place of Publication: Berlin
Publisher: Springer
Pages: 89-95
Number of pages: 7
ISBN (Print): 978-3-319-34110-1
DOIs: https://doi.org/10.1007/978-3-319-34111-8_12
Publication status: Published - 31 May 2016

Publication series

Name: Lecture notes in artificial intelligence
Publisher: Springer
Volume: 9673

Keywords

  • YouTube
  • Facial expression recognition
  • Feature Selection
  • Feature extraction
  • Recognition
  • METIS-318533
  • IR-101596
  • Real-world
  • EWI-27248

Cite this

Siddiqi, M. H., Ali, M., Idris, M., Banos Legran, O., Lee, S., & Choo, H. (2016). A novel dataset for real-life evaluation of facial expression recognition methodologies. In 29th Canadian Conference on Artificial Intelligence, CCAI 2016 (pp. 89-95). (Lecture notes in artificial intelligence; Vol. 9673). Berlin: Springer. https://doi.org/10.1007/978-3-319-34111-8_12
Siddiqi, Muhammad Hameed ; Ali, Maqbool ; Idris, Muhammad ; Banos Legran, Oresti ; Lee, Sungyoung ; Choo, Hyunseung. / A novel dataset for real-life evaluation of facial expression recognition methodologies. 29th Canadian Conference on Artificial Intelligence, CCAI 2016. Berlin : Springer, 2016. pp. 89-95 (Lecture notes in artificial intelligence).
@inproceedings{7b721337618d4b369d1971738db54b0d,
title = "A novel dataset for real-life evaluation of facial expression recognition methodologies",
abstract = "One limitation seen among most previous methods is that they were evaluated under settings that are far from real-life scenarios. The reason is that existing facial expression recognition (FER) datasets are mostly pose-based and assume a predefined setup: the expressions are recorded using a fixed camera deployment with a constant background and static ambient settings. In a real-life scenario, FER systems are expected to deal with changing ambient conditions, dynamic backgrounds, varying camera angles, different face sizes, and other human-related variations. Accordingly, in this work, three FER datasets are collected over a period of six months, keeping in view the limitations of existing datasets. These datasets are collected from YouTube, real-world talk shows, and real-world interviews. The most widely used FER methodologies are implemented and evaluated on these datasets to analyze their performance in real-life situations.",
keywords = "YouTube, Facial expression recognition, Feature Selection, Feature extraction, Recognition, METIS-318533, IR-101596, Real-world, EWI-27248",
author = "Siddiqi, {Muhammad Hameed} and Maqbool Ali and Muhammad Idris and {Banos Legran}, Oresti and Sungyoung Lee and Hyunseung Choo",
note = "eemcs-eprint-27248",
year = "2016",
month = "5",
day = "31",
doi = "10.1007/978-3-319-34111-8_12",
language = "Undefined",
isbn = "978-3-319-34110-1",
series = "Lecture notes in artificial intelligence",
publisher = "Springer",
pages = "89--95",
booktitle = "29th Canadian Conference on Artificial Intelligence, CCAI 2016",

}

Siddiqi, MH, Ali, M, Idris, M, Banos Legran, O, Lee, S & Choo, H 2016, A novel dataset for real-life evaluation of facial expression recognition methodologies. in 29th Canadian Conference on Artificial Intelligence, CCAI 2016. Lecture notes in artificial intelligence, vol. 9673, Springer, Berlin, pp. 89-95. https://doi.org/10.1007/978-3-319-34111-8_12

A novel dataset for real-life evaluation of facial expression recognition methodologies. / Siddiqi, Muhammad Hameed; Ali, Maqbool; Idris, Muhammad; Banos Legran, Oresti; Lee, Sungyoung; Choo, Hyunseung.

29th Canadian Conference on Artificial Intelligence, CCAI 2016. Berlin : Springer, 2016. p. 89-95 (Lecture notes in artificial intelligence; Vol. 9673).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

TY - GEN

T1 - A novel dataset for real-life evaluation of facial expression recognition methodologies

AU - Siddiqi, Muhammad Hameed

AU - Ali, Maqbool

AU - Idris, Muhammad

AU - Banos Legran, Oresti

AU - Lee, Sungyoung

AU - Choo, Hyunseung

N1 - eemcs-eprint-27248

PY - 2016/5/31

Y1 - 2016/5/31

N2 - One limitation seen among most previous methods is that they were evaluated under settings that are far from real-life scenarios. The reason is that existing facial expression recognition (FER) datasets are mostly pose-based and assume a predefined setup: the expressions are recorded using a fixed camera deployment with a constant background and static ambient settings. In a real-life scenario, FER systems are expected to deal with changing ambient conditions, dynamic backgrounds, varying camera angles, different face sizes, and other human-related variations. Accordingly, in this work, three FER datasets are collected over a period of six months, keeping in view the limitations of existing datasets. These datasets are collected from YouTube, real-world talk shows, and real-world interviews. The most widely used FER methodologies are implemented and evaluated on these datasets to analyze their performance in real-life situations.

AB - One limitation seen among most previous methods is that they were evaluated under settings that are far from real-life scenarios. The reason is that existing facial expression recognition (FER) datasets are mostly pose-based and assume a predefined setup: the expressions are recorded using a fixed camera deployment with a constant background and static ambient settings. In a real-life scenario, FER systems are expected to deal with changing ambient conditions, dynamic backgrounds, varying camera angles, different face sizes, and other human-related variations. Accordingly, in this work, three FER datasets are collected over a period of six months, keeping in view the limitations of existing datasets. These datasets are collected from YouTube, real-world talk shows, and real-world interviews. The most widely used FER methodologies are implemented and evaluated on these datasets to analyze their performance in real-life situations.

KW - YouTube

KW - Facial expression recognition

KW - Feature Selection

KW - Feature extraction

KW - Recognition

KW - METIS-318533

KW - IR-101596

KW - Real-world

KW - EWI-27248

U2 - 10.1007/978-3-319-34111-8_12

DO - 10.1007/978-3-319-34111-8_12

M3 - Conference contribution

SN - 978-3-319-34110-1

T3 - Lecture notes in artificial intelligence

SP - 89

EP - 95

BT - 29th Canadian Conference on Artificial Intelligence, CCAI 2016

PB - Springer

CY - Berlin

ER -

Siddiqi MH, Ali M, Idris M, Banos Legran O, Lee S, Choo H. A novel dataset for real-life evaluation of facial expression recognition methodologies. In 29th Canadian Conference on Artificial Intelligence, CCAI 2016. Berlin: Springer. 2016. p. 89-95. (Lecture notes in artificial intelligence). https://doi.org/10.1007/978-3-319-34111-8_12