Maria Luísa Lima*, Willams De Lima Costa, Estefania Talavera Martínez, Veronica Teichrieb
Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review
Emotion recognition is relevant for human behaviour understanding, and facial expression and speech recognition have been widely explored for this task by the computer vision community. Literature in the field of behavioural psychology indicates that gait, the way a person walks, is an additional indicator of emotions. In this work, we propose a deep framework for emotion recognition through the analysis of gait. More specifically, our model is composed of a sequence of spatial-temporal Graph Convolutional Networks that produce a robust skeleton-based representation for the task of emotion classification. We evaluate our proposed framework on the E-Gait dataset, composed of a total of 2177 samples. The results obtained represent an improvement of ≈ 5% in accuracy compared to the state of the art. In addition, during training we observed faster convergence of our model compared to state-of-the-art methodologies.
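The abstract describes a sequence of spatial-temporal Graph Convolutional Network (ST-GCN) blocks that turn skeleton sequences into an embedding for emotion classification. The sketch below illustrates that general idea in PyTorch; it is not the authors' implementation, and the layer widths, number of joints, number of emotion classes, and the identity-initialised adjacency are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class STGCNBlock(nn.Module):
    """One spatial-temporal graph convolution block over skeleton sequences."""

    def __init__(self, in_channels, out_channels, num_joints, temporal_kernel=9):
        super().__init__()
        # Learnable adjacency initialised to identity; a real model would use
        # the skeleton's joint-connectivity graph here (assumption).
        self.adjacency = nn.Parameter(torch.eye(num_joints))
        self.spatial = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        pad = (temporal_kernel - 1) // 2
        self.temporal = nn.Conv2d(out_channels, out_channels,
                                  kernel_size=(temporal_kernel, 1),
                                  padding=(pad, 0))
        self.relu = nn.ReLU()

    def forward(self, x):
        # x: (batch, channels, frames, joints)
        x = torch.einsum('nctv,vw->nctw', x, self.adjacency)  # aggregate features over the joint graph
        x = self.relu(self.spatial(x))                        # per-frame spatial convolution
        return self.relu(self.temporal(x))                    # temporal convolution over frames


class GaitEmotionNet(nn.Module):
    """Sequence of ST-GCN blocks followed by a small classification head."""

    def __init__(self, num_classes=4, num_joints=16, in_channels=3):
        super().__init__()
        self.blocks = nn.Sequential(
            STGCNBlock(in_channels, 64, num_joints),
            STGCNBlock(64, 128, num_joints),
        )
        self.head = nn.Linear(128, num_classes)

    def forward(self, x):
        x = self.blocks(x)           # (batch, 128, frames, joints)
        x = x.mean(dim=(2, 3))       # global average pooling over time and joints
        return self.head(x)          # emotion logits


# Example: a batch of 8 gait sequences with 3D joint coordinates,
# 75 frames and 16 joints per frame (all sizes are illustrative).
logits = GaitEmotionNet()(torch.randn(8, 3, 75, 16))
```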
| Original language | English |
| --- | --- |
| Title of host publication | 2024 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) |
| Publisher | IEEE |
| Pages | 302-310 |
| Number of pages | 9 |
| ISBN (Electronic) | 9798350365474 |
| DOIs | |
| Publication status | Published - 27 Sept 2024 |
| Event | IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPR 2024 - Seattle, United States. Duration: 16 Jun 2024 → 22 Jun 2024 |
| Name | IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops |
| --- | --- |
| ISSN (Print) | 2160-7508 |
| ISSN (Electronic) | 2160-7516 |
| Conference | IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPR 2024 |
| --- | --- |
| Abbreviated title | CVPR 2024 |
| Country/Territory | United States |
| City | Seattle |
| Period | 16/06/24 → 22/06/24 |