Abstract
Users react differently to relevant and non-relevant tags associated with content, and these spontaneous reactions can be used to label large multimedia databases. We present a method to assess tag relevance for images using non-verbal bodily responses, namely electroencephalogram (EEG), facial expressions, and eye gaze. We conducted experiments in which 28 images were shown to 28 subjects, once with a correct tag and once with an incorrect tag. The goal of our system is to detect the responses to non-relevant tags and consequently filter those tags out. We therefore trained classifiers to detect tag relevance from the bodily responses and evaluated the performance of our system using a subject-independent approach. Precision at the top 5% and top 10% of detections was calculated, and the results of different modalities and different classifiers were compared. The results show that eye gaze outperforms the other modalities in tag relevance detection, both overall and for top-ranked results.
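The precision-at-top-k% evaluation mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the function name, the toy scores, and the labels are all assumptions. It ranks detections by classifier confidence and measures what fraction of the top k% are true non-relevant-tag responses.

```python
def precision_at_top_percent(scores, labels, percent):
    """Precision among the top `percent`% highest-scoring detections.

    scores: classifier confidence that a tag is non-relevant
    labels: 1 if the tag really was non-relevant, else 0
    """
    # Rank detections from most to least confident.
    ranked = sorted(zip(scores, labels), key=lambda p: p[0], reverse=True)
    # Keep the top percent% of the ranking (at least one detection).
    k = max(1, round(len(ranked) * percent / 100))
    top = ranked[:k]
    # Fraction of kept detections that were truly non-relevant tags.
    return sum(label for _, label in top) / k

# Toy example with 10 detections (illustrative data only).
scores = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
labels = [1,   1,   0,   1,   0,   0,   1,   0,   0,   0]
print(precision_at_top_percent(scores, labels, 10))  # → 1.0
print(precision_at_top_percent(scores, labels, 50))  # → 0.6
```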
Original language | Undefined
---|---
Title of host publication | Proceedings of the 21st ACM international conference on Multimedia, MM 2013
Place of Publication | New York
Publisher | Association for Computing Machinery
Pages | 657-660
Number of pages | 4
ISBN (Print) | 978-1-4503-2404-5
DOIs |
Publication status | Published - Oct 2013
Event | 21st ACM Multimedia Conference, MM 2013, Barcelona, Spain; 21 Oct 2013 → 25 Oct 2013; conference number 21; http://acmmm13.org/general-info/about-acm-multimedia-2013/
Publication series
Name |
---|---
Publisher | ACM
Conference
Conference | 21st ACM Multimedia Conference, MM 2013 |
---|---|
Abbreviated title | MM |
Country/Territory | Spain |
City | Barcelona |
Period | 21/10/13 → 25/10/13 |
Internet address | http://acmmm13.org/general-info/about-acm-multimedia-2013/
Keywords
- EEG
- Implicit tagging
- Facial expressions
- Eye gaze