Abstract
Touch behavior plays an important role in social interaction. Automatic recognition of social touch is necessary to transfer the touch modality from interpersonal interaction to other areas such as Human-Robot Interaction (HRI). This paper describes a PhD research program on the automatic detection, classification, and interpretation of touch in social interaction between humans and artifacts. Progress thus far includes the recording of a Corpus of Social Touch (CoST), consisting of pressure sensor data for 14 different touch gestures, and initial classification results: classifying these 14 gestures with Bayesian classifiers yielded an overall accuracy of 53%. Further work includes improving the gesture recognition, building an embodied system for real-time classification, and testing this system in a possible application scenario.
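As a rough illustration of the kind of pipeline the abstract describes, the sketch below classifies touch-gesture samples from pressure-sensor frames with a Gaussian Naive Bayes model (one possible Bayesian classifier). The 8×8 sensor grid, the summary features, and the synthetic data are assumptions for illustration only, not the authors' actual setup.

```python
# Hypothetical sketch: classifying touch-gesture pressure frames with a
# Gaussian Naive Bayes classifier. Sensor layout, features, and data are
# illustrative assumptions, not taken from the paper.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def extract_features(frames: np.ndarray) -> np.ndarray:
    """Summarize a (n_frames, 8, 8) pressure sequence with simple statistics."""
    return np.array([
        frames.mean(),               # average pressure over the gesture
        frames.max(),                # peak pressure
        frames.shape[0],             # number of frames (gesture duration)
        frames.mean(axis=0).std(),   # spatial spread of average pressure
    ])

# Placeholder data: 14 gesture classes, 20 synthetic samples each.
X, y = [], []
for label in range(14):
    for _ in range(20):
        n_frames = int(rng.integers(20, 60))
        frames = rng.random((n_frames, 8, 8))
        X.append(extract_features(frames))
        y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = GaussianNB().fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

On random placeholder data the accuracy is near chance; the point is only to show the feature-extraction-plus-Bayesian-classification structure, not to reproduce the reported 53%.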
Original language | Undefined |
---|---|
Title of host publication | Proceedings of the 16th International Conference on Multimodal Interaction, ICMI 2014 |
Place of Publication | New York |
Publisher | Association for Computing Machinery |
Pages | 344-348 |
Number of pages | 5 |
ISBN (Print) | 978-1-4503-2885-2 |
DOIs | |
Publication status | Published - Nov 2014 |
Event | 16th International Conference on Multimodal Interaction, ICMI 2014 - Istanbul, Turkey Duration: 12 Nov 2014 → 16 Nov 2014 Conference number: 16 |
Publication series
Name | |
---|---|
Publisher | ACM |
Conference
Conference | 16th International Conference on Multimodal Interaction, ICMI 2014 |
---|---|
Abbreviated title | ICMI |
Country/Territory | Turkey |
City | Istanbul |
Period | 12/11/14 → 16/11/14 |
Other | 12-16 November 2014 |
Keywords
- Touch gesture recognition
- Touch sensing
- Touch corpus
- Human-Robot Interaction (HRI)
- Social Touch
Datasets
- Corpus of Social Touch (CoST)
  Jung, M. M. (Creator), University of Twente, 1 Jun 2016
  DOI: 10.4121/uuid:5ef62345-3b3e-479c-8e1d-c922748c9b29