Abstract
For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants: gentle, normal and rough, on a pressure sensor grid wrapped around a mannequin arm. Recognition of these 14 gesture classes using various classifiers yielded accuracies of up to 60%; moreover, gentle gestures proved to be harder to classify than normal and rough gestures. We further investigated how different classifiers, interpersonal differences, gesture confusions and gesture variants affected the recognition accuracy. Finally, we present directions for further research to ensure proper transfer of the touch modality from interpersonal interaction to areas such as human–robot interaction (HRI).
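As a rough illustration of the recognition task the abstract describes (a sketch, not the authors' actual pipeline, which the paper details), the snippet below trains a generic classifier on simple per-capture statistics. It assumes each capture is a variable-length sequence of 8×8 pressure frames and substitutes synthetic data for the real CoST captures; the feature set, classifier choice, and data shapes are illustrative assumptions only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def extract_features(capture: np.ndarray) -> np.ndarray:
    """Summarize one capture of shape (n_frames, rows, cols) with simple statistics."""
    per_frame = capture.reshape(len(capture), -1).sum(axis=1)  # total pressure per frame
    return np.array([
        per_frame.mean(),     # average total pressure
        per_frame.max(),      # peak total pressure
        per_frame.std(),      # temporal variability of pressure
        float(len(capture)),  # gesture duration in frames
        capture.max(),        # peak pressure in any single cell
    ])

# Synthetic stand-in for the CoST captures: 14 gesture classes, each a
# variable-length sequence of 8x8 pressure frames. The real data is
# available via the dataset DOI listed at the end of this record.
features, labels = [], []
for gesture in range(14):
    for _ in range(30):
        n_frames = int(rng.integers(20, 120))
        # Shift the pressure distribution slightly per class so the toy
        # problem is learnable; real gestures differ in richer ways.
        capture = rng.gamma(shape=2.0 + 0.3 * gesture, scale=10.0,
                            size=(n_frames, 8, 8))
        features.append(extract_features(capture))
        labels.append(gesture)

X, y = np.stack(features), np.array(labels)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

With features this crude, the toy accuracy is not comparable to the paper's results; the point is only the shape of the pipeline: per-capture feature extraction followed by cross-validated classification. A real evaluation would load captures from the dataset DOI below and consider per-subject splits, since the abstract notes that interpersonal differences affect recognition accuracy.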
| Original language | English |
| --- | --- |
| Pages (from-to) | 81-96 |
| Number of pages | 16 |
| Journal | Journal on Multimodal User Interfaces |
| Volume | 11 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Mar 2017 |
Keywords
- HMI-MI: MULTIMODAL INTERACTIONS
- Social Touch
- Touch gesture recognition
- Touch corpus
Datasets
- Corpus of Social Touch (CoST). Jung, M. M. (Creator), University of Twente, 1 Jun 2016. DOI: 10.4121/uuid:5ef62345-3b3e-479c-8e1d-c922748c9b29 (Dataset)
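The dataset DOI above resolves through the standard doi.org handle service; below is a minimal sketch of following that redirect to the current landing page (requires network access; uses only the Python standard library, with no repository-specific API assumed).

```python
# Resolve the CoST dataset DOI to its current landing page.
# DOIs are persistent identifiers; doi.org redirects to wherever the
# dataset is hosted today.
import urllib.request

doi_url = "https://doi.org/10.4121/uuid:5ef62345-3b3e-479c-8e1d-c922748c9b29"
with urllib.request.urlopen(doi_url) as resp:
    print(resp.url)  # final URL after the DOI redirect chain
```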