For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants (gentle, normal, and rough) on a pressure sensor grid wrapped around a mannequin arm. Recognition of these 14 gesture classes using various classifiers yielded accuracies of up to 60%; moreover, gentle gestures proved harder to classify than normal and rough gestures. We further investigated how different classifiers, interpersonal differences, gesture confusions, and gesture variants affected recognition accuracy. Finally, we present directions for further research to ensure proper transfer of the touch modality from interpersonal interaction to areas such as human–robot interaction (HRI).
- HMI-MI: MULTIMODAL INTERACTIONS
- Social Touch
- Touch gesture recognition
- Touch corpus
Jung, M. M., Poel, M., Poppe, R. W., & Heylen, D. K. J. (2017). Automatic recognition of touch gestures in the corpus of social touch. Journal on Multimodal User Interfaces, 11(1), 81–96. https://doi.org/10.1007/s12193-016-0232-9
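To illustrate the kind of pipeline the abstract describes (feature extraction from pressure-grid frames, then classification of gesture variants), here is a minimal sketch in Python. It is not the paper's method: it uses synthetic data and a simple nearest-centroid rule instead of the classifiers evaluated in the study, and the grid size, frame count, and intensity levels are assumptions chosen purely for illustration.

```python
import random

GRID = 8  # assumed grid resolution, for illustration only

def make_capture(intensity, rng, frames=20):
    """Synthesize one capture: a list of GRID x GRID pressure frames.

    `intensity` loosely stands in for how forceful the touch is
    (gentle/normal/rough); real captures would come from the sensor.
    """
    return [[[max(0.0, rng.gauss(intensity, 0.1)) for _ in range(GRID)]
             for _ in range(GRID)]
            for _ in range(frames)]

def features(capture):
    """Reduce a capture to two crude features: mean and peak pressure."""
    vals = [p for frame in capture for row in frame for p in row]
    return (sum(vals) / len(vals), max(vals))

def centroid(feature_list):
    """Average a list of feature tuples component-wise."""
    n = len(feature_list)
    return tuple(sum(f[i] for f in feature_list) / n for i in range(2))

def classify(feat, centroids):
    """Assign the label whose centroid is nearest in feature space."""
    return min(centroids,
               key=lambda lab: sum((feat[i] - centroids[lab][i]) ** 2
                                   for i in range(2)))

rng = random.Random(0)
# Assumed intensity levels for the three variants (illustrative values).
levels = {"gentle": 0.2, "normal": 0.5, "rough": 0.9}
train = {lab: [features(make_capture(mu, rng)) for _ in range(10)]
         for lab, mu in levels.items()}
centroids = {lab: centroid(fs) for lab, fs in train.items()}

# Classify a held-out high-intensity capture.
pred = classify(features(make_capture(0.9, rng)), centroids)
```

The sketch mirrors the abstract's observation that intensity variants overlap in feature space: with only mean and peak pressure as features, gentle captures sit closest to the noise floor and are the easiest to confuse, which is consistent with gentle gestures being harder to classify.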