Automatic recognition of touch gestures in the corpus of social touch

Abstract

For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants: gentle, normal and rough on a pressure sensor grid wrapped around a mannequin arm. Recognition of these 14 gesture classes using various classifiers yielded accuracies up to 60 %; moreover, gentle gestures proved to be harder to classify than normal and rough gestures. We further investigated how different classifiers, interpersonal differences, gesture confusions and gesture variants affected the recognition accuracy. Finally, we present directions for further research to ensure proper transfer of the touch modality from interpersonal interaction to areas such as human–robot interaction (HRI).
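As a rough illustration of the recognition task the abstract describes, the Python sketch below reduces pressure-frame sequences to a handful of summary features and cross-validates an off-the-shelf classifier. This is a minimal sketch, not the paper's method: the (num_frames, 8, 8) capture shape, the feature set, and the random-forest classifier are assumptions made for illustration only.

```python
# Hypothetical sketch of gesture classification from pressure-sensor captures.
# Assumes each capture is a (num_frames, 8, 8) NumPy array of pressure values;
# the sensor layout, features, and classifier are illustrative, not CoST's method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def summary_features(capture: np.ndarray) -> np.ndarray:
    """Reduce a variable-length pressure sequence to a fixed-length vector."""
    frame_sums = capture.reshape(len(capture), -1).sum(axis=1)  # total pressure per frame
    return np.array([
        frame_sums.mean(),                  # average overall pressure
        frame_sums.max(),                   # peak overall pressure
        frame_sums.std(),                   # pressure variability over time
        len(capture),                       # gesture duration in frames
        capture.max(),                      # strongest single-cell reading
        (capture > capture.mean()).mean(),  # fraction of "active" cells
    ])

def evaluate(captures, labels):
    """10-fold cross-validated accuracy over the 14 gesture classes."""
    X = np.stack([summary_features(c) for c in captures])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    return cross_val_score(clf, X, labels, cv=10).mean()
```

With richer features (spatial pressure distribution, temporal dynamics) and per-subject splits, such a pipeline could probe the interpersonal differences and gentle/normal/rough variants the paper analyzes.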
Original language: English
Pages (from-to): 81-96
Number of pages: 16
Journal: Journal on multimodal user interfaces
Volume: 11
Issue number: 1
DOI: 10.1007/s12193-016-0232-9
State: Published - Mar 2017

Fingerprint

  • Classifiers
  • Human robot interaction
  • Pressure sensors
  • Robots

Keywords

  • HMI-MI: MULTIMODAL INTERACTIONS
  • Social Touch
  • EWI-27340
  • IR-102549
  • Touch gesture recognition
  • METIS-319467
  • Touch corpus

Cite this

Jung, Merel Madeleine; Poel, Mannes; Poppe, Ronald Walter; Heylen, Dirk K.J. / Automatic recognition of touch gestures in the corpus of social touch. In: Journal on multimodal user interfaces, Vol. 11, No. 1, 03.2017, p. 81-96.

Research output: Scientific - peer-review › Article

@article{fcc5962eef314dd6952abda3b86d90a0,
  title = "Automatic recognition of touch gestures in the corpus of social touch",
  abstract = "For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants: gentle, normal and rough on a pressure sensor grid wrapped around a mannequin arm. Recognition of these 14 gesture classes using various classifiers yielded accuracies up to 60 %; moreover, gentle gestures proved to be harder to classify than normal and rough gestures. We further investigated how different classifiers, interpersonal differences, gesture confusions and gesture variants affected the recognition accuracy. Finally, we present directions for further research to ensure proper transfer of the touch modality from interpersonal interaction to areas such as human–robot interaction (HRI).",
  keywords = "HMI-MI: MULTIMODAL INTERACTIONS, Social Touch, EWI-27340, IR-102549, Touch gesture recognition, METIS-319467, Touch corpus",
  author = "Jung, {Merel Madeleine} and Mannes Poel and Poppe, {Ronald Walter} and Heylen, {Dirk K.J.}",
  year = "2017",
  month = "3",
  doi = "10.1007/s12193-016-0232-9",
  volume = "11",
  pages = "81--96",
  journal = "Journal on multimodal user interfaces",
  issn = "1783-7677",
  publisher = "Springer Verlag",
  number = "1",
}

TY - JOUR
T1 - Automatic recognition of touch gestures in the corpus of social touch
AU - Jung, Merel Madeleine
AU - Poel, Mannes
AU - Poppe, Ronald Walter
AU - Heylen, Dirk K.J.
PY - 2017/3
Y1 - 2017/3
N2 - For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants: gentle, normal and rough on a pressure sensor grid wrapped around a mannequin arm. Recognition of these 14 gesture classes using various classifiers yielded accuracies up to 60 %; moreover, gentle gestures proved to be harder to classify than normal and rough gestures. We further investigated how different classifiers, interpersonal differences, gesture confusions and gesture variants affected the recognition accuracy. Finally, we present directions for further research to ensure proper transfer of the touch modality from interpersonal interaction to areas such as human–robot interaction (HRI).
AB - For an artifact such as a robot or a virtual agent to respond appropriately to human social touch behavior, it should be able to automatically detect and recognize touch. This paper describes the data collection of CoST: Corpus of Social Touch, a data set containing 7805 captures of 14 different social touch gestures. All touch gestures were performed in three variants: gentle, normal and rough on a pressure sensor grid wrapped around a mannequin arm. Recognition of these 14 gesture classes using various classifiers yielded accuracies up to 60 %; moreover, gentle gestures proved to be harder to classify than normal and rough gestures. We further investigated how different classifiers, interpersonal differences, gesture confusions and gesture variants affected the recognition accuracy. Finally, we present directions for further research to ensure proper transfer of the touch modality from interpersonal interaction to areas such as human–robot interaction (HRI).
KW - HMI-MI: MULTIMODAL INTERACTIONS
KW - Social Touch
KW - EWI-27340
KW - IR-102549
KW - Touch gesture recognition
KW - METIS-319467
KW - Touch corpus
U2 - 10.1007/s12193-016-0232-9
DO - 10.1007/s12193-016-0232-9
M3 - Article
VL - 11
SP - 81
EP - 96
JO - Journal on multimodal user interfaces
T2 - Journal on multimodal user interfaces
JF - Journal on multimodal user interfaces
SN - 1783-7677
IS - 1
ER -