Touch Challenge ‘15: Recognizing Social Touch Gestures

Merel Madeleine Jung, Xi Laura Cang, Mannes Poel, Karon E. MacLean

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic

    16 Citations (Scopus)
    125 Downloads (Pure)

    Abstract

    Advances in the field of touch recognition could open up applications for touch-based interaction in areas such as Human-Robot Interaction (HRI). We extended this challenge to the research community working on multimodal interaction with the goal of sparking interest in the touch modality and of promoting exploration of data processing techniques from other, more mature modalities for touch recognition. Two data sets were made available containing labeled pressure sensor data of social touch gestures that were performed by touching a touch-sensitive surface with the hand. Each set was collected from similar sensor grids, but under conditions reflecting different application orientations: CoST (Corpus of Social Touch) and HAART (the Human-Animal Affective Robot Touch gesture set). In this paper we describe the challenge protocol and summarize the results from the touch challenge hosted in conjunction with the 2015 ACM International Conference on Multimodal Interaction (ICMI). The most important outcomes of the challenge were: (1) transferring techniques from other modalities, such as image processing, speech, and human action recognition, provided valuable feature sets; (2) gesture classification confusions were similar despite the various data processing methods used.
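
    As a rough, hypothetical illustration of the kind of pipeline challenge entries built (not code from the challenge itself), the sketch below extracts simple statistical features from a sequence of pressure-grid frames and feeds them to an off-the-shelf classifier. The grid size, gesture labels, and synthetic stand-in data are assumptions made for this example, not the actual CoST/HAART specifications.

    # Hypothetical sketch (not the challenge's own code): classify touch gestures
    # from sequences of pressure-grid frames using simple statistical features
    # and a random forest. Grid size, gesture labels, frame counts, and the
    # synthetic data below are illustrative assumptions, not CoST/HAART specs.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    GRID = 8                                         # assumed square pressure-sensor grid
    GESTURES = ["pat", "stroke", "tickle", "poke"]   # illustrative label subset

    def extract_features(frames):
        """Fixed-length feature vector from a (n_frames, GRID, GRID) pressure sequence."""
        frame_sum = frames.sum(axis=(1, 2))                     # total pressure per frame
        frame_area = (frames > frames.mean()).sum(axis=(1, 2))  # rough contact-area proxy
        return np.array([
            frames.mean(), frames.max(), frames.std(),          # global pressure statistics
            frame_sum.mean(), frame_sum.std(),                  # temporal variation of pressure
            frame_area.mean(),                                   # average contact area
            float(len(frames)),                                  # gesture duration in frames
        ])

    # Synthetic stand-in for labeled gesture recordings of varying length.
    rng = np.random.default_rng(0)
    X = np.array([extract_features(rng.random((int(rng.integers(20, 100)), GRID, GRID)))
                  for _ in range(200)])
    y = rng.integers(0, len(GESTURES), size=200)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())   # chance-level accuracy on random data
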
    Original language: Undefined
    Title of host publication: Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, ICMI 2015
    Place of publication: New York
    Publisher: Association for Computing Machinery (ACM)
    Pages: 387-390
    Number of pages: 4
    ISBN (Print): 978-1-4503-3912-4
    DOIs: 10.1145/2818346.2829993
    Publication status: Published - Nov 2015
    Event: 17th ACM International Conference on Multimodal Interaction, ICMI 2015 - Seattle, United States
    Duration: 9 Nov 2015 - 13 Nov 2015
    Conference number: 17

    Publication series

    Publisher: ACM

    Conference

    Conference: 17th ACM International Conference on Multimodal Interaction, ICMI 2015
    Abbreviated title: ICMI
    Country: United States
    City: Seattle
    Period: 9/11/15 - 13/11/15

    Keywords

    • EWI-26507
    • IR-98396
    • METIS-315060

    Cite this

    Jung, M. M., Cang, X. L., Poel, M., & MacLean, K. E. (2015). Touch Challenge ‘15: Recognizing Social Touch Gestures. In Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, ICMI 2015 (pp. 387-390). New York: Association for Computing Machinery (ACM). https://doi.org/10.1145/2818346.2829993
    Jung, Merel Madeleine ; Cang, Xi Laura ; Poel, Mannes ; MacLean, Karon E. / Touch Challenge ‘15: Recognizing Social Touch Gestures. Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, ICMI 2015. New York: Association for Computing Machinery (ACM), 2015. pp. 387-390
    @inproceedings{2bd70ce8514b485b9917658aff46702d,
    title = "Touch Challenge ‘15: Recognizing Social Touch Gestures",
    abstract = "Advances in the field of touch recognition could open up applications for touch-based interaction in areas such as Human-Robot Interaction (HRI). We extended this challenge to the research community working on multimodal interaction with the goal of sparking interest in the touch modality and of promoting exploration of data processing techniques from other, more mature modalities for touch recognition. Two data sets were made available containing labeled pressure sensor data of social touch gestures that were performed by touching a touch-sensitive surface with the hand. Each set was collected from similar sensor grids, but under conditions reflecting different application orientations: CoST (Corpus of Social Touch) and HAART (the Human-Animal Affective Robot Touch gesture set). In this paper we describe the challenge protocol and summarize the results from the touch challenge hosted in conjunction with the 2015 ACM International Conference on Multimodal Interaction (ICMI). The most important outcomes of the challenge were: (1) transferring techniques from other modalities, such as image processing, speech, and human action recognition, provided valuable feature sets; (2) gesture classification confusions were similar despite the various data processing methods used.",
    keywords = "EWI-26507, IR-98396, METIS-315060",
    author = "Jung, {Merel Madeleine} and Cang, {Xi Laura} and Mannes Poel and MacLean, {Karon E.}",
    note = "eemcs-eprint-26507",
    year = "2015",
    month = "11",
    doi = "10.1145/2818346.2829993",
    language = "Undefined",
    isbn = "978-1-4503-3912-4",
    publisher = "Association for Computing Machinery (ACM)",
    pages = "387--390",
    booktitle = "Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, ICMI 2015",
    address = "United States",

    }

    Jung, MM, Cang, XL, Poel, M & MacLean, KE 2015, Touch Challenge ‘15: Recognizing Social Touch Gestures. in Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, ICMI 2015. Association for Computing Machinery (ACM), New York, pp. 387-390, 17th ACM International Conference on Multimodal Interaction, ICMI 2015, Seattle, United States, 9/11/15. https://doi.org/10.1145/2818346.2829993

    Touch Challenge ‘15: Recognizing Social Touch Gestures. / Jung, Merel Madeleine; Cang, Xi Laura; Poel, Mannes; MacLean, Karon E.

    Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, ICMI 2015. New York: Association for Computing Machinery (ACM), 2015. pp. 387-390.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic

    TY - GEN

    T1 - Touch Challenge ‘15: Recognizing Social Touch Gestures

    AU - Jung, Merel Madeleine

    AU - Cang, Xi Laura

    AU - Poel, Mannes

    AU - MacLean, Karon E.

    N1 - eemcs-eprint-26507

    PY - 2015/11

    Y1 - 2015/11

    N2 - Advances in the field of touch recognition could open up applications for touch-based interaction in areas such as Human-Robot Interaction (HRI). We extended this challenge to the research community working on multimodal interaction with the goal of sparking interest in the touch modality and of promoting exploration of data processing techniques from other, more mature modalities for touch recognition. Two data sets were made available containing labeled pressure sensor data of social touch gestures that were performed by touching a touch-sensitive surface with the hand. Each set was collected from similar sensor grids, but under conditions reflecting different application orientations: CoST (Corpus of Social Touch) and HAART (the Human-Animal Affective Robot Touch gesture set). In this paper we describe the challenge protocol and summarize the results from the touch challenge hosted in conjunction with the 2015 ACM International Conference on Multimodal Interaction (ICMI). The most important outcomes of the challenge were: (1) transferring techniques from other modalities, such as image processing, speech, and human action recognition, provided valuable feature sets; (2) gesture classification confusions were similar despite the various data processing methods used.

    AB - Advances in the field of touch recognition could open up applications for touch-based interaction in areas such as Human-Robot Interaction (HRI). We extended this challenge to the research community working on multimodal interaction with the goal of sparking interest in the touch modality and of promoting exploration of data processing techniques from other, more mature modalities for touch recognition. Two data sets were made available containing labeled pressure sensor data of social touch gestures that were performed by touching a touch-sensitive surface with the hand. Each set was collected from similar sensor grids, but under conditions reflecting different application orientations: CoST (Corpus of Social Touch) and HAART (the Human-Animal Affective Robot Touch gesture set). In this paper we describe the challenge protocol and summarize the results from the touch challenge hosted in conjunction with the 2015 ACM International Conference on Multimodal Interaction (ICMI). The most important outcomes of the challenge were: (1) transferring techniques from other modalities, such as image processing, speech, and human action recognition, provided valuable feature sets; (2) gesture classification confusions were similar despite the various data processing methods used.

    KW - EWI-26507

    KW - IR-98396

    KW - METIS-315060

    U2 - 10.1145/2818346.2829993

    DO - 10.1145/2818346.2829993

    M3 - Conference contribution

    SN - 978-1-4503-3912-4

    SP - 387

    EP - 390

    BT - Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, ICMI 2015

    PB - Association for Computing Machinery (ACM)

    CY - New York

    ER -

    Jung MM, Cang XL, Poel M, MacLean KE. Touch Challenge ‘15: Recognizing Social Touch Gestures. In Proceedings of the 2015 ACM on International Conference on Multimodal Interaction, ICMI 2015. New York: Association for Computing Machinery (ACM). 2015. p. 387-390. https://doi.org/10.1145/2818346.2829993