A First Step toward the Automatic Understanding of Social Touch for Naturalistic Human–Robot Interaction

    Research output: Contribution to journal › Article › Academic › peer-review

    1 Citation (Scopus)
    56 Downloads (Pure)

    Abstract

    Social robots should be able to automatically understand and respond to human touch. The meaning of touch depends not only on its form but also on the context in which it takes place. To gain more insight into the factors relevant to interpreting the meaning of touch within a social context, we elicited touch behaviors by letting participants interact with a robot pet companion in different affective scenarios. In a contextualized lab setting, participants (n = 31) acted as if they were coming home in different emotional states (i.e., stressed, depressed, relaxed, and excited) without being given specific instructions on the kinds of behaviors they should display. Based on video footage of the interactions and on interviews, we explored the touch behaviors used, the social messages expressed, and the expected robot pet responses. Results show that emotional state influenced the social messages communicated to the robot pet as well as the expected responses. Furthermore, participants used multimodal cues to communicate with the robot pet: they often talked to it while touching it and making eye contact. Additionally, the findings of this study indicate that categorizing touch behaviors into discrete touch gesture categories based on dictionary definitions is not a suitable approach for capturing the complex nature of touch behaviors in less controlled settings. These findings can inform the design of a behavioral model for robot pet companions; future directions for interpreting touch behaviors in less controlled settings are discussed.
    Original language: English
    Pages (from-to): 3
    Number of pages: 11
    Journal: Frontiers in ICT
    Volume: 4
    DOI: 10.3389/fict.2017.00003
    Publication status: Published - 17 Mar 2017

    Keywords

    • social touch
    • multimodal interaction
    • affective context
    • behavior analysis
    • human–robot interaction
    • HMI-MI: MULTIMODAL INTERACTIONS
    • HMI-IA: Intelligent Agents
    • EWI-27850
    • touch recognition
    • robot pet companion

    Cite this


    Jung, M. M., Poel, M., Reidsma, D., & Heylen, D. K. J. (2017). A First Step toward the Automatic Understanding of Social Touch for Naturalistic Human–Robot Interaction. Frontiers in ICT, 4, 3. https://doi.org/10.3389/fict.2017.00003
