Detecting Perceived Appropriateness of a Robot's Social Positioning Behavior from Non-Verbal Cues: a Dataset

Description

What if a robot could detect when you think it got too close to you during its approach? This would allow it to correct or compensate for its social ‘mistake’. It would also allow for a responsive approach, where the robot reactively finds suitable approach behavior during the interaction.
We investigated whether it is possible to automatically detect such social feedback cues in the context of a robot approaching a person.
We collected a dataset in which our robot repeatedly approached people (n=30) to verbally deliver a message. Approach distance and environmental noise were manipulated, and the participants were tracked (position and orientation of upper body and head). We evaluated their perception of the robot's behavior through questionnaires and found no main or interaction effects of the manipulations, showing that, in this case, personal differences are more important than contextual cues, and thus highlighting the importance of responding to behavioral feedback.

Keywords: Giraff, Head position/orientation, Non-verbal cues, OptiTrack, Perception of a robot's behavior, Social feedback cues, Social robotics, Tracking, Upper body position/orientation
Date made available: 13 Dec 2019
Publisher: 4TU.Centre for Research Data
Date of data production: 2016

Cite this

Vroon, J. H. (Creator), Englebienne, G. (Creator), Evers, V. (Creator) (13 Dec 2019). Detecting Perceived Appropriateness of a Robot's Social Positioning Behavior from Non-Verbal Cues: a Dataset. 4TU.Centre for Research Data. https://doi.org/10.4121/uuid:b76c3a6f-f7d5-418e-874a-d6140853e1fa