Abstract
People with Autism Spectrum Condition (ASC) have difficulty navigating social situations. In therapy, robots typically teach people with ASC desirable social interaction according to traditional models that focus on cognition rather than emotions or intuitions. Participatory sense-making could provide new theoretical insights in this area. To establish participatory sense-making, joint attention needs to be reached. We analyzed footage of therapy sessions in Serbia in which a robot expressed emotions and a child with ASC had to guess the emotion. We used conversation analysis from the perspective of participatory sense-making, with a focus on body language. Not speaking the language allowed us to focus on the body language without distraction. Three types of situations occurred during the analysis: participatory sense-making, missed opportunity, and non-compliance. The results showed that participatory sense-making was established more successfully when more elements of coordination were present. We argue that a robot could support a therapist in establishing participatory sense-making.
Original language | English |
---|---|
Publication status | Published - 2017 |
Event | Student Interaction Design Research Conference, SIDeR 2017 - Delft University of Technology, Delft, Netherlands; 1 Apr 2017 → 2 Apr 2017; Conference number: 2017; http://id.tudelft.nl/sider/2017/ |
Conference
Conference | Student Interaction Design Research Conference, SIDeR 2017 |
---|---|
Abbreviated title | SIDeR |
Country/Territory | Netherlands |
City | Delft |
Period | 1/04/17 → 2/04/17 |
Internet address | http://id.tudelft.nl/sider/2017/ |
Keywords
- Human robot interaction
- Autism
- Conversation analysis
- Robot design
- Interaction design