Abstract
This study explores non-verbal co-design techniques with multisensory wearables to give the body a voice. Sessions were held with professional caregivers, parents, and clients with profound intellectual and multiple disabilities (PIMD) to identify fundamental building blocks for a common language based on tangible technologies. To provide an agent for communication, we employed the tools of extimacy: translating biodata into visual, auditory, or tactile interactive displays. The caregivers expressed a need for action and reaction ("Actie Reactie") to hold attention, an update on the Multisensory Environment (MSE) rooms previously used for calming. In the co-design sessions, we found that on-the-body wearables held the most focus. The study's final outcome is the outline for a modular, highly personalized Multisensory Wearable (MSW) kit designed to inspire surprise and wonder.
Original language | English |
---|---|
Title of host publication | CHI EA '21 |
Subtitle of host publication | Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems |
Editors | Yoshifumi Kitamura, Aaron Quigley, Katherine Isbister, Takeo Igarashi |
Publisher | ACM SIGCHI |
Pages | 1 |
Number of pages | 6 |
ISBN (Print) | 978-1-4503-8095-9 |
DOIs | |
Publication status | Published - 8 May 2021 |
Event | Conference on Human Factors in Computing Systems, CHI 2021 - Virtual Conference, 8 May 2021 → 13 May 2021 |
Conference
Conference | Conference on Human Factors in Computing Systems, CHI 2021 |
---|---|
Abbreviated title | CHI 2021 |
City | Virtual Conference |
Period | 8/05/21 → 13/05/21 |