Abstract
The aim of the research presented in this dissertation was to explore the opportunities that novel technologies, such as wearable consumer electronics, offer to support persons with a visual impairment (PVIs) in their daily lives, following the principles of human-centered design. We identified problems PVIs encounter in daily life and explored their wishes and needs to inform the development and innovation agenda of wearable technologies. We decided to focus on the recognition of nonverbal communication and on navigation in unknown environments.
We developed and evaluated a wearable system consisting of a head-mounted camera, a tablet running emotion recognition software, and a haptic belt with small vibration motors to convey information to its users. Haptic feedback was chosen over other means of feedback to keep the eyes and ears of users free, as PVIs are highly dependent on their hearing, and in some cases their remaining vision, to perceive their surroundings.
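To illustrate the kind of pipeline sketched above, the example below maps a recognized facial expression to a vibration pattern on a belt of motors. The emotion labels, motor layout, pulse timings, and function names are illustrative assumptions, not the mapping used in the dissertation.

```python
# Hypothetical sketch: map a recognized facial expression to a vibration
# pattern on a belt of vibration motors. Labels, motor indices, and the
# driver interface are illustrative assumptions, not the actual system.

# Assumed belt layout: 8 motors evenly spaced around the waist, index 0 at the front.
NUM_MOTORS = 8

# Assumed mapping from emotion label to (motor index, pulse duration in ms, repetitions).
EMOTION_PATTERNS = {
    "happy":    (0, 200, 2),   # short double pulse at the front
    "sad":      (4, 600, 1),   # single long pulse at the back
    "surprise": (2, 100, 3),   # rapid triple pulse on the right side
    "neutral":  (0, 100, 1),   # single short pulse at the front
}

def emotion_to_pattern(label: str):
    """Return the vibration pattern for a recognized emotion, or None if unknown."""
    return EMOTION_PATTERNS.get(label)

def drive_belt(pattern, send_pulse):
    """Play a pattern by calling send_pulse(motor_index, duration_ms) once per repetition."""
    if pattern is None:
        return
    motor, duration_ms, repetitions = pattern
    for _ in range(repetitions):
        send_pulse(motor, duration_ms)

# Example usage with a stand-in for the real motor driver:
if __name__ == "__main__":
    drive_belt(emotion_to_pattern("happy"),
               lambda m, d: print(f"motor {m}: vibrate {d} ms"))
```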
Initially we found that, under ideal conditions, vibrotactile cues can be used to convey facial expressions of emotions to PVIs in real time. After making adjustments based on feedback, the research continued under more realistic conditions. Additionally, we explored whether a wearable GPS-based navigation system using the same vibrotactile belt could help PVIs find their way in unfamiliar outdoor environments. Despite positive comments from the participants, we concluded that the system required improvements in performance and wearability before it would be ready for use in interactions and wayfinding.
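In the same spirit, a minimal sketch of how a GPS-based wayfinding aid might translate the bearing to the next waypoint into a directional cue on such a belt is given below; the 8-motor layout, the bearing helper, and the example coordinates are assumptions for illustration only.

```python
# Hypothetical sketch: convert the bearing toward the next GPS waypoint into
# the index of the belt motor closest to that direction. The 8-motor layout
# and the standard initial-bearing formula are illustrative assumptions.
import math

NUM_MOTORS = 8  # assumed: evenly spaced, motor 0 pointing straight ahead

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

def motor_for_waypoint(user_lat, user_lon, user_heading_deg, wp_lat, wp_lon):
    """Pick the motor whose position around the waist best matches the relative bearing."""
    relative = (bearing_deg(user_lat, user_lon, wp_lat, wp_lon) - user_heading_deg) % 360.0
    return round(relative / (360.0 / NUM_MOTORS)) % NUM_MOTORS

# Example: a user heading north with a waypoint roughly to the east should feel
# a cue on the right-hand side of the belt (motor 2 in this assumed layout).
print(motor_for_waypoint(52.2397, 6.8500, 0.0, 52.2397, 6.8600))
```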
Our studies showed that participants can easily learn, interpret, and use the vibrotactile signals conveyed by the system and that they are enthusiastic about the concept. Although various technical and usability challenges were identified that need to be addressed in future development, the research has led to valuable insights into the possibilities that tactile feedback offers to support PVIs in their daily lives.
| Original language | English |
| --- | --- |
| Qualification | Doctor of Philosophy |
| Awarding Institution | |
| Supervisors/Advisors | |
| Award date | 12 Feb 2021 |
| Place of Publication | Enschede |
| Publisher | |
| Print ISBNs | 978-90-365-5121-2 |
| DOIs | |
| Publication status | Published - 12 Feb 2021 |