Multimodal (and more physical) Interaction to Improve In-Car User Experience

Champika Ranasinghe*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Professional


Abstract

Automated vehicles have the potential to provide increased mobility to a broader range of users, such as the elderly, children, people with physical limitations (e.g., visually impaired people or people with broken legs or arms), or people with other kinds of limitations, for example people who are nervous about driving [1][2]. At the same time, driving becomes a secondary task, and drivers (and passengers) can engage in other activities such as work, leisure, or even sleep. This requires interacting with the vehicle at various levels, for example to take control of driving when human intervention is necessary, to make driving-related decisions, or to carry out other tasks the user is engaged in (such as holding an online meeting with colleagues using the car's infrastructure). This often involves two types of interaction: interaction for the primary task (what the user is currently doing, for example playing a game) and interaction for secondary tasks while the user's attention is on the primary task (for example, receiving the status of the traffic ahead while playing a game using the car's infrastructure). To better facilitate these interactions, autonomous vehicles can benefit from multimodal interaction and from the use of different (and often multiple) human sensory modalities. However, little research has been done on how different human sensory modalities can best be used to facilitate in-car interaction, and the users of autonomous vehicles, their use situations, and their interaction requirements remain largely unexplored. Apart from speech- and haptic-based interaction, little is known about other modalities such as gestures, olfaction, and sonification, and how they can serve different types of users and use situations. We aim to fill this gap by exploring how various sensory modalities can be used to enrich in-car interaction across user groups and use situations.
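The distinction between primary-task and secondary-task interaction suggests a modality-selection step: for a secondary-task cue (such as the traffic status ahead), the vehicle picks an output channel that avoids the senses the occupant's primary task already occupies. The following Python sketch illustrates that idea; all names (Modality, Context, pick_secondary_modality) and the mapping rules are hypothetical illustrations, not a system described in the chapter.

from dataclasses import dataclass
from enum import Enum, auto


class Modality(Enum):
    """Output channels an in-car system could use for secondary-task cues."""
    SPEECH = auto()
    SONIFICATION = auto()  # non-speech audio cues
    HAPTIC = auto()        # e.g., seat or steering-wheel vibration
    VISUAL = auto()
    OLFACTORY = auto()


@dataclass
class Context:
    """Hypothetical interaction context for one vehicle occupant."""
    primary_task: str           # e.g., "online_meeting", "gaming", "sleeping"
    visually_impaired: bool = False
    audio_channel_busy: bool = False


def pick_secondary_modality(ctx: Context) -> Modality:
    """Choose a channel for a secondary-task notification that avoids
    the senses the primary task occupies (illustrative heuristics only)."""
    if ctx.primary_task == "sleeping":
        # A gentle haptic cue reaches a sleeping occupant; a visual one does not.
        return Modality.HAPTIC
    if ctx.visually_impaired:
        # Avoid visual output entirely; fall back to haptics if audio is taken.
        return Modality.SPEECH if not ctx.audio_channel_busy else Modality.HAPTIC
    if ctx.audio_channel_busy:
        # Occupant is in a meeting or game using the car's audio:
        # keep the cue off the audio channel.
        return Modality.HAPTIC
    # Default: a brief non-speech audio cue is less disruptive than speech.
    return Modality.SONIFICATION


if __name__ == "__main__":
    ctx = Context(primary_task="gaming", audio_channel_busy=True)
    print(pick_secondary_modality(ctx))  # Modality.HAPTIC

In practice, the chapter's argument implies that such mappings should be grounded in studies of concrete user groups and use situations rather than fixed heuristics like these.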
Original language: English
Title of host publication: Human-Computer Interaction to Support Work and Wellbeing in Mobile Environments
Place of publication: Dagstuhl, Germany
Publisher: Dagstuhl
Pages: 33-34
DOIs
Publication status: Published - 2021
Event: Dagstuhl Seminar on Human-Computer Interaction to Support Work and Wellbeing in Mobile Environments 2021, Schloss Dagstuhl - Leibniz-Zentrum für Informatik, Wadern, Germany
Duration: 6 Jun 2021 - 11 Jun 2021

Publication series

Name: Dagstuhl Reports
Publisher: Schloss Dagstuhl - Leibniz-Zentrum für Informatik
Number: 5
Volume: 11
ISSN (Print): 2192-5283

