Abstract
The vision of Ambient Intelligence (AmI) presumes a plethora of embedded services
and devices that all endeavor to support humans in their daily activities
as unobtrusively as possible. Hardware gets distributed throughout the environment,
occupying even the fabric of our clothing. The environment is equipped
with a diversity of sensors, the information of which can be accessed from all
over the AmI network. Individual services are distributed over hardware, share
sensors with other services and are generally detached from the traditional single
access-point computer (see also the paper by Pantic et al. in this volume [51]).
‘Unobtrusive support’ means that where possible the user should be freed
from the necessity of entering into an explicit dialog with all these services and
devices. The environment shifts towards the use of implicit interaction, that is,
“interactions that may occur without the behest or awareness of the user” [36].
However, not all interactions between user and environment will be implicit.
It may not be possible, or it may not be desirable, e.g. because the user does
not want to feel a loss of control over certain aspects of his environment. So how
does the user achieve the necessary explicit interaction? Will (s)he address every
query for information to the specific device or service that ultimately provides
the information? Will (s)he give commands to the heating system, the blinds
and the room lighting separately? Will each service and each device carry its
own interaction interface? Clearly not. Interfaces will come to be developed that
abstract from individual services and devices and offer the user access to certain
combined functionalities of the system. The interfaces should support the mixture
of explicit and implicit, and reactive and proactive, interaction required for
a successful AmI environment. Finally, AmI environments are inherently multi-user, so the interface needs to be able to understand and engage in addressed multi-party interaction [48]. We argue that Virtual Humans (VHs) are eminently
suited to fulfill the role of such interfaces.
An AmI environment can serve various purposes. It can be a home environment,
an office environment, a public space or it can be used in an educational
setting. Virtual humans can be available as, among other roles, friend, exercise adviser, health care specialist, butler, conversation partner or tutor. Sometimes they know
things better than you do, sometimes they have more control over
parts of the AmI environment than you have and sometimes they persuade you
to do things differently. You may not always like all aspects of the virtual humans
that cohabit your house. Maybe the virtual tutor that is available to monitor
your children’s homework sometimes takes decisions that are not liked by your
children at all. Your virtual friend is not very interesting if it always agrees
with your opinions. A health care agent has to be strict. A virtual human that
acts as a conversational partner for your grandmother may have some peculiar
behavior sometimes (like a dog or cat has; remember the Tamagotchi). As in collaborative virtual environments, AmI environments allow remote participation in activities. Virtual humans can then represent family members (with all their characteristics, including weaknesses) who are abroad but nevertheless take part in family activities. Transformation of the communicative behavior of
virtual humans that represent real humans can be useful too [4]. Summarizing,
in AmI environments we foresee that virtual humans can play human-like
roles and need human-like properties, including (semi-) autonomous behavior,
personalities, individual characteristics and peculiarities.
However, the vast majority of existing, implemented applications of Virtual
Humans are focused on one clear task, such as selling tickets, learning a
skill, answering questions or booking flights and hotels, and are accessed in a
clearly explicit manner. There, one can take it as a given that the attention of
the user is on the system and the user is primarily engaged with the interaction.
In an AmI environment this is no longer true. A dialog with a Virtual Human
may be altogether secondary to several other activities of the user. A dialog with
a Virtual Human may also be about many different issues, pertaining to different
aspects of the environment, in parallel. This has a considerable impact on the design of a Virtual Human.
In the rest of this paper we will examine a (not necessarily exhaustive) number
of aspects of Virtual Humans that we feel are most relevant to their introduction into a complex AmI environment. Some of these aspects relate to the
embedding of the Human / Virtual Human interaction in ongoing daily activities:
issues of synchronization, turn taking and control. Other points touch upon
the fictional/real polemic: how realistic should VHs be? Should a VH interface
exhibit ‘socially inspired’ behaviour? Should a VH also exhibit the imperfections
and shortcomings so characteristic of human communication? Some of the points
will be illustrated with examples from our recent work on Virtual Humans, summarized
in Section 4.
Original language | English |
---|---|
Title of host publication | Artifical Intelligence for Human Computing |
Editors | T Huang, Antinus Nijholt, Maja Pantic, A. Pentland |
Place of Publication | Berlin |
Publisher | Springer |
Pages | 316-338 |
Number of pages | 23 |
ISBN (Print) | 978-3-540-72346-2 |
DOIs | |
Publication status | Published - Jun 2007 |
Event | 8th International Conference on Multimodal Interfaces, ICMI 2006 - Banff, Canada |
Duration | 2 Nov 2006 → 4 Nov 2006 |
Conference number | 8 |
Publication series
Name | Lecture Notes in Artificial Intelligence |
---|---|
Publisher | Springer Verlag |
Number | 4451 |
ISSN (Print) | 0302-9743 |
ISSN (Electronic) | 1611-3349 |
Conference
Conference | 8th International Conference on Multimodal Interfaces, ICMI 2006 |
---|---|
Abbreviated title | ICMI |
Country/Territory | Canada |
City | Banff |
Period | 2/11/06 → 4/11/06 |
Keywords
- IR-66952
- EWI-9284
- EC Grant Agreement nr.: FP6/033812
- METIS-242042