In the Sensitive Artificial Listener project, research aims to design an embodied agent that not only generates the appropriate nonverbal behaviors accompanying its own speech, but also displays verbal and nonverbal behaviors while its conversational partner is speaking. Beyond the many embodied-agent applications in which natural interaction between agent and human partner requires this behavior, the project's results are also intended to support research on emotional behavior during conversations. In this paper, our research and implementation efforts in this project are discussed and illustrated with examples of experiments, research approaches, and interfaces under development.
Title of host publication: Verbal and Nonverbal Communication Behaviours
Editors: A. Esposito, M. Faundez-Zanuy, E. Keller, M. Marinaro
Place of publication: Berlin
Number of pages: 11
Publication status: Published - 6 Oct 2007
Series: Lecture Notes in Computer Science