In the Sensitive Artificial Listener project, research aims to design an embodied agent that not only generates the appropriate nonverbal behaviors accompanying its own speech, but also displays verbal and nonverbal behaviors while its conversational partner is speaking. Beyond the many embodied-agent applications in which natural interaction between agent and human partner requires this behavior, the project's results are also intended to inform research on emotional behavior during conversations. In this paper, we discuss our research and implementation efforts in this project, illustrated with examples of experiments, research approaches, and interfaces under development.
| Name | Lecture Notes in Computer Science |
| Conference | Verbal and Nonverbal Communication Behaviours, Vietri sul Mare, Italy |
| Period | 6/10/07 → … |
- HMI-MI: Multimodal Interactions