In this paper we investigate the introduction of haptics into a multimodal tutoring environment. In this environment a haptic device is used to control a virtual injection needle, while speech input and output allow interaction with a virtual tutor, presented as a talking head, and with a virtual patient. We survey the agent-based architecture of the system and discuss the different interaction modalities. One of the agents, the virtual tutor, monitors the student’s actions, provides feedback, and is able to demonstrate the procedure. The tutor incorporates a simple emotion model that it maintains and updates by considering the student’s actions and progress. This model allows the tutor to show affective behavior toward the student.
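The abstract does not detail how the tutor's emotion model works; a minimal sketch of one plausible scheme is shown below, assuming a single scalar satisfaction state that decays toward neutral and is nudged by observed student events. All names and numeric values here (`EmotionModel`, the event impacts, the decay factor) are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a simple tutor emotion model: a scalar state
# updated from observed student events, then mapped to an affective
# display for the talking head. Values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class EmotionModel:
    satisfaction: float = 0.5   # 0 = frustrated, 1 = pleased
    decay: float = 0.9          # state relaxes toward neutral over time

    def observe(self, event: str) -> None:
        """Update the emotional state from a student action."""
        impact = {
            "correct_step": +0.2,     # student progresses as expected
            "wrong_angle": -0.15,     # needle inserted at a bad angle
            "asked_for_help": -0.05,  # minor sign of struggle
        }.get(event, 0.0)
        # Relax toward neutral, apply the event's impact, clamp to [0, 1].
        self.satisfaction = 0.5 + self.decay * (self.satisfaction - 0.5)
        self.satisfaction = min(1.0, max(0.0, self.satisfaction + impact))

    def affect_label(self) -> str:
        """Map the numeric state to an affective expression."""
        if self.satisfaction > 0.7:
            return "pleased"
        if self.satisfaction < 0.3:
            return "concerned"
        return "neutral"

tutor = EmotionModel()
for event in ["wrong_angle", "wrong_angle", "correct_step"]:
    tutor.observe(event)
print(tutor.affect_label())
```

A decaying state keeps the tutor's displayed affect responsive to recent events rather than permanently anchored to early mistakes.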
|Conference|International Conference on Computing, Communications and Control Technologies (CCCT)|
|Period|August 14–17, 2004|