Multimodal integration of haptics, speech, and affect in an educational environment

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    Abstract

    In this paper we investigate the introduction of haptics into a multimodal tutoring environment. In this environment a haptic device is used to control a virtual injection needle, and speech input and output are provided for interaction with a virtual tutor, available as a talking head, and a virtual patient. We survey the agent-based architecture of the system and discuss the different interaction modalities. One of the agents, the virtual tutor, monitors the actions of the student, provides feedback, and is able to give demonstrations. A simple emotion model is incorporated, which the tutor maintains and updates based on the student’s actions and progress. The model allows the tutor to show affective behavior towards the student.
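    The abstract only names the tutor's "simple emotion model" without specifying it. As a rough illustration of what such a state-update scheme could look like, here is a minimal sketch in which the tutor tracks valence and arousal and maps them to an affective display; all state variables, event names, and update weights below are assumptions for illustration, not details from the paper:

    ```python
    # Hypothetical sketch of a "simple emotion model" for a virtual tutor.
    # The variables, events, and weights are illustrative assumptions only.

    def clamp(x, lo=-1.0, hi=1.0):
        """Keep an emotion dimension within its valid range."""
        return max(lo, min(hi, x))

    class TutorEmotionModel:
        # Assumed mapping: student event -> (valence delta, arousal delta).
        EVENT_EFFECTS = {
            "correct_action": (0.3, -0.1),
            "wrong_action": (-0.3, 0.2),
            "progress_milestone": (0.5, 0.1),
            "long_inactivity": (-0.1, 0.3),
        }

        def __init__(self):
            self.valence = 0.0   # negative = displeased, positive = pleased
            self.arousal = 0.0   # low = calm, high = agitated

        def update(self, event):
            """Adjust the emotional state after observing a student event."""
            dv, da = self.EVENT_EFFECTS.get(event, (0.0, 0.0))
            self.valence = clamp(self.valence + dv)
            self.arousal = clamp(self.arousal + da)

        def affective_display(self):
            """Map internal state to a coarse expression for the talking head."""
            if self.valence > 0.25:
                return "smile"
            if self.valence < -0.25:
                return "concerned"
            return "neutral"

    model = TutorEmotionModel()
    model.update("wrong_action")
    model.update("wrong_action")
    print(model.affective_display())  # valence has dropped to -0.6 -> "concerned"
    ```

    The clamping keeps repeated events from driving the state out of range, so the tutor's displayed affect saturates rather than escalating without bound.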
    Original language: Undefined
    Title of host publication: Proceedings International Conference on Computing, Communications and Control Technologies (Volume II), Austin, Texas
    Editors: H-W Chu, M. Savoie, B. Sanchez
    Place of Publication: USA
    Publisher: International Institute of Informatics and Systemics (IIIS)
    Pages: 94-97
    Number of pages: 4
    ISBN (Print): 980-6560-17-5
    Publication status: Published - May 2004
    Event: International Conference on Computing, Communications and Control Technologies, CCCT - Austin, Texas
    Duration: 14 Aug 2004 - 17 Aug 2004

    Publication series

    Publisher: IIIS
    Conference

    Conference: International Conference on Computing, Communications and Control Technologies, CCCT
    Period: 14/08/04 - 17/08/04
    Other: August 14-17, 2004

    Keywords

    • IR-63396
    • EWI-6783
    • METIS-221647
