Multimodal plan representation for adaptable BML scheduling

Herwin van Welbergen*, Dennis Reidsma, Job Zwiers

*Corresponding author for this work

    Research output: Contribution to journal › Article › Academic › peer-review

    2 Citations (Scopus)

    Abstract

    Natural human interaction is characterized by interpersonal coordination: interlocutors converge in their speech rates, smoothly switch speaking turns with virtually no delay, provide their interlocutors with verbal and nonverbal backchannel feedback, wait for and react to such feedback, execute physical tasks in tight synchrony, etc. If virtual humans are to achieve such interpersonal coordination, they require very flexible behavior plans that are adjustable on the fly. In this paper we discuss how such plans are represented, maintained and constructed in our BML realizer Elckerlyc. We argue that behavior scheduling for Virtual Humans can be viewed as a constraint satisfaction problem, and show how Elckerlyc uses this view in its flexible behavior plan representation that allows one to make on-the-fly adjustments to behaviors while keeping the specified constraints between them intact.
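    The constraint-satisfaction view described in the abstract can be illustrated with a small sketch. The code below is not Elckerlyc's actual API; the class and function names are hypothetical. It models BML-style behaviors as bundles of sync points, ties sync points across modalities with "at" constraints, and shows that re-solving after an on-the-fly timing adjustment keeps the cross-modal constraints intact.

```python
# Hypothetical sketch of scheduling as constraint satisfaction (not Elckerlyc code).
# Behaviors expose sync points; an "at" constraint (b1, s1, b2, s2) demands that
# sync point s1 of b1 and s2 of b2 occur at the same absolute time.

class Behavior:
    def __init__(self, name, sync_offsets):
        # sync_offsets: offset in seconds of each sync point from behavior start
        self.name = name
        self.sync_offsets = sync_offsets
        self.start = 0.0  # absolute start time, assigned by the solver

    def time_of(self, sync):
        return self.start + self.sync_offsets[sync]

def solve(behaviors, constraints, anchor):
    """Satisfy all 'at' constraints, pinning the plan to the timeline via
    anchor = (behavior_name, sync_point, absolute_time)."""
    bs = {b.name: b for b in behaviors}
    name, sync, t = anchor
    bs[name].start = t - bs[name].sync_offsets[sync]
    # Simple fixed-point propagation; sufficient for tree-shaped constraint graphs.
    for _ in range(len(constraints)):
        for n1, s1, n2, s2 in constraints:
            bs[n2].start = bs[n1].time_of(s1) - bs[n2].sync_offsets[s2]
    return {b.name: b.start for b in behaviors}

speech = Behavior("speech", {"start": 0.0, "sync1": 1.2, "end": 2.5})
gesture = Behavior("gesture", {"start": 0.0, "stroke": 0.8, "end": 1.5})
constraints = [("speech", "sync1", "gesture", "stroke")]

# Initial schedule: the gesture stroke aligns with the speech sync point.
plan = solve([speech, gesture], constraints, ("speech", "start", 0.0))
# On-the-fly adjustment (e.g. converging to an interlocutor's pace): delay the
# speech by 0.5 s and re-solve; the cross-modal constraint stays satisfied.
plan = solve([speech, gesture], constraints, ("speech", "start", 0.5))
```

    Because times are recomputed from the constraint set rather than stored as fixed values, adjusting one behavior automatically repositions the behaviors constrained to it, which is the flexibility the plan representation is designed for.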
    Original language: English
    Pages (from-to): 305-327
    Number of pages: 23
    Journal: Autonomous agents and multi-agent systems
    Volume: 27
    Issue number: 2
    DOIs
    Publication status: Published - Sep 2013

    Keywords

    • Virtual humans
    • Behavior Markup Language
    • SAIBA
    • Multimodal plan representation
    • Interpersonal coordination

