The multimodal EchoBorg: Not as smart as it looks

Sara Falcone*, Jan Kolkmeier, Merijn Bruijnes, Dirk Heylen

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review
Abstract

In this paper we present a Multimodal Echoborg interface to explore the effect of different embodiments of an Embodied Conversational Agent (ECA) in an interaction. We compared an interaction where the ECA was embodied as a virtual human (VH) with one where it was embodied as an Echoborg, i.e., a person whose actions are covertly controlled by a dialogue system. The Echoborg in our study shadowed not only the speech output of the dialogue system but also its non-verbal actions. The interactions were structured as a debate between three participants on an ethical dilemma. First, we collected a corpus of debate sessions with three human debaters, which we used as a baseline to design and implement our ECAs. For the experiment, we designed two debate conditions. In one, the participant interacted with two ECAs, both embodied as virtual humans. In the other, the participant interacted with one ECA embodied as a VH and the other as an Echoborg. Our results show that a human embodiment of the ECA scores better overall on perceived social attributes. In most other respects, the Echoborg scores as poorly as the VH, with the exception of copresence.

Original language: English
Pages (from-to): 293-302
Number of pages: 10
Journal: Journal on Multimodal User Interfaces
Volume: 16
Issue number: 3
Early online date: 5 May 2022
Publication status: Published - Sept 2022

Keywords

  • Believability
  • EchoBorg
  • Embodiment
  • HCI
  • Multimodality
  • UT-Hybrid-D
