This paper describes an attempt to reveal the user’s intention from dialogue acts, thereby improving the effectiveness of natural interfaces to pedagogical agents. It focuses on cases where the intention is unclear from the dialogue context or utterance structure, but where the intention may still be identified using the emotional state of the user. The recognition of emotions is based on physiological user input. Our initial user study gave promising results that support our hypothesis that physiological evidence of emotions could be used to disambiguate dialogue acts. This paper presents our approach to the integration of natural language and emotions as well as our first empirical results, which may be used to endow interactive agents with emotional capabilities.
Number of pages: 8
Publication status: Published - 13 Jan 2004
Event: 9th International Conference on Intelligent User Interfaces, IUI 2004 (conference number 9), Funchal, Portugal, 13 Jan 2004 - 16 Jan 2004