Humans tend to attribute human qualities to computers. People are expected to perform cognitive tasks with computers more enjoyably and effectively when they can use their natural communication skills. For these reasons, human-like embodied conversational agents (ECAs) as components of user interfaces have received much attention. It has been shown that the style of an agent's look and behaviour strongly influences the user's attitude. In this paper we discuss our GESTYLE language, which makes it possible to endow ECAs with style. Style is defined in terms of when and how the ECA uses certain gestures, and how it modulates its speech (e.g. to indicate emphasis or sadness). GESTYLE also provides tags to annotate the text an ECA is to utter, prescribing the hand, head and facial gestures that accompany the speech in order to augment the communication. The annotation ranges from direct, low-level instructions (e.g. perform a specific gesture) to indirect, high-level ones (e.g. take a turn in a conversation), which are interpreted with respect to the defined style. By using style dictionaries and defining aspects of an ECA such as age and culture, its behaviour can be tuned to best suit a given user or target group.
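The resolution of high-level annotations through style dictionaries can be sketched as follows. This is a minimal, hypothetical illustration only: the dictionary contents, aspect names, gesture names, and precedence rule are all assumptions for the sake of the example, not the actual GESTYLE syntax or data.

```python
# Hypothetical sketch of style-dictionary resolution (all names are
# illustrative; GESTYLE's real syntax and dictionaries differ).

# A style dictionary maps a high-level communicative act to a concrete
# gesture, with entries that may differ per aspect value (e.g. culture).
STYLE_DICTIONARIES = {
    "culture": {
        "italian": {"emphasis": "beat_both_hands", "take_turn": "raise_hand"},
        "japanese": {"emphasis": "head_nod", "take_turn": "slight_bow"},
    },
    "age": {
        "elderly": {"emphasis": "head_nod"},
    },
}

def resolve(instruction, aspects):
    """Map a high-level instruction to a low-level gesture.

    Aspects are tried in the order given; the first dictionary entry
    found wins (a simple precedence rule, assumed for this sketch).
    """
    for aspect, value in aspects:
        gestures = STYLE_DICTIONARIES.get(aspect, {}).get(value, {})
        if instruction in gestures:
            return gestures[instruction]
    # Fall back: treat the instruction as an explicit low-level gesture.
    return instruction

# An ECA styled as an Italian speaker emphasises with a two-handed beat.
print(resolve("emphasis", [("culture", "italian")]))
# An age-specific entry takes precedence when that aspect is listed first.
print(resolve("emphasis", [("age", "elderly"), ("culture", "italian")]))
```

The same high-level annotation ("emphasis") thus yields different concrete behaviour depending on the aspects defined for the agent, which is the mechanism the abstract describes for tuning an ECA to a target group.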
Number of pages: 19
Journal: International Journal of Human-Computer Studies
Publication status: Published - 2005
- Multimodal communication
- Embodied Conversational Agent
- Markup language