Abstract
An experiment was conducted to investigate whether human observers use knowledge of the differences in focus of attention in multiparty interaction to identify the speaker among the meeting participants. A virtual environment was used to ensure good stimulus control. Head orientations, derived from a corpus of tracked head movements, were displayed as the only cue for focus of attention. We present some properties of the relation between head orientation and speaker–listener status as found in the corpus. The experiment indicates that people do use knowledge of patterns in focus of attention to distinguish the speaker from the listeners. However, human speaker-identification accuracy was rather low: head orientation (or focus of attention) alone does not provide a sufficient cue for reliable identification of the speaker in a multiparty setting.
| Original language | Undefined |
| --- | --- |
| Article number | 10.1145/1658349.1658351 |
| Pages (from-to) | 2:1-2:13 |
| Number of pages | 13 |
| Journal | ACM Transactions on Applied Perception |
| Volume | 7 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - Jan 2010 |
Keywords
- EWI-17246
- HMI-IA: Intelligent Agents
- HMI-VRG: Virtual Reality and Graphics
- multiparty conversation
- gaze behavior
- IR-69695
- perception of gaze
- virtual environments
- EC Grant Agreement nr.: FP6/033812
- Head orientation
- METIS-277395
- focus of attention