Abstract
In this paper we propose the novel task of detecting groups of conversing people using only a single body-worn accelerometer per person. Our approach estimates each individual's social actions and uses the coordination of these social actions between pairs of people to identify group membership. Such an approach is intended for deployment in dense, crowded environments. Our work differs significantly from previous approaches, which have tended to rely on audio and/or proximity sensing, often in much less crowded scenarios, to estimate whether people are talking together or who is speaking. Ultimately, we are interested in detecting who is speaking, who is conversing with whom, and, from that, inferring socially relevant information about the interaction, such as whether people are enjoying themselves or the quality of their relationship, in these extremely dense crowded scenarios. Striving towards this long-term goal, this paper presents a systematic study of how to detect groups of people who are conversing together in this setting, where we achieve a 64% classification accuracy using a fully automated system.
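The abstract outlines a pipeline: estimate each person's social actions from a wearable accelerometer, score how coordinated those actions are between pairs of people, and link coordinated pairs into conversing groups. The following is a minimal, hypothetical sketch of that pairwise-coordination idea, not the system described in the paper: the binary social-action streams, the correlation-based coordination score, the 0.3 threshold, and the function names are all illustrative assumptions.

```python
# Hypothetical sketch (not the authors' implementation): given per-person binary
# "social action" streams already estimated from a body-worn accelerometer,
# score pairwise coordination and group people whose coordination is high.
import itertools
import numpy as np
import networkx as nx


def pairwise_coordination(a: np.ndarray, b: np.ndarray) -> float:
    """Correlation between two equal-length binary social-action streams."""
    if a.std() == 0 or b.std() == 0:
        return 0.0  # a constant stream carries no coordination information
    return float(np.corrcoef(a, b)[0, 1])


def detect_groups(actions: dict[str, np.ndarray], threshold: float = 0.3) -> list[set[str]]:
    """Link pairs whose coordination exceeds the threshold; connected components form groups."""
    g = nx.Graph()
    g.add_nodes_from(actions)
    for p, q in itertools.combinations(actions, 2):
        if pairwise_coordination(actions[p], actions[q]) >= threshold:
            g.add_edge(p, q)
    return [set(c) for c in nx.connected_components(g)]


# Synthetic example: persons A and B act in a coordinated way, C does not.
rng = np.random.default_rng(0)
a = rng.integers(0, 2, 200)
streams = {
    "A": a,
    "B": (a ^ (rng.random(200) < 0.1)).astype(int),  # A's stream with 10% flips
    "C": rng.integers(0, 2, 200),
}
print(detect_groups(streams))  # expected: A and B grouped together, C alone
```

Turning pairwise decisions into groups via thresholding and connected components is only one simple reading of "coordination between pairs"; the paper's reported 64% accuracy refers to its own fully automated classification, not to this sketch.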
Original language | English |
---|---|
Title of host publication | ICMI '14: Proceedings of the 16th International Conference on Multimodal Interaction |
Pages | 84–91 |
DOIs | |
Publication status | Published - 2014 |
Externally published | Yes |
Event | 16th International Conference on Multimodal Interaction, ICMI 2014 - Istanbul, Turkey. Duration: 12 Nov 2014 → 16 Nov 2014. Conference number: 16 |
Conference
Conference | 16th International Conference on Multimodal Interaction, ICMI 2014 |
---|---|
Abbreviated title | ICMI |
Country/Territory | Turkey |
City | Istanbul |
Period | 12/11/14 → 16/11/14 |
Other | 12-16 November 2014 |
Keywords
- n/a