Classifying social actions with a single accelerometer

Hayley Hung, Gwenn Englebienne, Jeroen Kools

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

43 Citations (Scopus)

Abstract

In this paper, we estimate different types of social actions from a single body-worn accelerometer in a crowded social setting. Accelerometers have many advantages in such settings: they are impervious to environmental noise, unobtrusive, cheap, low-powered, and their readings are specific to a single person. Our experiments show that they are surprisingly informative of different types of social actions. The social actions we address in this paper are whether a person is speaking, laughing, gesturing, drinking, or stepping. To our knowledge, this is the first work to carry out experiments on estimating social actions from conversational behavior using only a wearable accelerometer. The ability to estimate such actions from acceleration alone opens up the potential for analyzing social aspects of people's interactions without explicitly recording what they are saying.
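The abstract does not describe the classification pipeline itself, so the following is only a minimal sketch of one plausible approach: windowing a single tri-axial accelerometer signal, computing simple per-axis statistics, and training an independent binary classifier per social action (e.g. speaking). The feature set, window length, and model are assumptions for illustration, not taken from the paper.

```python
# Hypothetical sketch: per-window features from one tri-axial accelerometer,
# with a binary classifier per social action. Assumed pipeline, not the paper's.
import numpy as np
from sklearn.linear_model import LogisticRegression

def window_features(acc, win=60, hop=30):
    """Split a (T, 3) accelerometer signal into overlapping windows and
    compute simple per-axis statistics (mean, std, energy) for each window."""
    feats = []
    for start in range(0, len(acc) - win + 1, hop):
        w = acc[start:start + win]
        feats.append(np.concatenate([w.mean(0), w.std(0), (w ** 2).mean(0)]))
    return np.array(feats)

# Synthetic stand-in data: ~2 minutes of readings at 20 Hz, with placeholder
# per-window "speaking" labels (real labels would come from annotation).
rng = np.random.default_rng(0)
acc = rng.normal(size=(2400, 3))
X = window_features(acc)                      # (n_windows, 9) feature matrix
y_speaking = rng.integers(0, 2, size=len(X))  # placeholder binary labels

clf = LogisticRegression(max_iter=1000).fit(X, y_speaking)
print(clf.predict(X[:5]))                     # per-window speaking estimates
```

In practice, one such classifier would be trained for each action of interest (speaking, laughing, gesturing, drinking, stepping), since the actions can co-occur within a window.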

Original language: English
Title of host publication: UbiComp 2013
Subtitle of host publication: Proceedings of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing
Editors: Friedemann Mattern
Publisher: ACM Publishing
Pages: 207-210
Number of pages: 4
ISBN (Print): 978-1-4503-1770-2
DOIs
Publication status: Published - 2013
Externally published: Yes
Event: 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp 2013 - Zurich, Switzerland
Duration: 8 Sept 2013 - 12 Sept 2013

Conference

Conference: 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp 2013
Country/Territory: Switzerland
City: Zurich
Period: 8/09/13 - 12/09/13

Keywords

  • Human behavior
  • Social actions
  • Wearable sensors
