Lifestyle understanding through the analysis of egocentric photo-streams

Research output: Thesis › PhD Thesis - Research external, graduation external


At 8:15, before going to work, Rose puts on her pullover and attaches to it a small portable camera that looks like a hanger. The camera takes two images per minute throughout the day and records almost everything Rose experiences: the people she meets, how long she sits in front of her computer, what she eats, where she goes, etc. These images provide an objective description of Rose's experiences. This thesis addresses the development of automatic computer vision tools for the study of people's behaviour. To this end, we rely on the analysis of the visual data in these sequences of images collected by wearable cameras. Our models have proven to be a powerful tool for extracting information about the behaviour of people in society. Examples of applications: 1) selected images as cues to trigger autobiographical memory of past events, for the prevention of cognitive and functional decline and for memory enhancement in elderly people; 2) self-monitoring devices for people who want to increase their self-knowledge through quantitative analysis, expecting that it will lead to psychological well-being and an improved lifestyle; 3) businesses, which already use such data about their employees and clients to improve productivity, well-being and customer satisfaction. The ultimate goal is to help people like Rose improve their quality of life by creating awareness of their habits and life balance.
Original language: English
Supervisors:
  • Petkov, Nicolai, Supervisor, External person
  • Radeva, Petia, Supervisor, External person
Award date: 14 Feb 2020
Place of Publication: Groningen
Print ISBNs: 978-94-034-2313-5
Publication status: Published - 2020
Externally published: Yes

