Abstract
We present a novel non-parametric Bayesian model that jointly discovers the dynamics of low-level actions and high-level behaviors of tracked objects. In our approach, actions capture both the linear, low-level object dynamics and an additional spatial distribution over where those dynamics occur. Behavior classes, in turn, capture high-level temporal motion dependencies as Markov chains of actions, so each learned behavior is a switching linear dynamical system. The number of actions and behaviors is discovered from the data itself using Dirichlet Processes. We are especially interested in cases where tracks exhibit large kinematic and spatial variations, e.g. person tracks in open environments, as found in the visual surveillance and intelligent vehicle domains. The model handles real-valued features directly, so no information is lost by quantizing measurements into 'visual words', and variations in standing, walking and running can be discovered without discrete thresholds. We describe inference using Markov Chain Monte Carlo sampling and validate our approach on several artificial and real-world pedestrian track datasets from the surveillance and intelligent vehicle domains. We show that our model can distinguish relevant behavior patterns that an existing state-of-the-art hierarchical clustering model and simpler model variants cannot. The software and the artificial and surveillance datasets are made publicly available for benchmarking purposes.
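The abstract's key modeling choice is that the number of actions and behaviors is not fixed in advance but discovered via Dirichlet Processes. A minimal sketch of the clustering prior a Dirichlet Process induces, the Chinese Restaurant Process, illustrates this idea; this is a generic illustration, not the paper's actual sampler, and the function name and parameters are hypothetical.

```python
import random

def crp_partition(n_items, alpha, rng=None):
    """Sample a partition of n_items from a Chinese Restaurant Process.

    The CRP is the clustering prior induced by a Dirichlet Process:
    item i joins an existing cluster with probability proportional to
    that cluster's current size, or opens a new cluster with probability
    proportional to the concentration parameter alpha. The number of
    clusters is therefore determined by the data, not fixed in advance.
    """
    rng = rng or random.Random()
    assignments = []    # cluster index assigned to each item
    cluster_sizes = []  # number of items currently in each cluster
    for _ in range(n_items):
        # Unnormalized weights: existing clusters by size, a new one by alpha.
        weights = cluster_sizes + [alpha]
        k = rng.choices(range(len(weights)), weights=weights)[0]
        if k == len(cluster_sizes):
            cluster_sizes.append(1)  # item opens a new cluster
        else:
            cluster_sizes[k] += 1
        assignments.append(k)
    return assignments, cluster_sizes

assignments, sizes = crp_partition(100, alpha=2.0, rng=random.Random(0))
print(len(sizes), "clusters discovered for 100 items")
```

Larger `alpha` yields more clusters on average; in the paper's setting the analogous mechanism lets the number of actions and behaviors grow as the track data demand.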
Original language | English |
---|---|
Pages (from-to) | 322-334 |
Number of pages | 13 |
Journal | IEEE Transactions on Pattern Analysis and Machine Intelligence |
Volume | 38 |
Issue number | 2 |
Early online date | 9 Jun 2015 |
DOIs | |
Publication status | Published - Feb 2016 |
Externally published | Yes |