Analytics of movement through checkpoints

Yaguang Tao* (Corresponding Author), Alan Both, Matt Duckham


Research output: Contribution to journal › Article › Academic › peer-review

6 Citations (Scopus)


This article concerns the opportunities for analysis of data about movement past spatial checkpoints. Checkpoint data are generated by an object’s movement past fixed sensors (‘checkpoints’) distributed throughout geographic space. Example sources of checkpoint data include road-toll gantries, social media check-ins, WiFi hotspots, access swipe cards, and public transport smart cards. Many existing movement analytics techniques, which frequently rely on precise coordinate locations, are ill-suited to the inherent and variable spatial granularity of checkpoint data. However, this spatial granularity also brings advantages in linking movement more closely to its environmental context, in particular the characteristics of the regions through which an object is moving. In this article, we propose a general model for representing checkpoint data of moving objects, based on two fundamentally different types of sensors: transaction and presence. Experiments using real movement data illustrate the diversity of queries that can be efficiently supported by our model. An example movement classification task, identifying vehicle type from checkpoint data recording vehicle movement, provides an illustration of the opportunities for more closely linking movement with its environmental context through the use of checkpoint data analysis.
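The abstract distinguishes two fundamentally different checkpoint sensor types: transaction sensors, which record an instantaneous passing event, and presence sensors, which record an interval during which an object is within range. As a rough illustration of this distinction (the names and structures below are hypothetical, not the paper's notation), the two record types and a simple query over them might be sketched as:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record types; field names are illustrative only.
@dataclass
class Transaction:
    """Instantaneous reading: the object passed the checkpoint at one moment
    (e.g. a road-toll gantry or a swipe-card reader)."""
    obj: str
    checkpoint: str
    t: datetime

@dataclass
class Presence:
    """Interval reading: the object was within the sensor's range for a period
    (e.g. a WiFi hotspot detecting a device)."""
    obj: str
    checkpoint: str
    t_start: datetime
    t_end: datetime

def checkpoint_sequence(records):
    """Order one object's observations by time (interval start for presence
    records), yielding the sequence of checkpoints it moved past."""
    def key(r):
        return r.t if isinstance(r, Transaction) else r.t_start
    return [r.checkpoint for r in sorted(records, key=key)]

obs = [
    Presence("car1", "wifi_A", datetime(2018, 7, 3, 9, 0), datetime(2018, 7, 3, 9, 5)),
    Transaction("car1", "toll_B", datetime(2018, 7, 3, 9, 12)),
    Transaction("car1", "toll_C", datetime(2018, 7, 3, 9, 30)),
]
print(checkpoint_sequence(obs))  # ['wifi_A', 'toll_B', 'toll_C']
```

Note that the recovered trajectory is a sequence of regions rather than precise coordinates — the spatial granularity the abstract identifies as both the challenge and the opportunity of checkpoint data.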

Original language: English
Pages (from-to): 1282-1303
Number of pages: 22
Journal: International Journal of Geographical Information Science
Issue number: 7
Publication status: Published - 3 Jul 2018
Externally published: Yes


  • context awareness
  • data modeling
  • imprecision
  • movement analytics
  • movement classification
  • spatial checkpoint


