Bridge Damage Detection Approach Using a Roving Camera Technique

Darragh Lydon, Myra Lydon, Rolands Kromanis, Chuan-Zhi Dong, Necati Catbas, Su Taylor

Research output: Contribution to journal › Article › Academic › peer-review

3 Citations (Scopus)
30 Downloads (Pure)

Abstract

Increasingly extreme climate events, intensifying traffic patterns and long-term underinvestment have led to the escalated deterioration of bridges within our road and rail transport networks. Structural Health Monitoring (SHM) systems provide a means of objectively capturing and quantifying deterioration under operational conditions. Computer vision technology has gained considerable attention in the field of SHM due to its ability to obtain displacement data at long distances using non-contact methods. Additionally, it provides a low-cost, rapid instrumentation solution with little interference to the normal operation of structures. However, even for a medium-span bridge, the number of cameras needed to capture the global response can be cost-prohibitive. This research proposes a roving camera technique to capture the complete response of a laboratory model bridge under live loading, in order to identify bridge damage. Displacement is identified as a suitable damage indicator, and two methods are used to assess the magnitude of the change in global displacement under changing boundary conditions in the laboratory bridge model. From this study, it is established that either approach could detect damage in the simulation model, providing an SHM solution that negates the requirement for complex sensor installations.
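The abstract's core idea can be sketched in code: displacements captured at successive camera stations are stitched into one global response vector, and damage is flagged where the response deviates from a healthy-state baseline. This is a minimal illustrative sketch, not the authors' implementation; the function names, station labels, and the 20% threshold are all assumptions for illustration.

```python
# Illustrative sketch of a displacement-based damage indicator for a
# roving-camera survey (all names and values are assumptions, not the
# authors' actual method).
from typing import Dict, List

def stitch_stations(stations: Dict[str, List[float]]) -> List[float]:
    """Concatenate per-station displacement measurements (mm) into one
    global response vector, in station order."""
    return [d for name in sorted(stations) for d in stations[name]]

def damage_flags(baseline: List[float], current: List[float],
                 threshold: float = 0.2) -> List[bool]:
    """Flag measurement points whose relative displacement change from
    the baseline exceeds the threshold (20% is an arbitrary example)."""
    flags = []
    for b, c in zip(baseline, current):
        rel_change = abs(c - b) / abs(b) if b != 0 else float("inf")
        flags.append(rel_change > threshold)
    return flags

# Healthy-state survey vs. a survey after a boundary-condition change:
healthy = stitch_stations({"cam1": [1.0, 2.1], "cam2": [2.0, 1.1]})
damaged = stitch_stations({"cam1": [1.05, 2.9], "cam2": [2.8, 1.12]})
print(damage_flags(healthy, damaged))  # → [False, True, True, False]
```

Here the two interior points show a large displacement change relative to the baseline and are flagged, mimicking how a change in global displacement under altered boundary conditions would reveal damage.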
Original language: English
Article number: 1246
Pages (from-to): 1-21
Number of pages: 21
Journal: Sensors (Switzerland)
Volume: 21
Issue number: 4
DOIs
Publication status: Published - 10 Feb 2021

Keywords

  • Computer vision
  • Damage detection
  • Structural health monitoring
  • Sensor roving

