Cost-effective solution to synchronised audio-visual data capture using multiple sensors

T. Lee (Editor), Jeroen Lichtenauer, S. Soatto (Editor), Jie Shen, Michel Valstar, Maja Pantic

    Research output: Contribution to journal › Article › Academic › peer-review

    11 Citations (Scopus)


    Applications such as surveillance and human behaviour analysis require high-bandwidth recording from multiple cameras, as well as from other sensors. In turn, sensor fusion has increased the required accuracy of synchronisation between sensors. Using commercial off-the-shelf components may compromise quality and accuracy due to several challenges, such as dealing with the combined data rate from multiple sensors; unknown offset and rate discrepancies between independent hardware clocks; the absence of trigger inputs or outputs in the hardware; as well as the different methods for time-stamping the recorded data. To achieve accurate synchronisation, we centralise the synchronisation task by recording all trigger or timestamp signals with a multi-channel audio interface. For sensors that do not have an external trigger signal, we let the computer that captures the sensor data periodically generate timestamp signals from its serial port output. These signals can also be used as a common time base to synchronise multiple asynchronous audio interfaces. Furthermore, we show that a consumer PC can currently capture 8-bit video data with 1024 × 1024 spatial resolution and 59.1 Hz temporal resolution, from at least 14 cameras, together with 8 channels of 24-bit audio at 96 kHz. We thus improve the quality/cost ratio of multi-sensor data capture systems.
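    The core synchronisation step described above amounts to relating each sensor computer's clock to the common time base of the audio interface: the serial-port timestamp pulses yield pairs of (local clock time, audio-interface time), from which the unknown offset and rate discrepancy can be estimated. A minimal sketch of that estimation, assuming such timestamp pairs are already available (function and variable names are illustrative, not from the paper):

    ```python
    def fit_clock_mapping(local_ts, ref_ts):
        """Fit ref = rate * local + offset by ordinary least squares.

        local_ts: timestamps from the capture computer's clock (seconds)
        ref_ts:   times at which the corresponding serial-port pulses
                  were observed on the reference (audio interface) clock
        Returns (rate, offset) accounting for the rate discrepancy and
        offset between the two independent hardware clocks.
        """
        n = len(local_ts)
        mx = sum(local_ts) / n
        my = sum(ref_ts) / n
        # Slope of the least-squares line = covariance / variance.
        cov = sum((x - mx) * (y - my) for x, y in zip(local_ts, ref_ts))
        var = sum((x - mx) ** 2 for x in local_ts)
        rate = cov / var
        offset = my - rate * mx
        return rate, offset

    def to_common_time(t_local, rate, offset):
        """Map a local sensor timestamp onto the common time base."""
        return rate * t_local + offset
    ```

    With the fitted mapping, every timestamp recorded on a sensor computer can be converted to the shared time base, so data from cameras and other sensors attached to different machines can be aligned for fusion.
    
    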
    Original language: Undefined
    Pages (from-to): 666-680
    Number of pages: 15
    Journal: Image and Vision Computing
    Issue number: 10
    Publication status: Published - Sep 2011


    • Synchronisation
    • EC Grant Agreement nr.: FP7/211486
    • EC Grant Agreement nr.: ERC/203143
    • IR-79390
    • Audio recording
    • Multisensor systems
    • Video recording
    • METIS-284996
    • EWI-21248
