Association of Camera and Radar Detections Using Neural Networks

Konstantinos Fatseas, Marco J.G. Bekooij

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


Automotive radar and camera fusion relies on linear point transformations from one sensor's coordinate system to the other. However, these transformations cannot handle non-linear dynamics and are susceptible to sensor noise. Furthermore, they operate on a point-to-point basis, so it is impossible to capture all the characteristics of an object. This paper introduces a method that performs detection-to-detection association by projecting heterogeneous object features from the two sensors into a common high-dimensional space. We associate 2D bounding boxes and radar detections based on the Euclidean distance between their projections. Our method utilizes deep neural networks to transform feature vectors instead of single points. Therefore, we can leverage real-world data to learn non-linear dynamics and utilize several features to provide a better description of each object. We evaluate our association method against a traditional rule-based method, showing that it improves the accuracy of the association algorithm and is more robust in complex scenarios with multiple objects.
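To make the core idea of the abstract concrete, the following is a minimal sketch of embedding-based association: features from each sensor are projected into a shared space, and each camera bounding box is matched to the radar detection with the smallest Euclidean distance between embeddings. In the paper the projections are learned deep neural networks; here simple random linear maps stand in purely for illustration, and all feature layouts (box as x, y, w, h; radar detection as range, azimuth, Doppler) are assumptions, not the authors' exact design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the learned projection networks (illustrative only):
# linear maps from each sensor's feature space into a common 16-D space.
W_cam = rng.normal(size=(4, 16))  # camera box (x, y, w, h) -> 16-D
W_rad = rng.normal(size=(3, 16))  # radar (range, azimuth, doppler) -> 16-D


def embed(features, W):
    """Project heterogeneous sensor features into the common space."""
    return features @ W


def associate(cam_feats, rad_feats):
    """Match each camera box to the nearest radar detection in embedding space.

    Returns, for every camera box, the index of the radar detection
    with the smallest Euclidean distance between their projections.
    """
    e_cam = embed(cam_feats, W_cam)  # shape (N, 16)
    e_rad = embed(rad_feats, W_rad)  # shape (M, 16)
    # Pairwise Euclidean distances between all embeddings, shape (N, M).
    dists = np.linalg.norm(e_cam[:, None, :] - e_rad[None, :, :], axis=-1)
    return dists.argmin(axis=1)


# Toy example: 2 camera boxes, 3 radar detections.
cam = rng.normal(size=(2, 4))
rad = rng.normal(size=(3, 3))
matches = associate(cam, rad)
print(matches)  # one radar index per camera box
```

A real system would replace the linear maps with the trained networks and could use a global assignment step (e.g. the Hungarian algorithm) instead of a per-box nearest-neighbour match.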

Original language: English
Title of host publication: RadarConf23 - 2023 IEEE Radar Conference, Proceedings
ISBN (Electronic): 9781665436694
Publication status: Published - 21 Jun 2023
Event: IEEE Radar Conference, RadarConf 2023 - San Antonio, United States
Duration: 1 May 2023 - 5 May 2023

Publication series

Name: Proceedings of the IEEE Radar Conference
ISSN (Print): 1097-5764
ISSN (Electronic): 2375-5318


Conference: IEEE Radar Conference, RadarConf 2023
Abbreviated title: RadarConf 2023
Country/Territory: United States
City: San Antonio


  • radar
  • sensor fusion
  • camera

