
People tracking by cooperative fusion of RADAR and camera sensors

Martin Dimitrievski (UGent) , Lennert Jacobs (UGent) , Peter Veelaert (UGent) and Wilfried Philips (UGent)
Abstract
Accurate 3D tracking of objects from a monocular camera is challenging because depth is lost during projection. Although RADAR ranging has proven effective in highway environments, people tracking remains beyond the capability of single-sensor systems. In this paper, we propose a cooperative RADAR-camera fusion method for tracking people on the ground plane. Using the average person height, a joint detection likelihood is calculated by back-projecting detections from the camera onto the RADAR range-azimuth data. Peaks in the joint likelihood, representing candidate targets, are fed into a particle filter tracker. Depending on the association outcome, particles are updated either using the associated detections (tracking by detection) or by sampling the raw likelihood itself (tracking before detection). Utilizing the raw likelihood data has the advantage that lost targets continue to be tracked even when the camera or RADAR signal falls below the detection threshold. We show that in single-target, uncluttered environments, the proposed method consistently outperforms camera-only tracking. Experiments in a real-world urban environment further confirm that the cooperative fusion tracker produces significantly better estimates, even in difficult and ambiguous situations.
Keywords
radar, sensor fusion, pedestrian tracking, autonomous vehicles
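The fusion step described in the abstract — back-projecting a camera detection to the ground plane via the average person height, then multiplying it into the RADAR range-azimuth map — can be illustrated with a minimal sketch. This is not the authors' implementation: the focal length, average height, Gaussian widths, and all grid parameters below are illustrative assumptions.

```python
import numpy as np

# Assumed parameters (illustrative, not from the paper).
FOCAL_PX = 800.0      # camera focal length in pixels
AVG_PERSON_H = 1.7    # assumed average person height in metres

def backproject_detection(bbox_h_px, bearing_rad):
    """Estimate ground-plane range of a person from the pixel height of a
    camera bounding box, using the pinhole model and average person height."""
    rng = FOCAL_PX * AVG_PERSON_H / bbox_h_px  # similar triangles
    return rng, bearing_rad

def joint_likelihood(radar_ra, cam_range, cam_bearing,
                     ranges, azimuths, sigma_r=1.0, sigma_a=0.1):
    """Fuse by multiplying the RADAR range-azimuth map with a Gaussian
    centred on the back-projected camera detection."""
    R, A = np.meshgrid(ranges, azimuths, indexing="ij")
    cam_lik = np.exp(-0.5 * (((R - cam_range) / sigma_r) ** 2
                             + ((A - cam_bearing) / sigma_a) ** 2))
    return radar_ra * cam_lik

# Toy scene: a synthetic RADAR map with one return near (10 m, 0.2 rad),
# and a 136 px tall camera bounding box at bearing 0.2 rad.
ranges = np.linspace(1.0, 30.0, 60)
azimuths = np.linspace(-0.8, 0.8, 81)
radar_ra = (np.exp(-0.5 * ((ranges[:, None] - 10.0) / 2.0) ** 2)
            * np.exp(-0.5 * ((azimuths[None, :] - 0.2) / 0.2) ** 2))

cam_r, cam_a = backproject_detection(136.0, 0.2)   # 800 * 1.7 / 136 = 10.0 m
lik = joint_likelihood(radar_ra, cam_r, cam_a, ranges, azimuths)
i, j = np.unravel_index(np.argmax(lik), lik.shape)
peak = (ranges[i], azimuths[j])   # candidate target fed to the tracker
```

In the paper's pipeline such peaks become candidate targets for the particle filter; when association fails, the tracker samples this raw likelihood directly instead of the thresholded detections.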

Downloads

  • ITSC 2019 camera ready.pdf — full text | open access | PDF | 991.05 KB

Citation

Please use this URL to cite or link to this publication:

MLA
Dimitrievski, Martin, et al. “People Tracking by Cooperative Fusion of RADAR and Camera Sensors.” IEEE Intelligent Transportation Systems Conference - ITSC 2019, IEEE, 2019.
APA
Dimitrievski, M., Jacobs, L., Veelaert, P., & Philips, W. (2019). People tracking by cooperative fusion of RADAR and camera sensors. In IEEE Intelligent Transportation Systems Conference - ITSC 2019. Auckland, New Zealand: IEEE.
Chicago author-date
Dimitrievski, Martin, Lennert Jacobs, Peter Veelaert, and Wilfried Philips. 2019. “People Tracking by Cooperative Fusion of RADAR and Camera Sensors.” In IEEE Intelligent Transportation Systems Conference - ITSC 2019. IEEE.
Chicago author-date (all authors)
Dimitrievski, Martin, Lennert Jacobs, Peter Veelaert, and Wilfried Philips. 2019. “People Tracking by Cooperative Fusion of RADAR and Camera Sensors.” In IEEE Intelligent Transportation Systems Conference - ITSC 2019. IEEE.
Vancouver
1. Dimitrievski M, Jacobs L, Veelaert P, Philips W. People tracking by cooperative fusion of RADAR and camera sensors. In: IEEE Intelligent Transportation Systems Conference - ITSC 2019. IEEE; 2019.
IEEE
[1] M. Dimitrievski, L. Jacobs, P. Veelaert, and W. Philips, “People tracking by cooperative fusion of RADAR and camera sensors,” in IEEE Intelligent Transportation Systems Conference - ITSC 2019, Auckland, New Zealand, 2019.
@inproceedings{8623169,
  abstract     = {Accurate 3D tracking of objects from a monocular camera is
                  challenging because depth is lost during projection. Although
                  RADAR ranging has proven effective in highway environments,
                  people tracking remains beyond the capability of single-sensor
                  systems. In this paper, we propose a cooperative RADAR-camera
                  fusion method for tracking people on the ground plane. Using
                  the average person height, a joint detection likelihood is
                  calculated by back-projecting detections from the camera onto
                  the RADAR range-azimuth data. Peaks in the joint likelihood,
                  representing candidate targets, are fed into a particle filter
                  tracker. Depending on the association outcome, particles are
                  updated either using the associated detections (tracking by
                  detection) or by sampling the raw likelihood itself (tracking
                  before detection). Utilizing the raw likelihood data has the
                  advantage that lost targets continue to be tracked even when
                  the camera or RADAR signal falls below the detection
                  threshold. We show that in single-target, uncluttered
                  environments, the proposed method consistently outperforms
                  camera-only tracking. Experiments in a real-world urban
                  environment further confirm that the cooperative fusion
                  tracker produces significantly better estimates, even in
                  difficult and ambiguous situations.},
  author       = {Dimitrievski, Martin and Jacobs, Lennert and Veelaert, Peter and Philips, Wilfried},
  booktitle    = {IEEE Intelligent Transportation Systems Conference - ITSC 2019},
  keywords     = {radar, sensor fusion, pedestrian tracking, autonomous vehicles},
  language     = {eng},
  location     = {Auckland, New Zealand},
  publisher    = {IEEE},
  title        = {People tracking by cooperative fusion of RADAR and camera sensors},
  url          = {https://telin.ugent.be/~mdimitri/tracking.html},
  year         = {2019},
}