
Illumination-robust people tracking using a smart camera network

Bo Bo Nyan (UGent) , Peter Van Hese (UGent) , Junzhi Guan (UGent) , Sebastian Grünwedel (UGent) , Jorge Niño Castañeda (UGent) , Dimitri Van Cauwelaert (UGent) , Dirk Van Haerenborgh (UGent) , Peter Veelaert (UGent) and Wilfried Philips (UGent)
Abstract
Many computer-vision-based applications require reliable tracking of multiple people under unpredictable lighting conditions. Many existing trackers do not handle illumination changes well, especially sudden changes in illumination. This paper presents a system that tracks multiple people reliably, even under rapid illumination changes, using a network of calibrated smart cameras with overlapping views. Each smart camera extracts foreground features by detecting texture changes between the current image and a static background image. The foreground features belonging to each person are tracked locally on each camera, but these local estimates are sent to a fusion center, which combines them to generate more accurate estimates. The final estimates are fed back to all smart cameras, which use them as prior information for tracking in the next frame. The texture-based approach makes our method very robust to illumination changes. We tested the performance of our system on six video sequences, some containing sudden illumination changes and up to four walking persons. The results show that our tracker can track multiple people accurately, with an average tracking error as low as 8 cm, even when the illumination varies rapidly. A performance comparison with a state-of-the-art tracking system shows that our method outperforms it.
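
The abstract describes a two-level pipeline: per-camera tracking of texture-based foreground features, followed by a fusion center that combines the local estimates and feeds the fused result back to every camera as the prior for the next frame. The exact fusion rule is not given in the abstract; the short Python sketch below shows one plausible realisation, an inverse-covariance (information-weighted) combination of per-camera ground-plane position estimates. The function name fuse_local_estimates and the example numbers are illustrative assumptions, not taken from the paper.

import numpy as np

def fuse_local_estimates(means, covariances):
    """Hypothetical fusion-center step: combine per-camera 2-D ground-plane
    position estimates by inverse-covariance (information) weighting."""
    info_sum = np.zeros((2, 2))
    weighted_sum = np.zeros(2)
    for mu, cov in zip(means, covariances):
        info = np.linalg.inv(cov)        # more certain cameras get larger weight
        info_sum += info
        weighted_sum += info @ mu
    fused_cov = np.linalg.inv(info_sum)  # combined uncertainty of the fused estimate
    fused_mean = fused_cov @ weighted_sum
    return fused_mean, fused_cov

# Two cameras reporting the same person's ground-plane position (metres)
# with different per-axis uncertainties.
local_means = [np.array([1.10, 2.05]), np.array([1.00, 1.95])]
local_covs = [np.diag([0.04, 0.09]), np.diag([0.09, 0.04])]
fused_mean, fused_cov = fuse_local_estimates(local_means, local_covs)
# fused_mean would be broadcast back to all cameras as the prior for the next frame

Information weighting is a common way to merge redundant estimates from overlapping views because the weights follow directly from each camera's reported uncertainty; whether the paper uses this rule, a Kalman-style update, or another combination scheme cannot be determined from the abstract alone.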
Keywords
data fusion, feature subtraction, illumination-robustness, Multi-camera tracking

Downloads

  • (...).pdf — full text | UGent only | PDF | 932.83 KB

Citation

Please use this url to cite or link to this publication:

Chicago
Nyan, Bo Bo, Peter Van Hese, Junzhi Guan, Sebastian Grünwedel, Jorge Niño Castañeda, Dimitri Van Cauwelaert, Dirk Van Haerenborgh, Peter Veelaert, and Wilfried Philips. 2014. “Illumination-robust People Tracking Using a Smart Camera Network.” In Proceedings of SPIE, ed. J Roning and D Casasent. Vol. 9025. San Francisco, California, USA: IS&T/SPIE.
APA
Nyan, B. B., Van Hese, P., Guan, J., Grünwedel, S., Niño Castañeda, J., Van Cauwelaert, D., Van Haerenborgh, D., et al. (2014). Illumination-robust people tracking using a smart camera network. In J. Roning & D. Casasent (Eds.), Proceedings of SPIE (Vol. 9025). Presented at the Conference on Intelligent Robots and Computer Vision XXXI - Algorithms and Techniques, San Francisco, California, USA: IS&T/SPIE.
Vancouver
Nyan BB, Van Hese P, Guan J, Grünwedel S, Niño Castañeda J, Van Cauwelaert D, et al. Illumination-robust people tracking using a smart camera network. In: Roning J, Casasent D, editors. Proceedings of SPIE. San Francisco, California, USA: IS&T/SPIE; 2014.
MLA
Nyan, Bo Bo, Peter Van Hese, Junzhi Guan, et al. “Illumination-robust People Tracking Using a Smart Camera Network.” Proceedings of SPIE. Ed. J Roning & D Casasent. Vol. 9025. San Francisco, California, USA: IS&T/SPIE, 2014. Print.
@inproceedings{4289321,
  abstract     = {Many computer-vision-based applications require reliable tracking of multiple people under unpredictable lighting conditions. Many existing trackers do not handle illumination changes well, especially sudden changes in illumination. This paper presents a system that tracks multiple people reliably, even under rapid illumination changes, using a network of calibrated smart cameras with overlapping views. Each smart camera extracts foreground features by detecting texture changes between the current image and a static background image. The foreground features belonging to each person are tracked locally on each camera, but these local estimates are sent to a fusion center, which combines them to generate more accurate estimates. The final estimates are fed back to all smart cameras, which use them as prior information for tracking in the next frame. The texture-based approach makes our method very robust to illumination changes. We tested the performance of our system on six video sequences, some containing sudden illumination changes and up to four walking persons. The results show that our tracker can track multiple people accurately, with an average tracking error as low as 8 cm, even when the illumination varies rapidly. A performance comparison with a state-of-the-art tracking system shows that our method outperforms it.},
  articleno    = {90250G},
  author       = {Nyan, Bo Bo and Van Hese, Peter and Guan, Junzhi and Gr{\"u}nwedel, Sebastian and Ni{\~n}o Casta{\~n}eda, Jorge and Van Cauwelaert, Dimitri and Van Haerenborgh, Dirk and Veelaert, Peter and Philips, Wilfried},
  booktitle    = {Proceedings of SPIE},
  editor       = {Roning, J and Casasent, D},
  isbn         = {9780819499424},
  issn         = {0277-786X},
  keyword      = {data fusion,feature subtraction,illumination-robustness,Multi-camera tracking},
  language     = {eng},
  location     = {San Francisco, California, USA},
  pages        = {10},
  publisher    = {IS\&T/SPIE},
  title        = {Illumination-robust people tracking using a smart camera network},
  url          = {http://dx.doi.org/10.1117/12.2036764},
  volume       = {9025},
  year         = {2014},
}
