Demo: real-time indoors people tracking in scalable camera networks

Abstract
In this demo we present a people tracker for indoor environments. The tracker runs on a network of smart cameras with overlapping views. Special attention is given to real-time processing by distributing tasks between the cameras and a fusion server. Each camera processes its own images and tracks people in the image plane. Instead of camera images, only metadata (a bounding box per person) are sent from each camera to the fusion server. The metadata are used on the server side to estimate the position of each person in real-world coordinates. Although the tracker is designed to suit any indoor environment, in this demo its performance is presented in a meeting scenario, where occlusions of people by other people and/or furniture are significant and occur frequently. Multiple cameras ensure views from multiple angles, which keeps tracking accurate even when some of the views are severely occluded.
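
The abstract describes a split between in-camera tracking and server-side fusion: each camera sends only a bounding box per person, and the fusion server turns that metadata into real-world positions. The paper does not spell out the fusion step, so the following Python sketch only illustrates that data flow, assuming a per-camera image-to-ground homography and a simple average of the per-camera estimates; all function and variable names here are hypothetical.

import numpy as np

def foot_point(bbox):
    # Bottom-centre of a bounding box (x, y, w, h) in image coordinates,
    # returned as a homogeneous point; the foot point is a common choice
    # for locating a person on the ground plane.
    x, y, w, h = bbox
    return np.array([x + w / 2.0, y + h, 1.0])

def to_ground_plane(bbox, H):
    # Project the image-plane foot point to world coordinates using the
    # camera's 3x3 image-to-ground homography H (assumed calibration).
    p = H @ foot_point(bbox)
    return p[:2] / p[2]

def fuse(detections, homographies):
    # Combine the per-camera metadata for one person into a single
    # real-world position estimate (here simply averaged).
    # detections:   {camera_id: bbox} received by the fusion server
    # homographies: {camera_id: 3x3 homography for that camera}
    points = [to_ground_plane(bbox, homographies[cam])
              for cam, bbox in detections.items()]
    return np.mean(points, axis=0)

# Two overlapping cameras report the same person; with real calibration the
# estimates would agree even if one of the views is partly occluded.
H1 = H2 = np.eye(3)  # placeholder homographies
position = fuse({"cam1": (310, 120, 40, 160), "cam2": (305, 118, 42, 162)},
                {"cam1": H1, "cam2": H2})
print(position)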

Downloads

  • Final manuscript.pdf (full text | open access | PDF | 550.12 KB)

Citation

Chicago
Jelača, Vedran, Sebastian Grünwedel, Jorge Niño Castañeda, Peter Van Hese, Dimitri Van Cauwelaert, Peter Veelaert, and Wilfried Philips. 2011. “Demo: Real-time Indoors People Tracking in Scalable Camera Networks.” In 2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras. Piscataway, NJ, USA: IEEE.
APA
Jelača, V., Grünwedel, S., Niño Castañeda, J., Van Hese, P., Van Cauwelaert, D., Veelaert, P., & Philips, W. (2011). Demo: real-time indoors people tracking in scalable camera networks. 2011 Fifth ACM/IEEE international conference on distributed smart cameras. Presented at the 5th ACM/IEEE International conference on Distributed Smart Cameras (ICDSC 2011), Piscataway, NJ, USA: IEEE.
Vancouver
1. Jelača V, Grünwedel S, Niño Castañeda J, Van Hese P, Van Cauwelaert D, Veelaert P, et al. Demo: real-time indoors people tracking in scalable camera networks. 2011 Fifth ACM/IEEE international conference on distributed smart cameras. Piscataway, NJ, USA: IEEE; 2011.
MLA
Jelača, Vedran, Sebastian Grünwedel, Jorge Niño Castañeda, et al. “Demo: Real-time Indoors People Tracking in Scalable Camera Networks.” 2011 Fifth ACM/IEEE International Conference on Distributed Smart Cameras. Piscataway, NJ, USA: IEEE, 2011. Print.
BibTeX
@inproceedings{1908348,
  abstract     = {In this demo we present a people tracker in indoor environments. The tracker executes in a network of smart cameras with overlapping views. Special attention is given to real-time processing by distribution of tasks between the cameras and the fusion server. Each camera performs tasks of processing the images and tracking of people in the image plane. Instead of camera images, only metadata (a bounding box per person) are sent from each camera to the fusion server. The metadata are used on the server side to estimate the position of each person in real-world coordinates. Although the tracker is designed to suit any indoor environment, in this demo the tracker's performance is presented in a meeting scenario, where occlusions of people by other people and/or furniture are significant and occur frequently. Multiple cameras insure views from multiple angles, which keeps tracking accurate even in cases of severe occlusions in some of the views.},
  author       = {Jela\v{c}a, Vedran and Gr{\"u}nwedel, Sebastian and Ni{\~n}o Casta{\~n}eda, Jorge and Van Hese, Peter and Van Cauwelaert, Dimitri and Veelaert, Peter and Philips, Wilfried},
  booktitle    = {2011 Fifth ACM/IEEE international conference on distributed smart cameras},
  isbn         = {9781457717079},
  language     = {eng},
  location     = {Ghent, Belgium},
  pages        = {2},
  publisher    = {IEEE},
  title        = {Demo: real-time indoors people tracking in scalable camera networks},
  year         = {2011},
}
