
Efficient approximate foreground detection for low-resource devices

Abstract
A broad range of very powerful foreground detection methods exists because this is an essential step in many computer vision algorithms. However, because of memory and computational constraints, simple static background subtraction is very often the technique used in practice on a platform with limited resources, such as a smart camera. In this paper we propose to apply more powerful techniques to a reduced scan-line version of the captured image to construct an approximation of the actual foreground without overburdening the smart camera. We show that the performance of static background subtraction drops quickly outside of a controlled laboratory environment, and that this is not the case for the proposed method because of its ability to update its background model. Furthermore, we provide a comparison with foreground detection on a subsampled version of the captured image. We show that the proposed foreground approximation achieves higher true positive rates.
Keywords
low resources, moving object detection, scan line, foreground detection, smart camera
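
The core idea in the abstract (an adaptive background model maintained only on a few scan lines, rather than a static model over the full frame) can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the function name, the running-average update, and the parameters `alpha` and `thresh` are assumptions chosen for clarity.

```python
import numpy as np

def scanline_foreground(frame, bg_rows, row_idx, alpha=0.05, thresh=25.0):
    """Approximate foreground detection on a few scan lines only.

    frame   : 2-D grayscale image (H x W)
    bg_rows : background model for the selected rows (len(row_idx) x W)
    row_idx : indices of the scan lines to process
    Returns the per-scan-line foreground mask and the updated model.
    """
    rows = frame[row_idx, :].astype(np.float64)
    # Foreground where a scan line deviates strongly from the model.
    mask = np.abs(rows - bg_rows) > thresh
    # Running-average update keeps the model adaptive; a static
    # background (bg_rows never updated) is what the abstract reports
    # as degrading outside controlled laboratory conditions.
    new_bg = (1.0 - alpha) * bg_rows + alpha * rows
    return mask, new_bg
```

Because only `len(row_idx)` rows are stored and processed per frame, both the memory footprint of the model and the per-frame cost shrink by roughly the ratio of selected rows to image height, which is the resource saving the paper targets.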

Downloads

  • (...).pdf: full text, UGent only, PDF, 939.96 KB

Citation

Please use this URL to cite or link to this publication:

Chicago
Tessens, Linda, Marleen Morbée, Wilfried Philips, Richard Kleihorst, and Hamid Aghajan. 2009. “Efficient Approximate Foreground Detection for Low-resource Devices.” In 2009 Third ACM/IEEE International Conference on Distributed Smart Cameras, 84–91. New York, NY, USA: IEEE.
APA
Tessens, L., Morbée, M., Philips, W., Kleihorst, R., & Aghajan, H. (2009). Efficient approximate foreground detection for low-resource devices. 2009 Third ACM/IEEE international conference on distributed smart cameras (pp. 84–91). Presented at the 3rd ACM/IEEE International conference on Distributed Smart Cameras (ICDSC 2009), New York, NY, USA: IEEE.
Vancouver
1. Tessens L, Morbée M, Philips W, Kleihorst R, Aghajan H. Efficient approximate foreground detection for low-resource devices. 2009 Third ACM/IEEE international conference on distributed smart cameras. New York, NY, USA: IEEE; 2009. p. 84–91.
MLA
Tessens, Linda, Marleen Morbée, Wilfried Philips, et al. “Efficient Approximate Foreground Detection for Low-resource Devices.” 2009 Third ACM/IEEE International Conference on Distributed Smart Cameras. New York, NY, USA: IEEE, 2009. 84–91. Print.
@inproceedings{716061,
  abstract     = {A broad range of very powerful foreground detection methods exists because this is an essential step in many computer vision algorithms. However, because of memory and computational constraints, simple static background subtraction is very often the technique used in practice on a platform with limited resources, such as a smart camera.
In this paper we propose to apply more powerful techniques to a reduced scan-line version of the captured image to construct an approximation of the actual foreground without overburdening the smart camera. We show that the performance of static background subtraction drops quickly outside of a controlled laboratory environment, and that this is not the case for the proposed method because of its ability to update its background model. Furthermore, we provide a comparison with foreground detection on a subsampled version of the captured image. We show that the proposed foreground approximation achieves higher true positive rates.},
  author       = {Tessens, Linda and Morb{\'e}e, Marleen and Philips, Wilfried and Kleihorst, Richard and Aghajan, Hamid},
  booktitle    = {2009 Third ACM/IEEE international conference on distributed smart cameras},
  isbn         = {9781424446193},
  language     = {eng},
  location     = {Como, Italy},
  pages        = {84--91},
  publisher    = {IEEE},
  title        = {Efficient approximate foreground detection for low-resource devices},
  url          = {http://dx.doi.org/10.1109/ICDSC.2009.5289416},
  year         = {2009},
}
