
Parameter-unaware autocalibration for occupancy mapping

Abstract
People localization and occupancy mapping are common and important tasks for multi-camera systems. In this paper, we present a novel approach to overcome the hurdle of manual extrinsic calibration of the multi-camera system. Our approach is completely parameter unaware, meaning that the user does not need to know the focal length, position or viewing angle in advance, nor will these values be calibrated as such. The only requirements for the multi-camera setup are that the camera views overlap substantially and that the cameras are mounted at approximately the same height, requirements that are satisfied in most typical multi-camera configurations. The proposed method uses the observed height of an object or person moving through the space to estimate the distance to that object or person. Using this distance to backproject the lowest point of each detected object, we obtain a rotated and anisotropically scaled view of the ground plane for each camera. An algorithm is presented to estimate the anisotropic scaling parameters and rotation for each camera, after which ground plane positions can be computed up to an isotropic scale factor. Lens distortion is not taken into account. The method is tested in simulation, yielding average accuracies within 5 cm, and in a real multi-camera environment, with an accuracy within 15 cm.
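The core observation in the abstract can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: it assumes an ideal pinhole camera (no lens distortion, consistent with the paper) and an upright person of unknown but constant real height H. Pinhole projection gives an observed pixel height h = f·H/Z, so depth Z is proportional to 1/h with the unknown product f·H folded into a per-camera scale factor; backprojecting the foot pixel then yields a ground-plane position that is correct up to the per-camera rotation and anisotropic scaling the paper's algorithm estimates. The principal point value `cx` below is an assumed placeholder.

```python
def ground_view(u_foot, h_px, cx=320.0):
    """Map one detection to this camera's scaled, rotated ground-plane view.

    u_foot : horizontal pixel coordinate of the detection's lowest point (feet)
    h_px   : observed pixel height of the person
    cx     : assumed horizontal principal point (hypothetical value)

    Returns (lateral, depth), each correct only up to the unknown f*H
    factor, i.e. an anisotropically scaled view of the true ground plane.
    """
    depth = 1.0 / h_px              # Z = f*H / h_px, with f*H unknown
    lateral = (u_foot - cx) / h_px  # X = Z * (u - cx) / f, same unknown factor
    return (lateral, depth)

# The same person seen twice: a larger pixel height means a closer position.
near = ground_view(u_foot=400.0, h_px=200.0)
far = ground_view(u_foot=400.0, h_px=100.0)
```

Each camera produces such a distorted ground-plane track; because all cameras observe the same true trajectory, the per-camera rotations and anisotropic scales can then be estimated by aligning the overlapping tracks, leaving only a global isotropic scale undetermined, as the abstract states.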
Keywords
CALIBRATION

Downloads

  • icdsc2013 dvhamme.pdf — full text | open access | PDF | 1.98 MB

Citation


MLA
Van Hamme, David, et al. “Parameter-Unaware Autocalibration for Occupancy Mapping.” 2013 Seventh International Conference on Distributed Smart Cameras (ICDSC), IEEE, 2013, pp. 49–54, doi:10.1109/ICDSC.2013.6778205.
APA
Van Hamme, D., Slembrouck, M., Van Haerenborgh, D., Van Cauwelaert, D., Veelaert, P., & Philips, W. (2013). Parameter-unaware autocalibration for occupancy mapping. 2013 Seventh International Conference on Distributed Smart Cameras (ICDSC), 49–54. https://doi.org/10.1109/ICDSC.2013.6778205
Chicago author-date
Van Hamme, David, Maarten Slembrouck, Dirk Van Haerenborgh, Dimitri Van Cauwelaert, Peter Veelaert, and Wilfried Philips. 2013. “Parameter-Unaware Autocalibration for Occupancy Mapping.” In 2013 Seventh International Conference on Distributed Smart Cameras (ICDSC), 49–54. New York, NY, USA: IEEE. https://doi.org/10.1109/ICDSC.2013.6778205.
Chicago author-date (all authors)
Van Hamme, David, Maarten Slembrouck, Dirk Van Haerenborgh, Dimitri Van Cauwelaert, Peter Veelaert, and Wilfried Philips. 2013. “Parameter-Unaware Autocalibration for Occupancy Mapping.” In 2013 Seventh International Conference on Distributed Smart Cameras (ICDSC), 49–54. New York, NY, USA: IEEE. doi:10.1109/ICDSC.2013.6778205.
Vancouver
1. Van Hamme D, Slembrouck M, Van Haerenborgh D, Van Cauwelaert D, Veelaert P, Philips W. Parameter-unaware autocalibration for occupancy mapping. In: 2013 Seventh international conference on distributed smart cameras (ICDSC). New York, NY, USA: IEEE; 2013. p. 49–54.
IEEE
[1] D. Van Hamme, M. Slembrouck, D. Van Haerenborgh, D. Van Cauwelaert, P. Veelaert, and W. Philips, “Parameter-unaware autocalibration for occupancy mapping,” in 2013 Seventh international conference on distributed smart cameras (ICDSC), Palm Springs, CA, USA, 2013, pp. 49–54.
@inproceedings{4186165,
  abstract     = {{People localization and occupancy mapping are common and important tasks for multi-camera systems. In this paper, we present a novel approach to overcome the hurdle of manual extrinsic calibration of the multi-camera system. Our approach is completely parameter unaware, meaning that the user does not need to know the focal length, position or viewing angle in advance, nor will these values be calibrated as such. The only requirement to the multi-camera setup is that the views overlap substantially and are mounted at approximately the same height, requirements that are satisfied in most typical multi-camera configurations. The proposed method uses the observed height of an object or person moving through the space to estimate the distance to the object or person. Using this distance to backproject the lowest point of each detected object, we obtain a rotated and anisotropically scaled view of the ground plane for each camera. An algorithm is presented to estimate the anisotropic scaling parameters and rotation for each camera, after which ground plane positions can be computed up to an isotropic scale factor. Lens distortion is not taken into account. The method is tested in simulation yielding average accuracies within 5cm, and in a real multi-camera environment with an accuracy within 15cm.}},
  author       = {{Van Hamme, David and Slembrouck, Maarten and Van Haerenborgh, Dirk and Van Cauwelaert, Dimitri and Veelaert, Peter and Philips, Wilfried}},
  booktitle    = {{2013 Seventh international conference on distributed smart cameras (ICDSC)}},
  isbn         = {{9781479921669}},
  keywords     = {{CALIBRATION}},
  language     = {{eng}},
  location     = {{Palm Springs, CA, USA}},
  pages        = {{49--54}},
  publisher    = {{IEEE}},
  title        = {{Parameter-unaware autocalibration for occupancy mapping}},
  doi          = {{10.1109/ICDSC.2013.6778205}},
  url          = {{https://doi.org/10.1109/ICDSC.2013.6778205}},
  year         = {{2013}},
}
