
Handheld pose tracking using vision-inertial sensors with occlusion handling

Abstract
Tracking of a handheld device’s three-dimensional (3-D) position and orientation is fundamental to various application domains, including augmented reality (AR), virtual reality, and interaction in smart spaces. Existing systems still offer limited performance in terms of accuracy, robustness, computational cost, and ease of deployment. We present a low-cost, accurate, and robust system for handheld pose tracking using fused vision and inertial data. The integration of measurements from embedded accelerometers reduces the number of unknown parameters in the six-degree-of-freedom pose calculation. The proposed system requires two light-emitting diode (LED) markers to be attached to the device, which are tracked by external cameras using an algorithm that is robust against illumination changes. Three data fusion methods are proposed: a triangulation-based stereo-vision system, a constraint-based stereo-vision system with occlusion handling, and a triangulation-based multivision system. Real-time demonstrations of the proposed system applied to AR and 3-D gaming are also included. The accuracy of the proposed system is assessed by comparison with data generated by the state-of-the-art commercial motion-tracking system OptiTrack. Experimental results show that the proposed system achieves an accuracy of a few centimeters in position estimation and a few degrees in orientation estimation.
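
The triangulation-based stereo-vision fusion summarized in the abstract can be illustrated with a short sketch. The Python fragment below is not the authors' implementation; it is a minimal sketch under illustrative assumptions: the two LED markers are triangulated from a calibrated stereo pair, the device origin is taken as the marker midpoint, the LED baseline is assumed to lie along the device x-axis, the world frame is assumed z-up, and a quasi-static accelerometer reading supplies the gravity direction in the device frame, so the remaining rotational freedom about the baseline is resolved with a TRIAD-style two-vector attitude solution. All names (triangulate_point, triad, pose_from_two_leds) are hypothetical.

import numpy as np

def triangulate_point(P1, P2, x1, x2):
    # Linear (DLT) triangulation of one LED marker from two calibrated views.
    # P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel detections.
    P1, P2 = np.asarray(P1, float), np.asarray(P2, float)
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def triad(r1, r2, b1, b2):
    # TRIAD attitude solution: rotation R with R @ b_i ~= r_i, built from two
    # direction pairs expressed in the world (r) and device (b) frames.
    def basis(a, b):
        t1 = a / np.linalg.norm(a)
        t2 = np.cross(a, b)
        t2 /= np.linalg.norm(t2)
        return np.column_stack([t1, t2, np.cross(t1, t2)])
    return basis(r1, r2) @ basis(b1, b2).T

def pose_from_two_leds(P1, P2, led_a, led_b, accel):
    # led_a, led_b: ((u, v) in camera 1, (u, v) in camera 2) for each LED marker.
    # accel: accelerometer reading in the device frame; when the device is
    # (nearly) static it points opposite to gravity.
    A = triangulate_point(P1, P2, *led_a)
    B = triangulate_point(P1, P2, *led_b)
    position = 0.5 * (A + B)                      # device origin: marker midpoint (assumption)
    baseline_world = (B - A) / np.linalg.norm(B - A)
    baseline_device = np.array([1.0, 0.0, 0.0])   # assumed marker layout: LEDs along device x-axis
    gravity_world = np.array([0.0, 0.0, -1.0])    # assumed z-up world frame
    accel = np.asarray(accel, float)
    gravity_device = -accel / np.linalg.norm(accel)
    R = triad(baseline_world, gravity_world, baseline_device, gravity_device)
    return position, R                            # world position, world-from-device rotation

Note that an accelerometer approximates the gravity direction only when dynamic acceleration is small; the constraint-based occlusion handling and the multivision extension described in the abstract go beyond this sketch.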
Keywords
camera networks, augmented reality, pose tracking, sensor fusion

Downloads

  • JEI-15905SS online.pdf (full text | open access | PDF | 2.60 MB)

Citation


Chicago
Li, Juan, Maarten Slembrouck, Francis Deboeverie, Ana M Bernardos, Juan A Besada, Peter Veelaert, Hamid Aghajan, José R Casar, and Wilfried Philips. 2016. “Handheld Pose Tracking Using Vision-inertial Sensors with Occlusion Handling.” Journal of Electronic Imaging 25 (4).
APA
Li, Juan, Slembrouck, M., Deboeverie, F., Bernardos, A. M., Besada, J. A., Veelaert, P., Aghajan, H., et al. (2016). Handheld pose tracking using vision-inertial sensors with occlusion handling. JOURNAL OF ELECTRONIC IMAGING, 25(4).
Vancouver
1. Li J, Slembrouck M, Deboeverie F, Bernardos AM, Besada JA, Veelaert P, et al. Handheld pose tracking using vision-inertial sensors with occlusion handling. JOURNAL OF ELECTRONIC IMAGING. United States: SPIE; 2016;25(4).
MLA
Li, Juan, Maarten Slembrouck, Francis Deboeverie, et al. “Handheld Pose Tracking Using Vision-inertial Sensors with Occlusion Handling.” JOURNAL OF ELECTRONIC IMAGING 25.4 (2016): n. pag. Print.
@article{8057512,
  abstract     = {Tracking of a handheld device{\textquoteright}s three-dimensional (3-D) position and orientation is fundamental to various application domains, including augmented reality (AR), virtual reality, and interaction in smart spaces. Existing systems still offer limited performance in terms of accuracy, robustness, computational cost, and ease of deployment. We present a low-cost, accurate, and robust system for handheld pose tracking using fused vision and inertial data. The integration of measurements from embedded accelerometers reduces the number of unknown parameters in the six-degree-of-freedom pose calculation. The proposed system requires two light-emitting diode (LED) markers to be attached to the device, which are tracked by external cameras using an algorithm that is robust against illumination changes. Three data fusion methods are proposed: a triangulation-based stereo-vision system, a constraint-based stereo-vision system with occlusion handling, and a triangulation-based multivision system. Real-time demonstrations of the proposed system applied to AR and 3-D gaming are also included. The accuracy of the proposed system is assessed by comparison with data generated by the state-of-the-art commercial motion-tracking system OptiTrack. Experimental results show that the proposed system achieves an accuracy of a few centimeters in position estimation and a few degrees in orientation estimation.},
  articleno    = {041012},
  author       = {Li, Juan and Slembrouck, Maarten and Deboeverie, Francis and Bernardos, Ana M and Besada, Juan A and Veelaert, Peter and Aghajan, Hamid and Casar, Jos{\'e} R and Philips, Wilfried},
  issn         = {1017-9909},
  journal      = {JOURNAL OF ELECTRONIC IMAGING},
  keywords     = {camera networks, augmented reality, pose tracking, sensor fusion},
  language     = {eng},
  number       = {4},
  pages        = {14},
  publisher    = {SPIE},
  title        = {Handheld pose tracking using vision-inertial sensors with occlusion handling},
  url          = {http://dx.doi.org/10.1117/1.JEI.25.4.041012},
  volume       = {25},
  year         = {2016},
}
