
An unsupervised transfer learning framework for visible-thermal pedestrian detection

Chengjin Lyu (UGent) , Patrick Heyer Wollenberg (UGent) , Bart Goossens (UGent) and Wilfried Philips (UGent)
(2022) SENSORS. 22(12).
Project
  • ACHIEVE (AdvanCed Hardware/Software Components for Integrated/Embedded Vision SystEms)
Abstract
Dual cameras with visible-thermal multispectral pairs provide both visual and thermal appearance, thereby enabling pedestrian detection around the clock in various conditions and applications, including autonomous driving and intelligent transportation systems. However, due to greatly varying real-world scenarios, the performance of a detector trained on a source dataset might change dramatically when evaluated on another dataset. A large amount of training data is often necessary to guarantee the detection performance in a new scenario. Typically, human annotators need to conduct the data labeling work, which is time-consuming, labor-intensive, and unscalable. To overcome this problem, we propose a novel unsupervised transfer learning framework for multispectral pedestrian detection, which adapts a multispectral pedestrian detector to the target domain based on pseudo training labels. In particular, auxiliary detectors are utilized, and different label fusion strategies are introduced according to the estimated environmental illumination level. Intermediate domain images are generated by translating the source images to mimic the target ones, acting as a better starting point for the parameter update of the pedestrian detector. The experimental results on the KAIST and FLIR ADAS datasets demonstrate that the proposed method achieves new state-of-the-art performance without any manual training annotations on the target data.
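The abstract describes gating the pseudo-label fusion strategy on an estimated illumination level. The paper does not give the exact rule here, so the following is a minimal hypothetical sketch of one such policy: in bright (daytime) frames, confident detections from either the visible or thermal auxiliary detector are kept; in dark frames, the thermal detector is trusted while only very confident visible detections pass. All names, thresholds, and the brightness proxy are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch of illumination-gated pseudo-label fusion.
# A detection is a (box, score) tuple; images are nested lists of
# 8-bit grayscale pixel values. Thresholds are illustrative only.

def estimate_illumination(visible_image):
    """Mean brightness of the visible frame, scaled to [0, 1].

    A crude proxy for environmental illumination level.
    """
    n_pixels = len(visible_image) * len(visible_image[0])
    total = sum(sum(row) for row in visible_image)
    return total / (n_pixels * 255.0)

def fuse_pseudo_labels(vis_dets, thr_dets, illumination, day_thresh=0.35):
    """Select pseudo training labels according to estimated illumination.

    Daytime: keep confident detections from both modalities.
    Nighttime: rely mainly on thermal; visible must be very confident.
    """
    if illumination >= day_thresh:
        # Daytime: union of reasonably confident boxes from both detectors.
        fused = [d for d in vis_dets + thr_dets if d[1] >= 0.5]
    else:
        # Nighttime: thermal-dominated fusion.
        fused = [d for d in thr_dets if d[1] >= 0.4]
        fused += [d for d in vis_dets if d[1] >= 0.8]
    return fused

# Usage with toy detections: (box as (x1, y1, x2, y2), confidence score).
vis = [((0, 0, 10, 20), 0.9), ((5, 5, 15, 25), 0.3)]
thr = [((1, 1, 11, 21), 0.6)]
day_labels = fuse_pseudo_labels(vis, thr, illumination=0.8)    # 2 boxes kept
night_labels = fuse_pseudo_labels(vis, thr, illumination=0.1)  # thermal-led
```

The fused boxes would then serve as pseudo training labels for updating the detector in the target domain, as the abstract outlines.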
Keywords
pedestrian detection, unsupervised transfer learning, domain adaptation, deep learning, multispectral fusion, FUSION

Downloads

  • sensors.pdf — full text (Published version) | open access | PDF | 1.02 MB

Citation

Please use this URL to cite or link to this publication:

MLA
Lyu, Chengjin, et al. “An Unsupervised Transfer Learning Framework for Visible-Thermal Pedestrian Detection.” SENSORS, vol. 22, no. 12, 2022, doi:10.3390/s22124416.
APA
Lyu, C., Heyer Wollenberg, P., Goossens, B., & Philips, W. (2022). An unsupervised transfer learning framework for visible-thermal pedestrian detection. SENSORS, 22(12). https://doi.org/10.3390/s22124416
Chicago author-date
Lyu, Chengjin, Patrick Heyer Wollenberg, Bart Goossens, and Wilfried Philips. 2022. “An Unsupervised Transfer Learning Framework for Visible-Thermal Pedestrian Detection.” SENSORS 22 (12). https://doi.org/10.3390/s22124416.
Chicago author-date (all authors)
Lyu, Chengjin, Patrick Heyer Wollenberg, Bart Goossens, and Wilfried Philips. 2022. “An Unsupervised Transfer Learning Framework for Visible-Thermal Pedestrian Detection.” SENSORS 22 (12). doi:10.3390/s22124416.
Vancouver
1. Lyu C, Heyer Wollenberg P, Goossens B, Philips W. An unsupervised transfer learning framework for visible-thermal pedestrian detection. SENSORS. 2022;22(12).
IEEE
[1] C. Lyu, P. Heyer Wollenberg, B. Goossens, and W. Philips, “An unsupervised transfer learning framework for visible-thermal pedestrian detection,” SENSORS, vol. 22, no. 12, 2022.
@article{8760549,
  abstract     = {{Dual cameras with visible-thermal multispectral pairs provide both visual and thermal appearance, thereby enabling detecting pedestrians around the clock in various conditions and applications, including autonomous driving and intelligent transportation systems. However, due to the greatly varying real-world scenarios, the performance of a detector trained on a source dataset might change dramatically when evaluated on another dataset. A large amount of training data is often necessary to guarantee the detection performance in a new scenario. Typically, human annotators need to conduct the data labeling work, which is time-consuming, labor-intensive and unscalable. To overcome the problem, we propose a novel unsupervised transfer learning framework for multispectral pedestrian detection, which adapts a multispectral pedestrian detector to the target domain based on pseudo training labels. In particular, auxiliary detectors are utilized and different label fusion strategies are introduced according to the estimated environmental illumination level. Intermediate domain images are generated by translating the source images to mimic the target ones, acting as a better starting point for the parameter update of the pedestrian detector. The experimental results on the KAIST and FLIR ADAS datasets demonstrate that the proposed method achieves new state-of-the-art performance without any manual training annotations on the target data.}},
  articleno    = {{4416}},
  author       = {{Lyu, Chengjin and Heyer Wollenberg, Patrick and Goossens, Bart and Philips, Wilfried}},
  issn         = {{1424-8220}},
  journal      = {{SENSORS}},
  keywords     = {{pedestrian detection,unsupervised transfer learning,domain adaptation,deep learning,multispectral fusion,FUSION}},
  language     = {{eng}},
  number       = {{12}},
  pages        = {{20}},
  title        = {{An unsupervised transfer learning framework for visible-thermal pedestrian detection}},
  url          = {{http://dx.doi.org/10.3390/s22124416}},
  volume       = {{22}},
  year         = {{2022}},
}
