
Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery

Abstract
The developments in the use of unmanned aerial vehicles (UAVs) and advanced imaging sensors provide new opportunities for ultra-high resolution (e.g., less than a 10 cm ground sampling distance (GSD)) crop field monitoring and mapping in precision agriculture applications. In this study, we developed a strategy for inter- and intra-row weed detection in early season maize fields from aerial visual imagery. More specifically, the Hough transform algorithm (HT) was applied to the orthomosaicked images for inter-row weed detection. A semi-automatic Object-Based Image Analysis (OBIA) procedure was developed with Random Forests (RF) combined with feature selection techniques to classify soil, weeds and maize. Furthermore, the two binary weed masks generated from HT and OBIA were fused into an accurate binary weed image. The developed RF classifier was evaluated by 5-fold cross validation, and it obtained an overall accuracy of 0.945 and a Kappa value of 0.912. Finally, the relationship between detected weeds and their ground-truth densities was quantified by a fitted linear model with a coefficient of determination of 0.895 and a root mean square error of 0.026. In addition, the importance of the input features was evaluated, and it was found that the ratio of vegetation length to width was the most significant feature for the classification model. Overall, our approach can yield a satisfactory weed map, and we expect that an accurate and timely weed map from UAV imagery will be applicable to realizing site-specific weed management (SSWM) in early season crop fields, reducing the spraying of non-selective herbicides and the associated costs.
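The abstract describes fusing two binary weed masks: one from the Hough-transform crop-row model (inter-row weeds) and one from the OBIA Random Forest classification (intra-row weeds). A minimal sketch of such a pixel-wise fusion step is shown below; the function and variable names are illustrative assumptions, not taken from the paper, and the paper's actual fusion rule may differ.

```python
import numpy as np

def fuse_weed_masks(ht_mask: np.ndarray, obia_mask: np.ndarray) -> np.ndarray:
    """Fuse two binary weed masks (1 = weed, 0 = other) by pixel-wise union.

    ht_mask:   inter-row weeds flagged by the Hough-transform row detection
    obia_mask: intra-row weeds flagged by the object-based RF classification
    """
    if ht_mask.shape != obia_mask.shape:
        raise ValueError("masks must share the same shape")
    # A pixel is weed if either detector flagged it.
    return np.logical_or(ht_mask.astype(bool), obia_mask.astype(bool)).astype(np.uint8)

# Toy 3x3 example: each detector finds weeds the other misses.
ht = np.array([[1, 0, 0], [0, 0, 0], [0, 0, 1]])
obia = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 1]])
fused = fuse_weed_masks(ht, obia)
```

A union keeps every detection from either source, which suits early-season spot spraying where missing a weed is costlier than a false positive.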
Keywords
UAVs, Inter- and intra-row weed detection, Feature fusion, OBIA, Random forests, Hyperparameter tuning, Feature evaluation

Citation

Please use this url to cite or link to this publication:

Chicago
Gao, Junfeng, Wenzhi Liao, David Nuyttens, Peter Lootens, Jürgen Vangeyte, Aleksandra Pizurica, Yong He, and Jan Pieters. 2018. “Fusion of Pixel and Object-based Features for Weed Mapping Using Unmanned Aerial Vehicle Imagery.” International Journal of Applied Earth Observation and Geoinformation 67: 43–53.
APA
Gao, J., Liao, W., Nuyttens, D., Lootens, P., Vangeyte, J., Pizurica, A., He, Y., et al. (2018). Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery. INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION, 67, 43–53.
Vancouver
1. Gao J, Liao W, Nuyttens D, Lootens P, Vangeyte J, Pizurica A, et al. Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery. INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION. Elsevier BV; 2018;67:43–53.
MLA
Gao, Junfeng, Wenzhi Liao, David Nuyttens, et al. “Fusion of Pixel and Object-based Features for Weed Mapping Using Unmanned Aerial Vehicle Imagery.” INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION 67 (2018): 43–53. Print.
@article{8544720,
  abstract     = {The developments in the use of unmanned aerial vehicles (UAVs) and advanced imaging sensors provide new opportunities for ultra-high resolution (e.g., less than a 10 cm ground sampling distance (GSD)) crop field monitoring and mapping in precision agriculture applications. In this study, we developed a strategy for inter- and intra-row weed detection in early season maize fields from aerial visual imagery. More specifically, the Hough transform algorithm (HT) was applied to the orthomosaicked images for inter-row weed detection. A semi-automatic Object-Based Image Analysis (OBIA) procedure was developed with Random Forests (RF) combined with feature selection techniques to classify soil, weeds and maize. Furthermore, the two binary weed masks generated from HT and OBIA were fused into an accurate binary weed image. The developed RF classifier was evaluated by 5-fold cross validation, and it obtained an overall accuracy of 0.945 and a Kappa value of 0.912. Finally, the relationship between detected weeds and their ground-truth densities was quantified by a fitted linear model with a coefficient of determination of 0.895 and a root mean square error of 0.026. In addition, the importance of the input features was evaluated, and it was found that the ratio of vegetation length to width was the most significant feature for the classification model. Overall, our approach can yield a satisfactory weed map, and we expect that an accurate and timely weed map from UAV imagery will be applicable to realizing site-specific weed management (SSWM) in early season crop fields, reducing the spraying of non-selective herbicides and the associated costs.},
  author       = {Gao, Junfeng and Liao, Wenzhi and Nuyttens, David and Lootens, Peter and Vangeyte, J{\"u}rgen and Pizurica, Aleksandra and He, Yong and Pieters, Jan},
  issn         = {0303-2434},
  journal      = {INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION},
  keyword      = {UAVs,Inter- and intra-row weed detection,Feature fusion,OBIA,Random forests,Hyperparameter tuning,Feature evaluation},
  language     = {eng},
  pages        = {43--53},
  publisher    = {Elsevier BV},
  title        = {Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery},
  url          = {http://dx.doi.org/10.1016/j.jag.2017.12.012},
  volume       = {67},
  year         = {2018},
}
