Loughborough University

Machine learning-based crop drought mapping system by UAV remote sensing RGB imagery

journal contribution
posted on 2019-08-15, 09:53 authored by Jinya Su, Matthew Coombes, Cunjia Liu, Yongchao Zhu, Xingyang Song, Shibo Fang, Lei Guo, Wen-Hua Chen
Water stress has adverse effects on crop growth and yield, so its monitoring plays a vital role in precision crop management. This paper makes an initial exploration of the potential of UAV aerial RGB imagery for crop water stress assessment by developing a simple but effective supervised learning system. Various techniques are seamlessly integrated into the system, including vegetation segmentation, feature engineering, Bayesian optimization and a Support Vector Machine (SVM) classifier. In particular, wheat pixels are first segmented from the soil background by classical vegetation index thresholding. Rather than performing pixel-wise classification, pixel squares of appropriate dimension are defined as samples, from which various features of the pure vegetation pixels are extracted, including spectral and colour index features. An SVM with Bayesian optimization is adopted as the classifier. To validate the developed system, a UAV survey is performed with a DJI S1000 to collect high-resolution top-of-canopy RGB imagery of the experimental wheat fields in Gucheng town, Hebei Province, China. Two levels of soil moisture were imposed on the wheat plots after seedling establishment by using intelligent irrigation and a rain shelter, and field measurements were taken to obtain the ground soil water ratio for each wheat plot. Comparative experiments with three-fold cross-validation demonstrate that pixel-wise classification, despite its high computational load, achieves an accuracy of only 82.8% with a poor F1 score of 71.7%, whereas the developed system achieves an accuracy of 89.9% with an F1 score of 87.7% using spectral intensities alone, and the accuracy is further improved to 92.8% with an F1 score of 91.5% by fusing both spectral intensities and colour index features. Future work will focus on incorporating more spectral information and advanced feature extraction algorithms to further improve the performance.
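As an illustration only, the Python sketch below shows one plausible way to wire together the main steps the abstract describes: Excess Green (ExG) vegetation masking, patch-level feature extraction from vegetation pixels, and an RBF-kernel SVM evaluated with three-fold cross-validation. The ExG threshold, patch size, feature set, class labels and SVM hyperparameters are assumptions for demonstration, not values reported in the paper, and the paper's Bayesian optimisation of the SVM is only noted in a comment.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def excess_green_mask(rgb, threshold=0.1):
    """Segment vegetation from soil with an Excess Green (ExG) index threshold.
    The 0.1 threshold is illustrative; the paper does not state its exact value here."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2) + 1e-9
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b
    return exg > threshold


def patch_features(rgb, mask, patch=32):
    """Split the image into patch x patch pixel squares (the 'samples') and extract
    simple spectral and colour-index features from vegetation pixels only.
    Patch size and feature definitions are assumptions, not the paper's exact choices."""
    h, w, _ = rgb.shape
    feats = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            m = mask[y:y + patch, x:x + patch]
            if m.mean() < 0.5:  # skip squares dominated by soil background
                continue
            px = rgb[y:y + patch, x:x + patch][m].astype(np.float64)
            r, g, b = px[:, 0], px[:, 1], px[:, 2]
            exg = 2.0 * g - r - b
            feats.append([r.mean(), g.mean(), b.mean(),
                          r.std(), g.std(), b.std(), exg.mean()])
    return np.asarray(feats)


def evaluate(X, y):
    """Three-fold cross-validated F1 score for an RBF-kernel SVM.
    Hyperparameters are fixed here for brevity; the paper tunes them with Bayesian
    optimisation (e.g. scikit-optimize's BayesSearchCV could replace the fixed values)."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    cv = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
    return cross_val_score(clf, X, y, cv=cv, scoring="f1").mean()
```

In this sketch, `y` would hold hypothetical binary labels (e.g. well-watered vs. water-stressed plots derived from the measured soil water ratio), and the feature matrix `X` would be built by running `patch_features` over each plot's orthoimage and stacking the results.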

Funding

Science and Technology Facilities Council (STFC) under the Newton Fund with grant number ST/N006852/1.

National Natural Science Foundation of China (NSFC) with grant number 61661136005.

History

School

  • Aeronautical, Automotive, Chemical and Materials Engineering

Department

  • Aeronautical and Automotive Engineering

Published in

Unmanned Systems

Volume

8

Issue

1

Pages

71-83

Publisher

World Scientific Publishing

Version

  • AM (Accepted Manuscript)

Rights holder

© World Scientific Publishing Company

Publisher statement

Electronic version of an article published as Unmanned Systems, 8(1), pp. 71-83, https://doi.org/10.1142/S2301385020500053 © World Scientific Publishing Company, https://www.worldscientific.com/worldscinet/us

Acceptance date

2019-07-27

Publication date

2019-09-19

Copyright date

2020

Notes

Part of the work in this paper was presented at the 37th Chinese Control Conference, Wuhan, China.

ISSN

2301-3850

eISSN

2301-3869

Language

  • en

Depositor

Dr Cunjia Liu
