TY - JOUR
T1 - Deep convolutional neural networks for estimating maize above-ground biomass using multi-source UAV images
T2 - a comparison with traditional machine learning algorithms
AU - Yu, Danyang
AU - Zha, Yuanyuan
AU - Sun, Zhigang
AU - Li, Jing
AU - Jin, Xiuliang
AU - Zhu, Wanxue
AU - Bian, Jiang
AU - Ma, Li
AU - Zeng, Yijian
AU - Su, Zhongbo
N1 - Funding Information:
This work is supported by National Key Research & Development Program of China (2021YFC3201204), Key Research and Development Program in Guangxi (AB19245039), Pudong New Area Science & Technology Development Fund (PKX2020-R07), Fundamental Research Funds for the Central Universities (2042021kf0200), and Open Research Fund of Guangxi Key Laboratory of Water Engineering Materials and Structures, Guangxi Institute of Water Resources Research (GXHRI-WEMS-2022-01).
Publisher Copyright:
© 2022, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2022/7/2
Y1 - 2022/7/2
N2 - Accurate estimation of above-ground biomass (AGB) plays a significant role in characterizing crop growth status. In precision agriculture, a widely used method for measuring AGB is to develop regression relationships between AGB and agronomic traits extracted from multi-source remotely sensed images acquired by unmanned aerial vehicle (UAV) systems. However, such an approach requires expert knowledge and causes information loss from the raw images. The objectives of this study are to (i) determine how multi-source images contribute to AGB estimation in single and whole growth stages; and (ii) evaluate the robustness and adaptability of deep convolutional neural networks (DCNN) and other machine learning algorithms for AGB estimation. To establish multi-source image datasets, this study collected UAV red-green-blue (RGB) and multispectral (MS) images and constructed raster data for crop surface models (CSMs). Agronomic features were derived from the above-mentioned images and interpreted by multiple linear regression, random forest, and support vector machine models. Then, a DCNN model was developed via an image-fusion architecture. Results show that the DCNN model provides the best estimation of maize AGB when a single type of image is considered, while the performance of the DCNN degrades when sufficient agronomic features are used. Moreover, the information content of the three image datasets changes across growth stages. The structural information derived from CSM images is more valuable than the spectral information derived from RGB and MS images in the vegetative stage, but less useful in the reproductive stage. Finally, a data fusion strategy was proposed according to the onboard sensors (or cost).
AB - Accurate estimation of above-ground biomass (AGB) plays a significant role in characterizing crop growth status. In precision agriculture, a widely used method for measuring AGB is to develop regression relationships between AGB and agronomic traits extracted from multi-source remotely sensed images acquired by unmanned aerial vehicle (UAV) systems. However, such an approach requires expert knowledge and causes information loss from the raw images. The objectives of this study are to (i) determine how multi-source images contribute to AGB estimation in single and whole growth stages; and (ii) evaluate the robustness and adaptability of deep convolutional neural networks (DCNN) and other machine learning algorithms for AGB estimation. To establish multi-source image datasets, this study collected UAV red-green-blue (RGB) and multispectral (MS) images and constructed raster data for crop surface models (CSMs). Agronomic features were derived from the above-mentioned images and interpreted by multiple linear regression, random forest, and support vector machine models. Then, a DCNN model was developed via an image-fusion architecture. Results show that the DCNN model provides the best estimation of maize AGB when a single type of image is considered, while the performance of the DCNN degrades when sufficient agronomic features are used. Moreover, the information content of the three image datasets changes across growth stages. The structural information derived from CSM images is more valuable than the spectral information derived from RGB and MS images in the vegetative stage, but less useful in the reproductive stage. Finally, a data fusion strategy was proposed according to the onboard sensors (or cost).
KW - Above-ground biomass
KW - DCNN
KW - Machine learning
KW - Multi-source data
KW - Unmanned aerial vehicle
KW - 22/4 OA procedure
KW - ITC-ISI-JOURNAL-ARTICLE
U2 - 10.1007/s11119-022-09932-0
DO - 10.1007/s11119-022-09932-0
M3 - Article
AN - SCOPUS:85133297381
SN - 1385-2256
VL - 24
SP - 92
EP - 113
JO - Precision Agriculture
JF - Precision Agriculture
ER -