
Abstract

Unmanned aerial vehicle (UAV)-based remote sensing is gaining momentum in a variety of agricultural and environmental applications. Very-high-resolution remote sensing image sets collected repeatedly throughout a crop growing season are becoming increasingly common. Analytical methods able to learn from both the spatial and temporal dimensions of the data may allow for improved estimation of crop traits, as well as of the effects of genetics and the environment on these traits. Multispectral and geometric time series imagery was collected by UAV on 11 dates, along with ground-truth data, in a field trial of 866 genetically diverse biomass sorghum accessions. We compared the performance of Convolutional Neural Network (CNN) architectures that used image data from single dates (two spatial dimensions, 2D) versus multiple dates (two spatial dimensions + temporal dimension, 3D) to detect lodging and estimate its severity. Lodging was detected by 3D-CNN analysis of time series imagery with 0.88 accuracy, 0.92 precision, and 0.83 recall. This outperformed the best 2D-CNN on a single date, which achieved 0.85 accuracy, 0.84 precision, and 0.76 recall. Variation in lodging severity was estimated by the best 3D-CNN analysis with 9.4% mean absolute error (MAE), 11.9% root mean square error (RMSE), and a goodness-of-fit (R²) of 0.76. This was a significant improvement over the best 2D-CNN analysis, which achieved 11.84% MAE, 14.91% RMSE, and an R² of 0.63. The success of the improved 3D-CNN approach depended on the inclusion of "before and after" data, i.e., images collected on dates before and after the lodging event. The integration of geometric and spectral features within the 3D-CNN architecture was also key to the improved assessment of lodging severity, which is an important and difficult-to-assess phenomenon in bioenergy feedstocks such as biomass sorghum. This demonstrates that spatio-temporal CNN architectures based on UAV time series imagery have significant potential to enhance plant phenotyping capabilities in crop breeding and precision agriculture applications.
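
This record does not include the authors' model code or architecture details, so the following is only a minimal, hypothetical sketch (assuming PyTorch) of the core idea stated in the abstract: a 2D-CNN sees a single-date image stack of shape (bands, height, width), while a 3D-CNN sees the full time series of shape (bands, dates, height, width) and can therefore convolve across the "before and after" dates. The band count, patch size, layer widths, and class names below are illustrative assumptions, not values reported by the paper.

# Minimal sketch (not the authors' implementation) contrasting a single-date
# 2D-CNN with a multi-date 3D-CNN for plot-level lodging classification.
# Band count, patch size, and layer widths are illustrative assumptions.
import torch
import torch.nn as nn

N_BANDS = 5      # assumed: multispectral bands plus a geometric (height) layer
N_DATES = 11     # number of UAV flight dates reported in the abstract
PATCH = 64       # assumed plot-patch size in pixels

class Lodging2DCNN(nn.Module):
    """Single-date model: input shape (batch, bands, H, W)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(N_BANDS, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # logit: lodged vs. not lodged

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

class Lodging3DCNN(nn.Module):
    """Time-series model: input shape (batch, bands, dates, H, W)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(N_BANDS, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d((1, 2, 2)),   # pool spatially, keep the date dimension
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

# Shape check with random tensors standing in for UAV image patches.
x2d = torch.randn(4, N_BANDS, PATCH, PATCH)
x3d = torch.randn(4, N_BANDS, N_DATES, PATCH, PATCH)
print(Lodging2DCNN()(x2d).shape)  # torch.Size([4, 1])
print(Lodging3DCNN()(x3d).shape)  # torch.Size([4, 1])

In this sketch the temporal dimension is preserved through pooling so the 3D kernels can exploit date-to-date change; replacing the classification head with a single regression output would analogously cover the severity-estimation task described in the abstract.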

Details

Title
Implementing Spatio-Temporal 3D-Convolution Neural Networks and UAV Time Series Imagery to Better Predict Lodging Damage in Sorghum
Author
Varela, Sebastian 1; Pederson, Taylor L. 2; Leakey, Andrew D. B. 3

1 Center for Advanced Bioenergy and Bioproducts Innovation, Urbana, IL 61801, USA; [email protected] (S.V.); [email protected] (T.L.P.)
2 Center for Advanced Bioenergy and Bioproducts Innovation, Urbana, IL 61801, USA; [email protected] (S.V.); [email protected] (T.L.P.); Institute for Genomic Biology, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
3 Center for Advanced Bioenergy and Bioproducts Innovation, Urbana, IL 61801, USA; [email protected] (S.V.); [email protected] (T.L.P.); Institute for Genomic Biology, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA; Department of Plant Biology, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA; Center for Digital Agriculture, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA; Department of Crop Sciences, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA
First page
733
Publication year
2022
Publication date
2022
Publisher
MDPI AG
e-ISSN
2072-4292
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2627828247
Copyright
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.