1. Introduction
According to the report “The State of Food Security and Nutrition in the World 2024” [1], the global incidence of food insecurity rose sharply after the outbreak of COVID-19 in 2019 and has remained at almost the same level for three years. As of 2023, about 713 to 757 million people worldwide were suffering from hunger, a serious deviation from the goal of eradicating hunger, food insecurity, and all forms of malnutrition by 2030. Changes in grain crop yields therefore have a significant impact on the stability of global society and on people's basic needs. Estimating food crop yields provides key information for developing a more rational implementation plan toward the 2030 target.
However, to support crop variety selection and optimized field management in precision agriculture, yield estimation must be accurate, efficient, and timely. Grain crop yield estimation is an extremely complex task, as yield is strongly influenced by crop variety, growing environment (soil, climate, etc.), irrigation, fertilizer application, pests, and diseases [5]. To address this, researchers have developed grain crop growth models such as WOFOST [2], DSSAT [3], and APSIM [4]. To obtain good yield estimates, however, these models require large amounts of crop growth parameters, soil data, and meteorological data, which undoubtedly increases the cost, complexity, and uncertainty of yield estimation.
In recent years, remote sensing techniques have been developed and widely used for agricultural monitoring [6]. For example, Zhai et al. [7] and Sadeh et al. [8] used remotely sensed data to monitor the soil and plant analyzer development (SPAD) value and leaf area index (LAI) of crops, while Anitha [9] and Park [10] used it to monitor crop diseases and pests. Remote sensing has also been widely used for estimating grain yield. Several studies have estimated crop yields from satellite remote sensing data; e.g., Gómez et al. [11] combined satellite remote sensing with climate data to achieve regional wheat yield estimation, and Zhuo et al. [12] and Xie et al. [13] improved the accuracy of winter wheat yield estimation by inverting soil moisture, LAI, and other related parameters from satellite data and assimilating them into crop growth models. Unmanned aerial vehicles (UAVs) can perform low-altitude remote sensing, offering high flexibility, short operational cycles, high spatial and temporal resolution, and low cost [14,15]. By carrying different sensors, they can acquire data with different spatial and temporal resolutions, such as RGB, multispectral, and hyperspectral images, to meet varied research needs. They are therefore widely used in areas such as power infrastructure monitoring [16,17], geological hazards [18,19], atmospheric exploration [20], and precision agricultural management [21,22,23].
Machine learning (ML), an important branch of artificial intelligence, was introduced by Arthur L. Samuel in 1959 [24]. The concept later came to denote understanding and analyzing new knowledge by relying on historical experience combined with probabilistic knowledge [25]. ML is a common and effective classification and prediction approach that can handle not only simple linear problems but also nonlinear ones. However, traditional ML is limited in its ability to capture highly complex nonlinear patterns in data, whereas deep learning (DL) [26,27] can extract multiscale, multilevel features and abstract them into high-level representations [28,29]. Combined with its powerful self-learning capability and portability, DL has been applied to object detection [30,31], weather prediction [32,33], and natural language processing [34]. This provides new methods and research ideas for predicting grain crop yields.
In this study, a systematic review of the literature was conducted with the aim of summarizing the current application status of ML based on UAV remote sensing data in estimating the yield of grain crops (including cereal crops, potato crops, and legume crops), analyzing the shortcomings of current research, and providing ideas for future research. All studies reviewed were obtained from electronic databases and organized to answer the research questions. See Section 2.3 for more details.
The rest of this paper is structured as follows. Section 2 elaborates on the process of the systematic literature review and proposes research questions. In Section 3, an overall analysis is conducted for the systematic review. Section 4 provides a summary and brief analysis based on the research question. Finally, Section 5 discusses the challenges and opportunities of ML based on UAV remote sensing data in grain crop yield estimation applications. Section 6 summarizes the full paper.
2. Research Methods
2.1. Review Methodology
First, a review protocol must be established. This review follows the well-known review guidelines provided by Kitchenham [35]. The research questions were then identified, and relevant studies were retrieved from Scopus and Web of Science. Next, the selected articles were screened against the exclusion criteria. Finally, the data extracted from the articles were used to answer the research questions.
2.2. Research Questions
The purpose of this study is to comprehensively summarize ML research on grain crop yield prediction using UAV remote sensing data. From an extensive reading of the yield-estimation literature, we summarize the yield estimation process as follows: determining the research object and content, setting the experimental plan, data collection and preprocessing, feature selection (feature dimensionality reduction), model establishment, and validation and evaluation; see Figure 1.
According to the process in Figure 1, we developed seven research questions to guide the systematic review and better achieve the research objectives. These questions will provide a clearer understanding of the shortcomings and challenges of the current research and help researchers to better perform their related work. The seven research questions are as follows.
- RQ1. Which grain crops are more popular for yield estimation using ML based on UAV remote sensing data?
- RQ2. Which research questions have received widespread attention in yield estimation research?
- RQ3. How can a dataset for yield estimation be obtained?
- RQ4. What are the data features used to estimate grain crop yields?
- RQ5. Which ML algorithms are better for grain crop yield estimation?
- RQ6. When is the best time for yield estimation for different crops?
- RQ7. What are the issues faced in the field of yield estimation?
2.3. Search Strategy
We answer these research questions by conducting a systematic review of the literature and identifying more precise keywords to complete the literature search. A search using the single keywords “UAV” and “machine learning” returned numerous published studies, most of which were not relevant to our research area. Therefore, to improve the effectiveness of the search and to reduce the risk of missing relevant studies, we defined more complex search strings, as shown in Table 1.
2.4. Exclusion Criteria
Relevant studies were initially screened using the search string. To further optimize the search results and eliminate results that were not relevant to the research question, we defined four exclusion criteria and clarified the boundaries of the review.
- Exclusion criterion 1: Articles written in a language other than English;
- Exclusion criterion 2: Reviews, conference articles, books, book chapters, and data papers;
- Exclusion criterion 3: Articles where the full text is not available;
- Exclusion criterion 4: Articles duplicated across search databases.
2.5. Data Extraction
To accurately answer the research questions, we created a data extraction table that includes basic information about each article (source and year of publication) and detailed information (grain crops studied, problems solved, methods used, features, assessment and evaluation metrics, and optimal model). These data were obtained through careful reading and review of the selected papers. Part of the content can be found in Appendix A.
3. Analysis of Selected Publications
3.1. Study Selection
According to our search strategy, a total of 625 articles were obtained: 574 from the Web of Science database and 51 from Scopus. After applying the exclusion criteria, 460 articles remained (165 were excluded). We downloaded these articles and analyzed their abstracts and titles to understand their research content. Finally, 74 articles were identified (386 were excluded). Figure 2 illustrates the article screening process for the selected databases according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). It is worth explaining why these 386 articles contain all the search strings yet still did not meet our needs; there are two main reasons:
(1). Their research task is not yield estimation but the monitoring of crop growth or crop pests and diseases. In references [36,37,38], for example, the abstract contains all the search strings, but the actual task is LAI monitoring; the strings appear because these studies mention the importance of LAI for yield estimation and use UAV remote sensing data and ML methods.
(2). Their research object is not a grain crop, as in references [39,40].
3.2. Overview of Reviewed Publications
We summarized and analyzed the 74 articles retrieved from the two electronic databases under the exclusion criteria of this review. Figure 3 shows that research on estimating grain crop yield based on UAV remote sensing data and ML began in 2016. In that first study [41], the authors collected multispectral images of soybean via a UAV platform and modeled yield estimation using the random forest (RF) algorithm, demonstrating the feasibility of the approach; the study also addressed soybean maturity classification. No relevant reports appeared in 2017 or 2018; from 2020 onward, the number of reports increased significantly, reaching 24 articles by 2023. This may be due to the rapid development of DL, which is favored by many researchers and has gradually been applied to agriculture, providing new ideas and methods for yield estimation research. Of course, this result is based on only two databases and specific exclusion criteria; relevant studies from 2017 and 2018 might be found in other databases, but we believe these data still represent the development trend.
In addition, we counted the nationality information of the authors who published the relevant research articles (derived from the nationality of the first author). As shown in Figure 4, China and the United States are far ahead in the number of reports on crop yield estimation based on UAV remote sensing and ML. Further examination of their main research crops shows that wheat yield estimation received more attention in China, while corn did in the United States. The 74 articles were published in 31 different journals, mainly Remote Sensing, Computers and Electronics in Agriculture, Frontiers in Plant Science, Precision Agriculture, Agronomy, Drones, and the International Journal of Applied Earth Observation and Geoinformation, with 17, 9, 5, 5, 4, 3, and 3 articles, respectively. For more details, please refer to Table 2.
4. Results and Analysis
This section returns to the proposed research questions using the detailed information we extracted. We visualized and analyzed the extracted data of the systematic review, as shown in Figure 5; most research questions can be answered from this figure. In the figure, IFM represents information from multispectral images, IFR information from RGB images, IFH information from hyperspectral images, and IFT information from thermal images (this information includes vegetation indices extracted from the images, reflectance bands, calculated plant height, and texture features). Unimodal refers to data sourced from a single channel, such as using only IFM, while multimodal refers to data sourced from multiple channels, such as using both IFM and IFR. R-image denotes RGB image, M-image multispectral image, and T-image thermal image. RF stands for random forest, SVM for support vector machine, KNN for k-nearest neighbor, MLP for multilayer perceptron, GPR for Gaussian process regression, PLSR for partial least squares regression, RR for ridge regression, EL for ensemble learning, GBR for gradient boosting regression, DNN for deep neural network, ANN for artificial neural network, CRT for classification and regression tree, LR for linear regression, CNN for convolutional neural network, LSTM for long short-term memory, GML for generalized machine learning, and DL for deep learning.
4.1. RQ1—Which Grain Crops Are More Popular for Yield Estimation Using ML Based on UAV Remote Sensing Data?
By extracting and summarizing the data, we found that UAVs and ML have been used in yield estimation studies for a variety of grain crops, as shown in Figure 6. Since the crops studied in reference [42] include wheat, barley, and oats, there were 76 crop counts, even though we retrieved 74 articles. The chart clearly shows that the studies focused mainly on cereal crops (75%), followed by legume crops (23%), while very few studies addressed potato crops. Among the cereal crops, most studies concerned the three major cereals of wheat, corn, and rice, accounting for 96%; soybean is the main legume crop studied, with 72%. Potato was the subject of only one study. This distribution closely reflects the importance of the crops: the more important a crop and the larger its planted area, the more attention researchers pay to it, and the more experimental designs and agricultural management practices are available. It is also related to the sensing constraints of UAVs. Because potato tubers grow underground, UAV-mounted sensors can only capture canopy characteristics to estimate yield indirectly and cannot directly observe the tubers themselves (e.g., number, color), which limits the ability to estimate potato yield through DL; by contrast, for a crop such as wheat, ear information can be obtained directly from image data. For example, Peng et al. [43] used a DL algorithm (Mask R-CNN) to extract three ear phenotypic features of wheat at the ripening stage (ear count, ear size, and ear anomaly index) and then used RF with these features to model yield estimation.
4.2. RQ2—Which Research Questions Have Received Widespread Attention in Yield Estimation Research?
Why discuss this question here? After reading the 74 retrieved articles, we found that some studies do not focus specifically on yield estimation. Their research also involves estimating aboveground biomass [44], maturity [41], plant height [45], and many other crop growth factors [46,47,48]. Alternatively, experiments are designed under different nitrogen fertilizer treatments, irrigation treatments, genomes, and multiple varieties to analyze the impact of these factors on yield, in order to serve crop breeding and precise field management. For example, in reference [49], the researchers designed six nitrogen levels with the aim of determining the optimal nitrogen application rate for maize, and the estimated yield supports the diagnosis of nitrogen status. Similarly, reference [50] mainly examined how soil properties exacerbate water stress and how water stress affects crop productivity; yield estimation there served only to validate the study rather than being its core contribution. In such cases, most articles do not conduct innovative research on algorithms or feature selection.
Undoubtedly, more research is focused on improving the performance of yield estimation itself, mainly along three directions. The first is feature selection or feature dimensionality reduction. In references [51,52,53], researchers used principal component analysis (PCA), recursive feature elimination (RFE), and related feature engineering methods to achieve dimensionality reduction, maximize the contribution of advantageous features, and improve the accuracy of yield estimation. References [54,55] computed correlation coefficients between features and yield, ranked feature importance, and selected a certain number of features to improve estimation accuracy. The second direction is multi-feature fusion, applying multimodal data to enhance prediction capability. References [56,57,58] fused multi-sensor data, including multispectral, RGB, and thermal infrared, and showed that multi-feature fusion can indeed improve yield prediction results. The third direction is optimizing the target algorithm, mainly in DL applications, such as increasing input dimensions [42,59] and adding attention mechanism modules [60,61]. In GML, a small number of studies also seek the best hyperparameters through grid search to improve algorithm performance.
In summary, the research issues currently receiving widespread attention include:
(1). The impact of nitrogen fertilizer, irrigation, variety, genes, and other factors on yield;
(2). Feature selection, including optimal feature screening and multi-feature fusion;
(3). Algorithm improvement, mainly focusing on network structure.
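The first and third of these directions lend themselves to a compact illustration. The sketch below (synthetic data and placeholder settings, not code from any reviewed study) chains recursive feature elimination with a small hyperparameter grid search for a random forest yield regressor using scikit-learn:

```python
# Illustrative sketch: RFE to keep the most informative features (e.g.,
# vegetation indices), then a grid search over RF hyperparameters.
# All data and parameter choices here are synthetic placeholders.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import RFE
from sklearn.model_selection import GridSearchCV

# ~200 sample plots with 12 candidate features, as in many field studies.
X, y = make_regression(n_samples=200, n_features=12, noise=0.1, random_state=0)

# Step 1: reduce the 12 candidate features to the 5 most important ones.
selector = RFE(RandomForestRegressor(n_estimators=50, random_state=0),
               n_features_to_select=5).fit(X, y)
X_sel = selector.transform(X)

# Step 2: grid-search RF hyperparameters on the reduced feature set.
grid = GridSearchCV(RandomForestRegressor(random_state=0),
                    {"n_estimators": [100, 200], "max_depth": [None, 10]},
                    cv=3, scoring="r2").fit(X_sel, y)
print(grid.best_params_)
```

The same two-stage pattern (screen features, then tune the estimator) applies regardless of which feature ranker or learner a given study used.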
4.3. RQ3—How Can a Dataset for Yield Estimation Be Obtained?
To ensure high data diversity, researchers designed field experiments around multiple factors such as different varieties [62,63,64], irrigation levels [65,66], and nitrogen fertilizer levels [67,68,69]. Some experiments [70,71] combined two of these factors (such as variety and nitrogen fertilizer level), and each experimental design was repeated at least twice. Figure 7 illustrates a classic data acquisition and preprocessing workflow leading to the final experimental dataset. In this section, we introduce data collection and dataset construction in detail. Summarizing the 74 retrieved articles, we found that the commonly used datasets consist of three types of data: UAV remote sensing data, ground data, and environmental data.
4.3.1. UAV Remote Sensing Data
UAV remote sensing data are obtained through UAV platforms equipped with different types of sensors, such as RGB cameras, multispectral cameras, hyperspectral cameras, LiDAR, and thermal infrared cameras. Data from these sensors are collected over the research area and then processed according to the research technology route to produce the dataset used in the study. These data usually take one of two forms: the first is vegetation index features, texture features, and spectral bands extracted from the images, and the second is the image itself (usually an RGB image), as can also be seen in Figure 5. The corresponding data processing flows for the two forms are shown in Figure 7 and Figure 8, respectively. The processed data serve as components of the dataset.
Figure 7 shows the acquisition and processing flow for the case where further information must be derived from the image data. Here, professional software is mainly used to mosaic the images and extract features from them; commonly used packages are Pix4D 4.5.6 (Switzerland), ENVI 5.3 (United States), and ArcGIS 10.2 (United States), although software with similar functions has also been used. In Figure 8, data augmentation is performed using methods such as translation, flipping, noise, and brightness transformation to establish a complete dataset. This is because overfitting is prone to occur during model training, and data augmentation is an effective way to avoid it [72]. When image data are used directly, data augmentation is therefore a key processing step that enables the model to learn features from different dimensions of the image and improves model performance. It is worth noting that, to obtain high-quality remote sensing data, UAV flights should be conducted between 10 a.m. and 2 p.m. local time under clear, cloudless, sunny conditions. When multispectral or hyperspectral sensors are used, a diffuse reflectance calibration panel must be imaged before flight to accurately retrieve the reflectance of the different bands.
Figure 8. The data processing flow when the image itself is used directly as the dataset [73].
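As a concrete illustration of the augmentation operations named above (translation, flipping, noise, and brightness transformation), a minimal NumPy sketch on a toy image array might look as follows; real pipelines typically rely on dedicated augmentation libraries, and the array shapes here are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3)).astype(np.float32)  # toy RGB image

def translate(im, dx, dy):
    # Shift the image by (dx, dy) pixels with wrap-around at the borders.
    return np.roll(np.roll(im, dy, axis=0), dx, axis=1)

def hflip(im):
    # Mirror the image left-right.
    return im[:, ::-1, :]

def add_noise(im, sigma=5.0):
    # Add Gaussian pixel noise, clipped back to the valid intensity range.
    return np.clip(im + rng.normal(0, sigma, im.shape), 0, 255)

def brightness(im, factor=1.2):
    # Scale intensities to brighten (factor > 1) or darken (factor < 1).
    return np.clip(im * factor, 0, 255)

augmented = [translate(img, 4, 4), hflip(img), add_noise(img), brightness(img)]
```

Each transform preserves the image shape, so the augmented copies can be added to the training set alongside the originals under the same yield label.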
4.3.2. Ground Data
Ground data include measured yield data and crop growth data (such as plant height, SPAD, leaf nitrogen content, and LAI). Yield data are obtained by selecting a certain number of standard sample points within the study area, harvesting the crops at these points after maturity (Figure 9a,b), and then drying, threshing, and weighing the grain (Figure 9c–f). Crop growth data are usually collected and recorded manually using professional instruments, for example, recording crop height with a tower ruler, measuring LAI with a leaf area meter, and obtaining SPAD values with a chlorophyll meter. To ensure data validity, plants at fixed positions are generally selected for these measurements.
4.3.3. Environmental Data
Environmental data mainly include meteorological data [75,76] and soil data [77]. Meteorological data generally include precipitation, temperature, solar radiation, etc., obtained from government meteorological departments. Soil data include pH, organic matter, moisture, etc.
4.3.4. Construction of Dataset
After obtaining the necessary experimental data, a dataset must be constructed. Measured yield data are essential, as they are the key to establishing the model. In addition, at least one type of data (vegetation index features, texture features, RGB images, or environmental data) should be included as feature input to the model.
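The assembly step described above amounts to joining measured plot yields with per-plot feature inputs. A hypothetical pandas sketch (column names and values invented for illustration):

```python
import pandas as pd

# Ground-truth yields from harvested sample plots (placeholder values).
yields = pd.DataFrame({"plot_id": [1, 2, 3],
                       "yield_t_ha": [6.1, 5.4, 7.0]})

# UAV-derived features for the same plots (placeholder index values).
features = pd.DataFrame({"plot_id": [1, 2, 3],
                         "NDVI": [0.71, 0.64, 0.79],
                         "GNDVI": [0.58, 0.52, 0.66]})

# One row per sample plot: features as model input, yield as the label.
dataset = features.merge(yields, on="plot_id")
X = dataset[["NDVI", "GNDVI"]].to_numpy()
y = dataset["yield_t_ha"].to_numpy()
```

Environmental or texture columns would be merged in the same way, keyed on the plot identifier.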
4.4. RQ4—What Are the Data Features Used to Estimate Grain Crop Yields?
Figure 5 clearly shows that the features used in related research fall into two types: unimodal and multimodal. In our systematic review of 74 articles, 39 were based on unimodal features and 35 on multimodal features, with no significant difference between the two. Among unimodal features, IFM is the most commonly used, followed by IFR, IFH, and, finally, R-image. Among multimodal features, researchers chose a variety of feature combinations that are relatively scattered, but the combination of IFR and IFM is the most common, and IFM also appears in most of the other combinations. Thus, whether unimodal or multimodal, IFM is an important data feature. Table 3 presents detailed information on the IFM features frequently used in the reviewed studies.
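For reference, the multispectral indices (IFM) encountered throughout the reviewed studies are simple normalized band ratios. Their standard definitions, computed from per-pixel reflectance (the example reflectance values below are illustrative, not from any reviewed study):

```python
def ndvi(nir, red):        # Normalized Difference Vegetation Index
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):   # Normalized Difference Red Edge index
    return (nir - red_edge) / (nir + red_edge)

def gndvi(nir, green):     # Green NDVI
    return (nir - green) / (nir + green)

# Example reflectances for a healthy canopy pixel (illustrative values):
nir, red, red_edge, green = 0.45, 0.05, 0.20, 0.08
print(ndvi(nir, red))  # high NDVI indicates dense green vegetation
```

Applied over a calibrated multispectral orthomosaic, these functions yield per-pixel index maps that are then averaged per plot to form the IFM features.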
4.5. RQ5—Which ML Algorithms Are Better for Grain Crop Yield Estimation?
Table 4 shows the top five algorithms. Note that the counts represent how often an algorithm achieved the best yield estimation accuracy in a study, not how often it was applied; these are two different measures. For example, if linear regression is used as a baseline but does not produce the best estimate, it is counted among the applied algorithms but not among the best-performing ones. Table 4 shows that RF is the most common best-performing algorithm and is well suited to yield estimation problems, followed by CNN (including 2D-CNN, 3D-CNN, and CNN–LSTM), SVM, DNN, and EL. That is, among GML algorithms RF is the optimal choice, while among DL algorithms CNN is. As an example, the study shown in Figure 10 integrates a deep multiple instance learning framework and a multi-head attention mechanism to achieve yield estimation with cross-channel information fusion. Specifically, four types of data, including multispectral images, thermal images, digital elevation models, and genetic information (single nucleotide polymorphisms), were input through multiple channels and converted into fixed-dimensional (256-dimensional) vectors. These vector representations were then combined into a single 256-dimensional vector using attention mechanisms and fed into a fully connected network for yield estimation. The code used in that research is publicly available on GitHub.
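The fusion idea in Figure 10 can be loosely sketched in NumPy (dimensions shrunk for brevity; all weights below are random placeholders, not the learned parameters of the cited work): each modality is projected into a shared embedding space, attention scores are softmax-normalized, and a weighted sum produces the fused vector for the regression head.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # stand-in for the 256-dimensional embedding in the cited study

# Raw per-modality inputs of different sizes (placeholder values).
modalities = {"multispectral": rng.normal(size=5),
              "thermal": rng.normal(size=3),
              "dem": rng.normal(size=4),
              "snp": rng.normal(size=6)}

# Project each modality into the shared d-dimensional space.
embeddings = np.stack([rng.normal(size=(d, v.size)) @ v
                       for v in modalities.values()])    # shape (4, d)

# Attention pooling: score each modality, softmax, then weighted sum.
w = rng.normal(size=d)                   # attention query (placeholder)
scores = embeddings @ w
alpha = np.exp(scores) / np.exp(scores).sum()
fused = alpha @ embeddings               # single d-dim fused vector
```

In the actual model, the projections and query are trained end-to-end and multiple attention heads run in parallel, but the pooling arithmetic is the same.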
In Section 4.4, we sorted out the most important features currently applied; building on this, we believe that the optimal algorithm is also closely related to feature selection, as shown in Figure 5. Most of the features flowing to RF come from IFM and “IFR + IFM”, indicating that almost all features used in RF modeling are extracted from RGB and multispectral images, such as vegetation indices or texture features. The similar GML algorithms SVM and EL show the same trend, with features almost entirely derived from information extracted from RGB, multispectral, and hyperspectral images. By contrast, most of the features flowing to CNN are RGB images, which use the image itself directly as input and leverage CNN’s strength in computer vision. DNN, although a DL algorithm, has feature sources similar to those of the GML algorithms.
4.6. RQ6—When Is the Best Time for Yield Estimation for Different Crops?
Of the 74 studies reviewed, some estimated yields using data from a single growth stage without comparing other stages, whereas others used data from multiple stages, making it difficult to determine which growth stage is most appropriate for yield estimation. Based on the valid information retrieved from the 74 articles, we offer a preliminary result: for wheat, the flowering and grain filling stages are the optimal periods for yield estimation [47,52,57,60,79], while for rice, the filling and maturity stages are optimal [51,73,80]. Other crops are not explicitly addressed. This result can serve as a reference, but we are inclined to study this question in more depth in the future, especially with respect to different data features, since the optimal estimation stage may vary with them.
4.7. RQ7—What Are the Issues Faced in the Field of Yield Estimation?
The reviewed studies mentioned different challenges that impacted the results. The main issues include: (1) Most current models are based on one year of crop growth data (i.e., datasets are insufficient) and the applicability of the model to the following year’s growth conditions is not guaranteed; (2) many factors affect yield prediction, and yield estimation results obtained using only vegetation indices, RGB images, morphological parameters, and texture features are mediocre; feature screening and fusion studies remain lacking; (3) most yield estimation studies used specific experimental designs, such as different irrigation levels and different nitrogen fertilizer levels, rather than the natural conditions found in the field; (4) different yield estimation models are currently used for different crop types, different varieties of the same crop, and different field environments, and uniform or universally applicable models are lacking; (5) only mature ML algorithms are currently being applied to establish yield estimation models, while optimization of the algorithmic parameters, algorithm fusion, and other algorithmic innovations remain lacking.
5. Discussion
In this study, an initial search based on broad keywords produced 625 published articles; after applying the exclusion criteria and removing duplicates, 74 articles remained. This search process is replicable; even if replicated results differ slightly, the overall search results and direction do not change. Differences may arise from personal judgements or from papers published after our search was conducted. Different or new studies might have been found with other similar search terms; nonetheless, the terms used in this study were sufficient to yield numerous relevant studies.
5.1. Current Challenges
5.1.1. Data Quantity for Yield Estimation
It is well known that the amount of data is crucial for ML modeling, and especially for DL: the more data involved in training, the better the model performs and the stronger its robustness and generalization, whereas models trained on little data perform worse [25,81]. In field experiments for estimating grain crop yields, most studies use around 200 sample points selected in the experimental area. In most areas, the same grain crop completes only one full growth cycle per year, so data can be collected only once a year. Under this low collection frequency, ML models must often be built from roughly 200 samples per year, a scale that is clearly limited.
For image data, augmentation can be achieved through methods such as rotation, scaling, and color transformation, whose feasibility and effectiveness have been verified and widely applied in computer vision. In recent years, few-shot learning (FSL) has become popular for problems such as sample imbalance and insufficient image feature labeling, and has been applied to fault-type diagnosis and UAV image classification [82,83]. It has also been explored in agriculture, for example, to reduce the large training datasets that traditional DL requires for avocado maturity estimation [84], and to improve recognition accuracy for minority classes in chrysanthemum classification under sample imbalance [85]. For yield estimation, the long time span of data collection across the crop growth cycle, together with the specific experiments needed to capture special scenarios such as water stress and nitrogen stress, limits dataset size and affects data balance. FSL could enable accurate yield estimation under such special or rare scenarios by using image data (including historical data) over the full life span of the grain crop, which is significant for yield estimation under circumstances such as natural disasters.
For numerical data such as vegetation indices and texture features, far fewer augmentation options exist than for image data. Most studies expand the dataset with two or more consecutive years of experimental data to further improve modeling performance; this approach clearly requires substantial time, manpower, and material costs. However, data processing techniques for extending such data have been used in other research areas, such as clinical data classification [86], earthquake risk assessment [87], and air quality forecasting [88], providing new ideas for establishing grain yield estimation datasets.
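One simple technique from those neighboring areas, offered purely as an illustration (not a method from the reviewed yield studies), is to jitter tabular features with small Gaussian noise while keeping each sample's yield label fixed:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))   # ~200 plots x 10 index features (synthetic)
y = rng.normal(size=200)         # synthetic yield labels

def jitter_augment(X, y, copies=2, scale=0.05, rng=rng):
    # Noise scale is tied to each feature's own standard deviation so that
    # the perturbation stays small relative to natural variation.
    sigma = X.std(axis=0) * scale
    X_aug = [X] + [X + rng.normal(0, sigma, X.shape) for _ in range(copies)]
    y_aug = [y] * (copies + 1)
    return np.vstack(X_aug), np.concatenate(y_aug)

X_big, y_big = jitter_augment(X, y)  # 600 samples from the original 200
```

Whether such synthetic samples actually improve yield models would need to be validated against held-out field data, since the noise model is an assumption.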
5.1.2. Features for Yield Estimation
According to the results we retrieved, some studies have investigated feature importance. In reference [69], the researchers ranked the importance of all features and showed that NDVI, NDRE, and GNDVI ranked highest for maize yield estimation, making them the optimal feature combination; this method can effectively reduce the number of features. In reference [53], the researchers used three methods, RFE, Boruta, and the Pearson correlation coefficient, to select features among hyperspectral indices. The importance rankings produced by the three methods were not identical, and yield estimation models were built on the feature combination obtained from each. The final results show that the feature combination selected by the Boruta method was the most effective for wheat yield estimation. These studies make the role of features in model performance evident.
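Of the three selection approaches mentioned, the Pearson correlation coefficient is the simplest to sketch; the following minimal example ranks hypothetical features by their absolute correlation with yield (the feature names and simulated data are illustrative only):

```python
import numpy as np

def rank_features_pearson(X: np.ndarray, y: np.ndarray, names: list) -> list:
    """Rank features by the absolute Pearson correlation between each
    feature column and yield; higher |r| means more informative."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    r = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum())
    )
    order = np.argsort(-np.abs(r))
    return [(names[k], float(r[k])) for k in order]

# Toy example: yield driven mostly by the first (index) feature
rng = np.random.default_rng(0)
ndvi = rng.uniform(0.3, 0.9, 50)
texture = rng.uniform(0, 1, 50)               # uninformative filler feature
y = 8.0 * ndvi + rng.normal(0, 0.2, 50)       # simulated yield
X = np.column_stack([ndvi, texture])
ranking = rank_features_pearson(X, y, ["NDVI", "texture_contrast"])
print(ranking[0][0])  # NDVI ranks first
```

RFE and Boruta are wrapper methods around a model (e.g., scikit-learn's `RFE` with an RF estimator) and capture nonlinear relevance that this filter method misses, which is one reason the three rankings can disagree.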
Figure 5 shows the feature modalities commonly used in estimating grain crop yields; notably, the numbers of studies based on multimodal features and on unimodal features are almost equal.
Among the unimodal features, IFM is the most commonly used: researchers use spectral index features from a single growth stage (single-temporal) or from several growth stages (multi-temporal) to estimate yield. In reference [57], the researchers collected multispectral and thermal images of wheat throughout its entire growth period, extracted vegetation indices as features for the yield estimation models, and compared results under single-temporal and multi-temporal features. With single-temporal features, the grain filling stage was the optimal yield estimation period for wheat (R2 = 0.667), while with multi-temporal features, fusing index features from the jointing, booting, heading, flowering, grain filling, and maturity stages gave a greater improvement in wheat yield estimation (R2 = 0.725), an increase in estimation accuracy of about 4%. Similar findings have been reported for corn [61] and soybean [89] yield estimation.
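The single-temporal versus multi-temporal distinction amounts to how per-stage index features are assembled into a feature matrix. A minimal sketch, using the standard NDVI formula and simulated reflectance values (the stage names follow the text; the numbers are illustrative):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Standard NDVI = (NIR - Red) / (NIR + Red), computed per plot."""
    return (nir - red) / (nir + red)

# Simulated per-plot mean reflectance at three growth stages
rng = np.random.default_rng(0)
stages = ["jointing", "heading", "grain_filling"]
n_plots = 8
features = {}
for s in stages:
    nir = rng.uniform(0.4, 0.6, n_plots)    # near-infrared reflectance
    red = rng.uniform(0.05, 0.15, n_plots)  # red reflectance
    features[s] = ndvi(nir, red)

# Single-temporal: one stage only; multi-temporal: concatenate all stages
single = features["grain_filling"].reshape(-1, 1)        # shape (8, 1)
multi = np.column_stack([features[s] for s in stages])   # shape (8, 3)
print(single.shape, multi.shape)
```

The multi-temporal matrix simply widens the feature space, which is why it can carry growth-trajectory information that a single stage cannot.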
Although spectral indices can reflect crop growth, they mainly capture the interaction of light with crop absorption and reflection and are therefore limited in the information they can represent [52]. Accordingly, the limited accuracy of yield estimation models based on unimodal spectral indices has been verified [90,91].
Zhou et al. [80] used computer vision techniques to estimate rice yield directly from RGB images (R2 = 0.97, MAPE = 3.98%), a clear improvement over yield estimation using multispectral vegetation indices, texture features, and color indices as features (R2 = 0.84, MAPE = 7.86%). Yang et al. [92] likewise demonstrated that an image-based model was superior to a spectral index-based model. However, owing to the scarcity of publicly available datasets in the yield estimation field, a complete mapping between images and yield cannot be achieved in most cases. By contrast, many well-established public datasets exist in crop disease and pest research and have advanced that field; references [93,94,95,96] all build disease and pest identification models on public datasets and conduct case verification.
Among the multimodal features, the most commonly used combination is IFM with IFR, which is still essentially a combination of indices. In fact, crop yield is influenced by many factors, such as climate (e.g., temperature, rainfall, light), soil properties (e.g., organic matter, moisture), and management practices (e.g., fertilizer, irrigation, pesticides) [97]. Nevavuori et al. [98] proposed training models on combined climatic, soil, and imagery data as well as crop time series to improve accuracy, consistent with other studies advocating multi-source data fusion to improve yield estimation [99,100]. Han et al. [101] used VI, climate, and soil variables to predict winter wheat yields across agricultural regions and counties in China. Fiorentini et al. [68] trained a set of ML algorithms to predict yields in south-central Italy using 16 variables covering fertilizer application, nitrogen management, soils, and remote sensing data. They demonstrated that multimodal feature combinations can improve model performance, and studies [102,103,104] reached the same conclusion. This matches findings in medicine [105,106], mining [107], and sentiment analysis [108], where multimodal features effectively improve results. Using more multimodal features provides a more comprehensive and complete understanding of the target, which is more conducive to establishing accurate models.
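A common and simple way to realize such multi-source fusion is early fusion: normalize each modality's feature block and concatenate them into one matrix before modeling. The sketch below is a generic illustration with simulated blocks, not the pipeline of any cited study:

```python
import numpy as np

# Simulated per-plot feature blocks from three modalities
rng = np.random.default_rng(0)
n = 12
vi_block = rng.uniform(0.2, 0.9, size=(n, 5))   # spectral indices (IFM)
weather_block = rng.normal(size=(n, 3))         # temperature, rainfall, radiation
soil_block = rng.normal(size=(n, 2))            # organic matter, moisture

def zscore(a: np.ndarray) -> np.ndarray:
    """Standardize each column so no modality dominates by scale."""
    return (a - a.mean(axis=0)) / (a.std(axis=0) + 1e-9)

# Early fusion: standardize per block, then concatenate along features
X_fused = np.hstack([zscore(vi_block), zscore(weather_block), zscore(soil_block)])
print(X_fused.shape)  # (12, 10)
```

Per-block standardization matters because raw vegetation indices (roughly 0 to 1) and raw weather variables (e.g., rainfall in mm) differ by orders of magnitude, which would otherwise bias distance- and gradient-based learners.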
5.1.3. Growth Stages for Yield Estimation
Crop growth performance varies across growth stages. During the mature stage of rice, the withering of leaves reduces greenness, which weakens the predictive ability of vegetation indices [92]. This indicates that the features useful for yield estimation may differ between growth stages, consistent with the conclusions of other studies [89,109]. For example, in reference [70], multispectral images were collected at the heading, flowering, early grain filling, and mid-grain filling stages of wheat, and vegetation indices were extracted as modeling features for yield estimation using four different ML algorithms. Regardless of the algorithm chosen, yield estimation performance based on index features differed between growth stages: during flowering, the Gaussian process was the best algorithm; in early grain filling, RF; and in mid-grain filling, ridge regression. This is consistent with the background and problem addressed by Fei et al. [57] in their research on yield estimation.
5.1.4. Selection and Application of Algorithms
In Section 4.5, we explained the basic relationship between the best algorithm and the features. When numerical features such as indices and texture features are extracted from UAV remote sensing images, the best traditional machine learning algorithm is RF, whereas when the images themselves are used as features, the best DL algorithm is CNN. This shows that the choice of algorithm is strongly correlated with the feature type, and it also poses a great challenge for the selection and application of modeling algorithms.
Traditional ML performs better on smaller datasets with low feature dimensionality [25], which has been confirmed in yield estimation research. Among the studies [49,69,109,110] in which RF gave the best yield estimation results, feature dimensionality generally did not exceed 10 and reached at most 26. Studies with high-dimensional features (more than 30) instead used DNNs [54,79,111]. In recently published work, Yuan et al. [112] also confirmed that DNNs outperform traditional ML on high-dimensional numerical features. This explains why feature dimensionality reduction is necessary when estimating yield with traditional ML. Conversely, if DL techniques are applied to low-dimensional data, their accuracy will not match traditional ML, because DL is better suited to large datasets with high-dimensional features, especially image data, which can have thousands or tens of thousands of features [81]. CNNs therefore stand out in yield estimation studies that use image data as features. In Section 5.1.2, we concluded that multimodal features are the mainstream trend for the future, as they can reflect research targets comprehensively. However, current research on multimodal feature fusion still focuses on single-type multimodal features and lacks research on multiple types. Fusing numerical features such as spectral indices and texture features with image features to improve yield estimation accuracy therefore places higher demands on algorithm selection and is an important direction for future research. For example, in disease recognition, Feng et al. [113] fused the features of both image and text modalities and established a vegetable disease recognition model based on YOLOv5s. By exploiting the correlation and complementarity of the two modal features, this model further improved the accuracy of field vegetable disease recognition and achieved good results on small datasets.
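The dimensionality reduction step discussed above can be illustrated with a plain PCA projection implemented via SVD; this is a generic sketch (array sizes are arbitrary), not the procedure of any specific cited study:

```python
import numpy as np

def pca_reduce(X: np.ndarray, k: int) -> np.ndarray:
    """Project an (n, d) feature matrix onto its top-k principal components,
    so a traditional ML model sees k features instead of d."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered matrix; rows of Vt are principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

# Reduce 40 hyperspectral-index features to 8 before fitting, e.g., an RF
X = np.random.default_rng(0).normal(size=(30, 40))
X_low = pca_reduce(X, k=8)
print(X_low.shape)  # (30, 8)
```

Alternatives such as RFE or Boruta keep original, interpretable features, whereas PCA produces linear combinations; which is preferable depends on whether feature interpretability matters for the agronomic question.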
5.1.5. UAVs for Yield Estimation
UAVs have been studied extensively for yield estimation because of their advantages, but they also have limitations, mainly endurance time and flight restrictions. The continuous flight time of UAVs is relatively short, and in long-distance missions batteries must be replaced midway to complete flight operations. As a result, UAVs are mainly used to estimate the yield of field plots, and regional yield estimation is more difficult. Plot yields can be aggregated to estimate regional yield, but such results may be significantly biased because the growing environment can vary greatly from plot to plot. Enhancing UAV-based regional yield estimation requires improving the transferability of the yield estimation model and ensuring the diversity of the training data, which is why experimental designs often include plots with different nitrogen fertilizer levels, irrigation levels, and varieties. Regarding flights, local flight restrictions may prevent normal operations, and weather is a strong constraint: data collection cannot be accomplished on rainy days, and consecutive rainy days can leave gaps in crop growth data during critical periods. Nevertheless, as deep learning is applied more deeply in yield estimation, combining it with real-time techniques such as edge computing offers an opportunity to achieve real-time yield estimation.
5.2. Future Work
The research in the retrieved articles is based on different research ideas, different crop types and planting areas, and many different features. We have also identified existing research problems and factors that constrain the performance of yield estimation models. We believe future research can proceed along four lines: feature selection, data augmentation, algorithm improvement, and real-time yield estimation.
Feature application: The review shows that many factors affect crop yield, including nitrogen fertilizer level, irrigation amount, plant height, SPAD, LAI, soil, and weather. Future research should therefore obtain crop growth information and growth environment data as comprehensively as possible to provide sufficient input features for modeling and improve prediction accuracy, although this is difficult in both experimental design and data collection. Alternatively, UAV remote sensing data alone can be used for yield estimation, with multimodal fusion used to clarify which features give better results when accuracy meets practical needs. In either case, multiple feature engineering methods should be attempted for feature selection or dimensionality reduction, in order to reduce the time complexity of the model.
Data augmentation: In ML applications, the amount of data has a significant impact on modeling quality. In practice, however, grain crops have long growth cycles, most with only one cycle per year, so data are seriously insufficient, and few mature public datasets are available, which limits the application of deep learning in yield estimation research. We therefore need to explore appropriate methods to expand datasets and increase the amount of data, especially for numerical data such as vegetation indices. In future research, resampling methods such as oversampling can serve as data augmentation techniques to achieve a reasonable expansion of the data.
Algorithm improvement: Through the systematic review, we have learned that the features that can reflect crop yield differ across growth stages, yet current research on multi-temporal data lacks work in this area. In the next step, we need to determine the optimal features for each growth stage and, in the architecture design of the whole model, assign weights to the features of different growth stages to further improve the effectiveness of yield estimation. Meanwhile, most algorithms currently applied in this field are mature, with little tuning of algorithm parameters or optimization of network structures. We will introduce swarm intelligence optimization algorithms to improve the model, dynamically adjusting model parameters to the actual dataset and enhancing the accuracy and generalization ability of the yield estimation model.
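As a concrete illustration of the swarm intelligence idea, the sketch below implements a minimal particle swarm optimizer over a toy objective standing in for validation error as a function of two hyperparameters; the inertia and attraction coefficients (0.7, 1.5, 1.5) are common textbook defaults, not values from the reviewed studies:

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimizer: each particle tracks its personal
    best, the swarm shares a global best, and velocities blend both
    attractions. bounds: (low, high) arrays defining the search box,
    e.g., hyperparameter ranges."""
    rng = np.random.default_rng(seed)
    low, high = (np.asarray(b, float) for b in bounds)
    x = rng.uniform(low, high, (n_particles, low.size))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, low.size))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, low, high)          # keep particles in bounds
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy objective with optimum at (3, 3), standing in for validation error
best, best_val = pso_minimize(lambda p: ((p - 3.0) ** 2).sum(),
                              bounds=([0, 0], [10, 10]))
print(np.round(best, 2))  # near [3. 3.]
```

In a real tuning loop, `f` would train the yield model with the candidate hyperparameters and return a cross-validation error, which makes each evaluation expensive; PSO's appeal is that it needs no gradients of `f`.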
Real-time yield estimation: Current research focuses mainly on obtaining more accurate yield estimation models, with little work on applying them. In the next step, we can make full use of the advantages of UAVs and combine them with edge computing technology to achieve real-time yield estimation.
6. Conclusions
This article provides a comprehensive review of recent research on crop yield estimation based on UAVs and ML. We have examined the progress of grain crop yield estimation, including feature applications, research concerns, optimal estimation models, and optimal yield estimation periods. On this basis, we identified the challenges facing this research, providing a basis and suggestions for future work.
We used a systematic literature review method to analyze, and answer, seven research questions. In this field, wheat, corn, rice, and soybean are the main research targets (RQ1). In the modeling process, feature selection is a key step for improving model robustness and accuracy and has received widespread attention from researchers. Nitrogen application rate, irrigation rate, variety diversity, and genetic diversity are the main concerns at the experimental design stage (RQ2). Most studies collected data under such designs using UAVs equipped with different sensors (RGB, multispectral, hyperspectral, LiDAR) across crop growth stages, from which crop phenotypic features were extracted as model inputs (RQ3). Whether yield estimation is based on unimodal or multimodal features, multispectral images are the main source of feature information (RQ4). The choice of ML algorithm varies with feature type: for image data, CNN-based models and their variants are the most commonly used and perform best, while for numerical data such as spectral indices, RF performs best (RQ5). In evaluating yield estimation models, we also found that the crop growth period is a key factor, and there is no authoritative interpretation for determining the optimal estimation period; we have only provided reference conclusions based on the retrieved articles and hope for further research (RQ6). Finally, insufficient data volume and inadequate research on influencing factors, feature selection, and algorithm optimization all constrain the development of current yield estimation models (RQ7).
In summary, this article highlights the challenges in crop yield estimation research based on UAV remote sensing data and ML. Through feature selection and fusion, dataset expansion, algorithm optimization, and model transfer, the effectiveness of yield estimation is expected to improve further. In particular, within deep learning frameworks, multi-dimensional and multimodal features can be extracted and learned more accurately, and transfer learning offers a way to enhance the generalization performance of yield models.
Author Contributions: Conceptualization, J.Y. and L.G.; methodology, J.Y. and L.G.; formal analysis, J.Y. and Y.Z.; investigation, J.Y., Y.Z. and Z.Z.; data curation, J.Y., Y.Z. and Z.Z.; writing—original draft preparation, J.Y.; writing—review and editing, J.Y., W.Y., W.W. and L.G.; visualization, J.Y.; supervision, W.W., W.Y. and L.G.; project administration, W.W. and L.G.; funding acquisition, L.G. All authors have read and agreed to the published version of the manuscript.
Conflicts of Interest: The authors declare no conflicts of interest.
Footnotes
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Figure 4. Distribution of the number of research articles published by various countries.
Figure 6. Distribution of studies in which ML was used for estimating crop yields.
Figure 9. Ground data acquisition process [74]. (a) Determination of 1 m × 1 m sampling squares. (b) Harvest of the wheat in sampling squares. (c) Drying of the sampled wheat. (d) Threshing of the sampled wheat. (e) Removal of impurities from the threshed wheat. (f) Determination of dry weight of the sampled wheat.
The search strings for two databases.
Database | Search String |
---|---|
Scopus | Title, abstract, and keywords = (“crop yield prediction” OR “crop yield estimation” OR “crop yield forecasting”) AND Title, abstract, and keywords = (“unmanned aerial vehicle” OR “UAV” OR “drone”) AND Title, abstract, and keywords = (“machine learning” OR “artificial intelligence”) |
Web of Science | Topic = (“crop yield prediction” OR “crop yield estimation” OR “crop yield forecasting”) AND Topic = (“unmanned aerial vehicle” OR “UAV” OR “drone”) AND Topic = (“machine learning” OR “artificial intelligence”) |
Topic including title, abstract, author keywords, and keywords plus.
The number of related research articles published in various journals.
Journal | Number of Published Articles |
---|---|
Remote Sensing | 17 |
Computers and Electronics in Agriculture | 9 |
Frontiers in Plant Science | 5 |
Precision Agriculture | 5 |
Agronomy | 4 |
Drones | 3 |
International Journal of Applied Earth Observation and Geoinformation | 3 |
Plant Methods | 2 |
Remote Sensing Applications: Society and Environment | 2 |
Remote Sensing of Environment | 2 |
Sensors | 2 |
Agricultural and Forest Meteorology | 1 |
Agriculture | 1 |
Agriengineering | 1 |
Agronomy Journal | 1 |
Applied Sciences | 1 |
Biological Agriculture and Horticulture | 1 |
Bioinformatics | 1 |
Biosystems Engineering | 1 |
European Journal of Agronomy | 1 |
Field Crops Research | 1 |
IEEE Access | 1 |
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing | 1 |
IEEE Transactions on Geoscience and Remote Sensing | 1 |
ISPRS Journal of Photogrammetry and Remote Sensing | 1 |
Journal of Agriculture and Food Research | 1 |
Journal of Biosystems Engineering | 1 |
Plant Journal | 1 |
Plant Phenomics | 1 |
PLOS One | 1 |
Sustainability | 1 |
Detailed information on frequently used IFMs.
IFM | # of Times Used | Formula |
---|---|---|
Normalized difference vegetation index (NDVI) | 35 | (NIR - R)/(NIR + R) |
Green normalized difference vegetation index (GNDVI) | 26 | (NIR - G)/(NIR + G) |
Normalized difference red edge (NDRE) | 23 | (NIR - RE)/(NIR + RE) |
Optimized soil-adjusted vegetation index (OSAVI) | 18 | (NIR - R)/(NIR + R + 0.16) |
Soil-adjusted vegetation index (SAVI) | 16 | 1.5 × (NIR - R)/(NIR + R + 0.5) |
Chlorophyll index red edge (CIrededge) | 12 | NIR/RE - 1 |
Ratio vegetation index (RVI) | 12 | NIR/R |
Triangular vegetation index (TVI) | 12 | 0.5 × [120 × (NIR - G) - 200 × (R - G)] |
Enhanced vegetation index (EVI) | 11 | 2.5 × (NIR - R)/(NIR + 6R - 7.5B + 1) |
Two-band enhanced vegetation index (EVI2) | 11 | 2.5 × (NIR - R)/(NIR + 2.4R + 1) |
NIR, RE, R, G, and B denote reflectance in the near-infrared, red-edge, red, green, and blue bands, respectively.
Top five algorithms.
Top Five ML Algorithms | # of Times Used |
---|---|
Random forest (RF) | 18 |
Convolutional neural network (CNN) | 11 |
Support vector machine (SVM) | 8 |
Ensemble learning (EL) | 8 |
Deep neural network (DNN) | 6 |
Appendix A
Selected publications.
Reference | Retrieved from | Target Grain Crops | Feature | Optimal Modelling | Year |
---|---|---|---|---|---|
[ | Web of Science | Soybean | Multispectral DN value | RF | 2016 |
[ | Scopus | Wheat | RGB images, NDVI raster | CNN | 2019 |
[ | Scopus | Rice | RGB and multispectral image | CNN | 2019 |
[ | Web of Science | Maize | Multispectral VIs | RF | 2020 |
[ | Web of Science | Wheat | Mean and standard deviation of hyperspectral bands | DNN | 2020 |
[ | Web of Science | Wheat, Barley, Oats | RGB image, weather data (cumulative temperature) | 3D-CNN | 2020 |
[ | Web of Science | Soybean | Multispectral VIs | MLP | 2020 |
[ | Web of Science | Wheat | Hyperspectral VIs | PLSR | 2020 |
[ | Web of Science | Maize | RGB VIs | SVM | 2020 |
[ | Scopus | Soybean | VIs from RGB, multispectral and thermal | DNN | 2020 |
[ | Web of Science | Wheat | Multispectral VIs | RF | 2020 |
[ | Web of Science | Soybean | VIs from RGB and multispectral | XGBoost | 2020 |
[ | Web of Science | Wheat | Multispectral VIs | EL | 2021 |
[ | Web of Science | Rice | Hyperspectral VIs | XGBoost | 2021 |
[ | Web of Science | Wheat | VIs from multispectral and thermal | 2021 | |
[ | Web of Science | Wheat | Thermal VIs, weather data (rainfall, air temperature, dew point, relative humidity, wind speed) | CRT | 2021 |
[ | Web of Science | Wheat | Multispectral VIs, PH | ANN | 2021 |
[ | Web of Science | Potato | Multispectral VIs | RF | 2021 |
[ | Web of Science | Wheat | Multispectral VIs | RF | 2021 |
[ | Web of Science | Soybean | Multispectral VIs | DNN | 2021 |
[ | Web of Science | Wheat | Multispectral VIs | SVM | 2021 |
[ | Web of Science | Maize | RGB VIs, PH | RR | 2021 |
[ | Web of Science | Rice | Multispectral VIs, thermal raster | 2D-CNN | 2022 |
[ | Web of Science | Maize | Hyperspectral VIs | RR | 2022 |
[ | Web of Science | Faba bean | PH | SVM | 2022 |
[ | Scopus | Soybean | Multispectral VIs, texture, PH | Cubist | 2022 |
[ | Web of Science | Soybean | RGB VIs, texture, PH, CC, lodging data | DNN | 2022 |
[ | Web of Science | Wheat | Multispectral VIs | GPR | 2022 |
[ | Web of Science | Wheat | Multispectral VIs | GPR | 2022 |
[ | Web of Science | Wheat | Multispectral bands | avNNet | 2022 |
[ | Web of Science | Wheat | VIs from RGB and multispectral | RF, SVM, GB | 2022 |
[ | Web of Science | Wheat | Hyperspectral VIs | EL | 2022 |
[ | Web of Science | Maize | Multispectral VIs, meteorological data (daily total precipitation, daily average temperature, daily maximum temperature, daily minimum temperature, vapor pressure, and daily total solar radiation) | CNNattention–LSTM | 2023 |
[ | Web of Science | Rice | RGB image | ConvNext | 2023 |
[ | Web of Science | Wheat | RGB information | RF | 2023 |
[ | Web of Science | Maize | Hyperspectral VIs | RF | 2023 |
[ | Web of Science | Maize | RGB VIs, PH | RF | 2023 |
[ | Scopus | Wheat | Multispectral VIs | CNN | 2023 |
[ | Web of Science | Soybean | Hyperspectral VIs, texture, maturity information | GPR | 2023 |
[ | Web of Science | Faba bean | VIs from RGB and multispectral | RR | 2023 |
[ | Web of Science | Faba bean | RGB VIs, PH, CC | EL | 2023 |
[ | Scopus | Wheat | RGB, thermal and hyperspectral image | MultimodalNet | 2023 |
[ | Scopus | Wheat | VIs from RGB and multispectral, PH, CC, CV | RF | 2023 |
[ | Web of Science | Maize | Multispectral VIs, texture, CC | RF | 2023 |
[ | Web of Science | Maize | Multispectral band, leaf temperature | RF | 2023 |
[ | Web of Science | Rice | Multispectral image, weather data (precipitation, global solar radiation, average temperature, minimum temperature, maximum temperature, average relative humidity, average wind speed, vapor pressure data) | CNN | 2023 |
[ | Web of Science | Wheat | Multispectral image, genetic data | PheGeML | 2023 |
[ | Scopus | Maize | Multispectral VIs | DNN | 2023 |
[ | Scopus | Wheat | Multispectral VIs | LASSO | 2023 |
[ | Web of Science | Chickpea | RGB VIs, CC, CV | SVM | 2023 |
[ | Web of Science | Wheat | VIs from multispectral and thermal, texture, PH | DNN | 2023 |
[ | Web of Science | Maize | Multispectral VIs, PH | KNN, SVM | 2023 |
[ | Web of Science | Wheat | RGB and multispectral image | CNN–LSTM | 2023 |
[ | Web of Science | Wheat | RGB image | CNN | 2023 |
[ | Scopus | Wheat | Multispectral VIs | GPR | 2023 |
[ | Web of Science | Maize | Multispectral VIs, PH | SVM, RF | 2023 |
[ | Web of Science | Wheat | Multispectral VIs, weather data (cumulative rainfall, mean temperature), soil data (organic carbon, N, C/N ratio) | GBM | 2024 |
[ | Scopus | Rice | RGB image | YOLOv5 | 2024 |
[ | Scopus | Soybean | Multispectral VIs, CC | RF | 2024 |
[ | Scopus | Rice | Hyperspectral VIs | XGBoost | 2024 |
[ | Scopus | Soybean | Multispectral VIs | GBR | 2024 |
[ | Web of Science | Wheat | VIs from multispectral and thermal, texture, meteorological environment data (precipitation, minimum temperature, maximum temperature) | LSTM | 2024 |
[ | Web of Science | Maize | VIs from RGB and multispectral, weather data (daily average air temperature, daily total precipitation) | LR | 2024 |
[ | Scopus | Soybean | Multispectral VIs, weather data (daily average temperature, daily maximum temperature, daily minimum temperature, daily accumulated precipitation, global solar radiation, daily average relative humidity, daily average wind speed, actual vapor pressure) | LASSO | 2024 |
[ | Scopus | Soybean | RGB images | 3D-CNN | 2024 |
[ | Scopus | Maize | Multispectral VIs, CV, soil properties (NPK, pH, soil moisture, soil temperature, EC), weather data (solar radiation, evapotranspiration, daily rain, rain rate, humidity, temperature, wind speed) | EL | 2024 |
[ | Web of Science | Rice | RGB VIs | SVM | 2024 |
[ | Scopus | Pea | Multispectral VIs, texture, PH, CC | EL | 2024 |
[ | Web of Science | Rice | Multispectral VIs | EL | 2024 |
[ | Web of Science | Wheat | Multispectral VIs | RF | 2024 |
[ | Web of Science | Soybean | VIs from RGB and multispectral, texture, structural features (plant height, canopy convex hull volume, roughness, canopy cover, canopy width, reconstruction points of canopy point cloud, and vegetation index of point cloud) | EL | 2024 |
[ | Web of Science | Wheat | Multispectral VIs | RF | 2024 |
[ | Web of Science | Wheat | Multispectral VIs, texture | RF | 2024 |
[ | Web of Science | Rice | Multispectral VIs | RF, PLSR | 2024 |
VIs: vegetation indices, PH: plant height, CC: canopy cover, CV: canopy volume.
References
1. Food and Agriculture Organization, International Fund for Agricultural Development, United Nations Children’s Fund, World Food Programme, World Health Organization. The State of Food Security and Nutrition in the World 2024: Financing for the Elimination of Hunger, Food Insecurity, and All Forms of Malnutrition—Overview; Food and Agriculture Organization: Rome, Italy, 2024.
2. de Wit, A.; Boogaard, H.; Fumagalli, D.; Janssen, S.; Knapen, R.; van Kraalingen, D.; Supit, I.; van der Wijngaart, R.; van Diepen, K. 25 years of the WOFOST cropping systems model. Agric. Syst.; 2019; 168, pp. 154-167. [DOI: https://dx.doi.org/10.1016/j.agsy.2018.06.018]
3. Jones, J.W.; Hoogenboom, G.; Porter, C.H.; Boote, K.J.; Batchelor, W.D.; Hunt, L.A.; Wilkens, P.W.; Singh, U.; Gijsman, A.J.; Ritchie, J.T. The DSSAT cropping system model. Eur. J. Agron.; 2003; 18, pp. 235-265. [DOI: https://dx.doi.org/10.1016/S1161-0301(02)00107-7]
4. McCown, R.; Hammer, G.; Hargreaves, J.; Holzworth, D.; Freebairn, D. APSIM: A novel software system for model development, model testing and simulation in agricultural systems research. Agric. Syst.; 1996; 50, pp. 255-271. [DOI: https://dx.doi.org/10.1016/0308-521X(94)00055-V]
5. Xu, X.; Gao, P.; Zhu, X.; Guo, W.; Ding, J.; Li, C.; Zhu, M.; Wu, X. Design of an integrated climatic assessment indicator (ICAI) for wheat production: A case study in Jiangsu Province, China. Ecol. Indic.; 2019; 101, pp. 943-953. [DOI: https://dx.doi.org/10.1016/j.ecolind.2019.01.059]
6. Xu, W.; Lan, Y.; Li, Y.; Luo, Y.; He, Z. Classification method of cultivated land based on UAV visible light remote sensing. Int. J. Agric. Biol. Eng.; 2019; 12, pp. 103-109. [DOI: https://dx.doi.org/10.25165/j.ijabe.20191203.4754]
7. Zhai, W.; Li, C.; Cheng, Q.; Ding, F.; Chen, Z. Exploring Multisource Feature Fusion and Stacking Ensemble Learning for Accurate Estimation of Maize Chlorophyll Content Using Unmanned Aerial Vehicle Remote Sensing. Remote Sens.; 2023; 15, 3454. [DOI: https://dx.doi.org/10.3390/rs15133454]
8. Sadeh, R.; Avneri, A.; Tubul, Y.; Lati, R.N.; Bonfil, D.J.; Peleg, Z.; Herrmann, I. Chickpea leaf water potential estimation from ground and VENµS satellite. Precis. Agric.; 2024; 25, pp. 1658-1683. [DOI: https://dx.doi.org/10.1007/s11119-024-10129-w]
9. Anitha, J.; Saranya, N. Cassava Leaf Disease Identification and Detection Using Deep Learning Approach. Int. J. Comput. Commun. Control; 2022; 17, 4356. [DOI: https://dx.doi.org/10.15837/ijccc.2022.2.4356]
10. Park, Y.-H.; Choi, S.H.; Kwon, Y.-J.; Kwon, S.-W.; Kang, Y.J.; Jun, T.-H. Detection of Soybean Insect Pest and a Forecasting Platform Using Deep Learning with Unmanned Ground Vehicles. Agronomy; 2023; 13, 477. [DOI: https://dx.doi.org/10.3390/agronomy13020477]
11. Gómez, D.; Salvador, P.; Sanz, J.; Casanova, J.L. Modelling wheat yield with antecedent information, satellite and climate data using machine learning methods in Mexico. Agric. For. Meteorol.; 2021; 300, 108317. [DOI: https://dx.doi.org/10.1016/j.agrformet.2020.108317]
12. Zhuo, W.; Huang, J.; Li, L.; Zhang, X.; Ma, H.; Gao, X.; Huang, H.; Xu, B.; Xiao, X. Assimilating Soil Moisture Retrieved from Sentinel-1 and Sentinel-2 Data into WOFOST Model to Improve Winter Wheat Yield Estimation. Remote Sens.; 2019; 11, 1618. [DOI: https://dx.doi.org/10.3390/rs11131618]
13. Xie, Y.; Wang, P.; Bai, X.; Khan, J.; Zhang, S.; Li, L.; Wang, L. Assimilation of the leaf area index and vegetation temperature condition index for winter wheat yield estimation using Landsat imagery and the CERES-Wheat model. Agric. For. Meteorol.; 2017; 246, pp. 194-206. [DOI: https://dx.doi.org/10.1016/j.agrformet.2017.06.015]
14. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens.; 2014; 92, pp. 79-97. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2014.02.013]
15. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric.; 2013; 91, pp. 106-115. [DOI: https://dx.doi.org/10.1016/j.compag.2012.12.002]
16. Chen, C.; Yang, B.; Song, S.; Peng, X.; Huang, R. Automatic clearance anomaly detection for transmission line corridors utilizing UAV-borne LIDAR data. Remote Sens.; 2018; 10, 613. [DOI: https://dx.doi.org/10.3390/rs10040613]
17. Jiang, S.; Jiang, W.; Huang, W.; Yang, L. UAV-based oblique photogrammetry for outdoor data acquisition and offsite visual inspection of transmission line. Remote Sens.; 2017; 9, 278. [DOI: https://dx.doi.org/10.3390/rs9030278]
18. Zhang, Y.; Yue, P.; Zhang, G.; Guan, T.; Lv, M.; Zhong, D. Augmented reality mapping of rock mass discontinuities and rockfall susceptibility based on unmanned aerial vehicle photogrammetry. Remote Sens.; 2019; 11, 1311. [DOI: https://dx.doi.org/10.3390/rs11111311]
19. Fernández, T.; Pérez, J.L.; Cardenal, J.; Gómez, J.M.; Colomo, C.; Delgado, J. Analysis of landslide evolution affecting olive groves using UAV and photogrammetric techniques. Remote Sens.; 2016; 8, 837. [DOI: https://dx.doi.org/10.3390/rs8100837]
20. Villa, T.F.; Salimi, F.; Morton, K.; Morawska, L.; Gonzalez, F. Development and validation of a UAV based system for air pollution measurements. Sensors; 2016; 16, 2202. [DOI: https://dx.doi.org/10.3390/s16122202]
21. Shao, G.; Han, W.; Zhang, H.; Wang, Y.; Zhang, L.; Niu, Y.; Zhang, Y.; Cao, P. Estimation of transpiration coefficient and aboveground biomass in maize using time-series UAV multispectral imagery. Crop. J.; 2022; 10, pp. 1376-1385. [DOI: https://dx.doi.org/10.1016/j.cj.2022.08.001]
22. Amarasingam, N.; Gonzalez, F.; Salgadoe, A.S.A.; Sandino, J.; Powell, K. Detection of white leaf disease in sugarcane crops using UAV-Derived RGB imagery with existing deep learning models. Remote Sens.; 2022; 14, 6137. [DOI: https://dx.doi.org/10.3390/rs14236137]
23. Chivasa, W.; Mutanga, O.; Burgueño, J. UAV-based high-throughput phenotyping to increase prediction and selection accuracy in maize varieties under artificial MSV inoculation. Comput. Electron. Agric.; 2021; 184, 106128. [DOI: https://dx.doi.org/10.1016/j.compag.2021.106128]
24. Samuel, A.L. Programming Computers to Play Games; Elsevier: Amsterdam, The Netherlands, 1960.
25. Zhou, Z. Machine Learning; Tsinghua University Press: Beijing, China, 2016.
26. LeCun, Y.; Bengio, Y. Convolutional Networks for Images, Speech, and Time Series; MIT Press: Cambridge, MA, USA, 1998.
27. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature; 2015; 521, pp. 436-444. [DOI: https://dx.doi.org/10.1038/nature14539] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/26017442]
28. Zhang, L.; Zhang, L.; Du, B. Deep Learning for Remote Sensing Data: A Technical Tutorial on the State of the Art. IEEE Geosci. Remote Sens. Mag.; 2016; 4, pp. 22-40. [DOI: https://dx.doi.org/10.1109/MGRS.2016.2540798]
29. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric.; 2018; 147, pp. 70-90. [DOI: https://dx.doi.org/10.1016/j.compag.2018.02.016]
30. Kaur, P.; Harnal, S.; Gautam, V.; Singh, M.P.; Singh, S.P. An approach for characterization of infected area in tomato leaf disease based on deep learning and object detection technique. Eng. Appl. Artif. Intell.; 2022; 115, 105210. [DOI: https://dx.doi.org/10.1016/j.engappai.2022.105210]
31. Mendes, P.A.S.; Coimbra, A.P.; de Almeida, A.T. Forest Vegetation Detection Using Deep Learning Object Detection Models. Forests; 2023; 14, 1787. [DOI: https://dx.doi.org/10.3390/f14091787]
32. Scher, S.; Messori, G. Predicting weather forecast uncertainty with machine learning. Q. J. R. Meteorol. Soc.; 2018; 144, pp. 2830-2841. [DOI: https://dx.doi.org/10.1002/qj.3410]
33. Weyn, J.A.; Durran, D.R.; Caruana, R.; Cresswell-Clay, N. Sub-Seasonal Forecasting with a Large Ensemble of Deep-Learning Weather Prediction Models. J. Adv. Model. Earth Syst.; 2021; 13, e2021MS002502. [DOI: https://dx.doi.org/10.1029/2021MS002502]
34. Yao, L.; Mao, C.; Luo, Y. Clinical text classification with rule-based features and knowledge-guided convolutional neural networks. BMC Med. Inform. Decis. Mak.; 2019; 19, 71. [DOI: https://dx.doi.org/10.1186/s12911-019-0781-4]
35. Kitchenham, B.; Brereton, O.P.; Budgen, D.; Turner, M.; Bailey, J.; Linkman, S. Systematic literature reviews in software engineering–a systematic literature review. Inf. Softw. Technol.; 2009; 51, pp. 7-15. [DOI: https://dx.doi.org/10.1016/j.infsof.2008.09.009]
36. Zhang, Y.; Yang, Y.; Zhang, Q.; Duan, R.; Liu, J.; Qin, Y.; Wang, X. Toward Multi-Stage Phenotyping of Soybean with Multimodal UAV Sensor Data: A Comparison of Machine Learning Approaches for Leaf Area Index Estimation. Remote Sens.; 2023; 15, 7. [DOI: https://dx.doi.org/10.3390/rs15010007]
37. Juan, Y.; Ke, Z.; Chen, Z.; Zhong, D.; Chen, W.; Yin, L. Rapid density estimation of tiny pests from sticky traps using Qpest RCNN in conjunction with UWB-UAV-based IoT framework. Neural Comput. Appl.; 2023; 36, pp. 9779-9803. [DOI: https://dx.doi.org/10.1007/s00521-023-09230-4]
38. Cong, C.; Guangqiao, C.; Yibai, L.; Dong, L.; Bin, M.; Jinlong, Z.; Liang, L.; Jianping, H. Research on Monitoring Methods for the Appropriate Rice Harvest Period Based on Multispectral Remote Sensing. Discret. Dyn. Nat. Soc.; 2022; 2022, 1519667. [DOI: https://dx.doi.org/10.1155/2022/1519667]
39. Tatsumi, K.; Igarashi, N.; Mengxue, X. Prediction of plant-level tomato biomass and yield using machine learning with unmanned aerial vehicle imagery. Plant Methods; 2021; 17, 77. [DOI: https://dx.doi.org/10.1186/s13007-021-00761-2]
40. Chen, Y.; Lee, W.S.; Gan, H.; Peres, N.; Fraisse, C.; Zhang, Y.; He, Y. Strawberry Yield Prediction Based on a Deep Neural Network Using High-Resolution Aerial Orthoimages. Remote Sens.; 2019; 11, 1584. [DOI: https://dx.doi.org/10.3390/rs11131584]
41. Yu, N.; Li, L.; Schmitz, N.; Tian, L.F.; Greenberg, J.A.; Diers, B.W. Development of methods to improve soybean yield estimation and predict plant maturity with an unmanned aerial vehicle based platform. Remote Sens. Environ.; 2016; 187, pp. 91-101. [DOI: https://dx.doi.org/10.1016/j.rse.2016.10.005]
42. Nevavuori, P.; Narra, N.; Linna, P.; Lipping, T. Crop Yield Prediction Using Multitemporal UAV Data and Spatio-Temporal Deep Learning Models. Remote Sens.; 2020; 12, 4000. [DOI: https://dx.doi.org/10.3390/rs12234000]
43. Peng, J.; Wang, D.; Zhu, W.; Yang, T.; Liu, Z.; Rezaei, E.E.; Li, J.; Sun, Z.; Xin, X. Combination of UAV and deep learning to estimate wheat yield at ripening stage: The potential of phenotypic features. Int. J. Appl. Earth Obs. Geoinf.; 2023; 124, 103494. [DOI: https://dx.doi.org/10.1016/j.jag.2023.103494]
44. Choudhury, M.R.; Das, S.; Christopher, J.; Apan, A.; Chapman, S.; Menzies, N.W.; Dang, Y.P. Improving Biomass and Grain Yield Prediction of Wheat Genotypes on Sodic Soil Using Integrated High-Resolution Multispectral, Hyperspectral, 3D Point Cloud, and Machine Learning Techniques. Remote Sens.; 2021; 13, 3482. [DOI: https://dx.doi.org/10.3390/rs13173482]
45. Ji, Y.; Chen, Z.; Cheng, Q.; Liu, R.; Li, M.; Yan, X.; Li, G.; Wang, D.; Fu, L.; Ma, Y. et al. Estimation of plant height and yield based on UAV imagery in faba bean (Vicia faba L.). Plant Methods; 2022; 18, 26. [DOI: https://dx.doi.org/10.1186/s13007-022-00861-7] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/35246179]
46. Astaoui, G.; Dadaiss, J.E.; Sebari, I.; Benmansour, S.; Mohamed, E. Mapping Wheat Dry Matter and Nitrogen Content Dynamics and Estimation of Wheat Yield Using UAV Multispectral Imagery Machine Learning and a Variety-Based Approach: Case Study of Morocco. Agriengineering; 2021; 3, pp. 29-49. [DOI: https://dx.doi.org/10.3390/agriengineering3010003]
47. Ganeva, D.; Roumenina, E.; Dimitrov, P.; Gikov, A.; Jelev, G.; Dragov, R.; Bozhanova, V.; Taneva, K. Phenotypic Traits Estimation and Preliminary Yield Assessment in Different Phenophases of Wheat Breeding Experiment Based on UAV Multispectral Images. Remote Sens.; 2022; 14, 1019. [DOI: https://dx.doi.org/10.3390/rs14041019]
48. Vatter, T.; Gracia-Romero, A.; Kefauver, S.C.; Nieto-Taladriz, M.T.; Aparicio, N.; Araus, J.L. Preharvest phenotypic prediction of grain quality and yield of durum wheat using multispectral imaging. Plant J.; 2022; 109, pp. 1507-1518. [DOI: https://dx.doi.org/10.1111/tpj.15648]
49. Liang, J.; Ren, W.; Liu, X.; Zha, H.; Wu, X.; He, C.; Sun, J.; Zhu, M.; Mi, G.; Chen, F. et al. Improving Nitrogen Status Diagnosis and Recommendation of Maize Using UAV Remote Sensing Data. Agronomy; 2023; 13, 1994. [DOI: https://dx.doi.org/10.3390/agronomy13081994]
50. Das, S.; Christopher, J.; Apan, A.; Choudhury, M.R.; Chapman, S.; Menzies, N.W.; Dang, Y.P. Evaluation of water status of wheat genotypes to aid prediction of yield on sodic soils using UAV-thermal imaging and machine learning. Agric. For. Meteorol.; 2021; 307, 108477. [DOI: https://dx.doi.org/10.1016/j.agrformet.2021.108477]
51. Wang, J.; Wu, B.; Kohnen, M.V.; Lin, D.; Yang, C.; Wang, X.; Qiang, A.; Liu, W.; Kang, J.; Li, H. et al. Classification of Rice Yield Using UAV-Based Hyperspectral Imagery and Lodging Feature. Plant Phenomics; 2021; 2021, 9765952. [DOI: https://dx.doi.org/10.34133/2021/9765952]
52. Zhang, S.; Qi, X.; Duan, J.; Yuan, X.; Zhang, H.; Feng, W.; Guo, T.; He, L. Comparison of Attention Mechanism-Based Deep Learning and Transfer Strategies for Wheat Yield Estimation Using Multisource Temporal Drone Imagery. IEEE Trans. Geosci. Remote Sens.; 2024; 62, 4407723. [DOI: https://dx.doi.org/10.1109/TGRS.2024.3401474]
53. Li, Z.; Chen, Z.; Cheng, Q.; Duan, F.; Sui, R.; Huang, X.; Xu, H. UAV-Based Hyperspectral and Ensemble Machine Learning for Predicting Yield in Winter Wheat. Agronomy; 2022; 12, 202. [DOI: https://dx.doi.org/10.3390/agronomy12010202]
54. Bai, D.; Li, D.; Zhao, C.; Wang, Z.; Shao, M.; Guo, B.; Liu, Y.; Wang, Q.; Li, J.; Guo, S. et al. Estimation of soybean yield parameters under lodging conditions using RGB information from unmanned aerial vehicles. Front. Plant Sci.; 2022; 13, 1012293. [DOI: https://dx.doi.org/10.3389/fpls.2022.1012293]
55. Sun, G.; Zhang, Y.; Chen, H.; Wang, L.; Li, M.; Sun, X.; Fei, S.; Xiao, S.; Yan, L.; Li, Y. et al. Improving soybean yield prediction by integrating UAV nadir and cross-circling oblique imaging. Eur. J. Agron.; 2024; 155, 127134. [DOI: https://dx.doi.org/10.1016/j.eja.2024.127134]
56. Sarkar, T.K.; Roy, D.K.; Kang, Y.S.; Jun, S.R.; Park, J.W.; Ryu, C.S. Ensemble of Machine Learning Algorithms for Rice Grain Yield Prediction Using UAV-Based Remote Sensing. J. Biosyst. Eng.; 2024; 49, pp. 1-19. [DOI: https://dx.doi.org/10.1007/s42853-023-00209-6]
57. Fei, S.; Hassan, M.A.; Ma, Y.; Shu, M.; Cheng, Q.; Li, Z.; Chen, Z.; Xiao, Y. Entropy Weight Ensemble Framework for Yield Prediction of Winter Wheat Under Different Water Stress Treatments Using Unmanned Aerial Vehicle-Based Multispectral and Thermal Data. Front. Plant Sci.; 2021; 12, 730181. [DOI: https://dx.doi.org/10.3389/fpls.2021.730181] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/34987529]
58. Herrero-Huerta, M.; Rodriguez-Gonzalvez, P.; Rainey, K.M. Yield prediction by machine learning from UAS-based multi-sensor data fusion in soybean. Plant Methods; 2020; 16, 78. [DOI: https://dx.doi.org/10.1186/s13007-020-00620-6] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/32514286]
59. Bhadra, S.; Sagan, V.; Skobalski, J.; Grignola, F.; Sarkar, S.; Vilbig, J. End-to-end 3D CNN for plot-scale soybean yield prediction using multitemporal UAV-based RGB images. Precis. Agric.; 2024; 25, pp. 1014-1037. [DOI: https://dx.doi.org/10.1007/s11119-023-10096-8]
60. Ma, J.; Liu, B.; Ji, L.; Zhu, Z.; Wu, Y.; Jiao, W. Field-scale yield prediction of winter wheat under different irrigation regimes based on dynamic fusion of multimodal UAV imagery. Int. J. Appl. Earth Obs. Geoinf.; 2023; 118, 103292. [DOI: https://dx.doi.org/10.1016/j.jag.2023.103292]
61. Zhou, W.; Song, C.; Liu, C.; Fu, Q.; An, T.; Wang, Y.; Sun, X.; Wen, N.; Tang, H.; Wang, Q. A Prediction Model of Maize Field Yield Based on the Fusion of Multitemporal and Multimodal UAV Data: A Case Study in Northeast China. Remote Sens.; 2023; 15, 3483. [DOI: https://dx.doi.org/10.3390/rs15143483]
62. Moghimi, A.; Yang, C.; Anderson, J.A. Aerial hyperspectral imagery and deep neural networks for high-throughput yield phenotyping in wheat. Comput. Electron. Agric.; 2020; 172, 105299. [DOI: https://dx.doi.org/10.1016/j.compag.2020.105299]
63. Skobalski, J.; Sagan, V.; Alifu, H.; Al Akkad, O.; Lopes, F.A.; Grignola, F. Bridging the gap between crop breeding and GeoAI: Soybean yield prediction from multispectral UAV images with transfer learning. ISPRS J. Photogramm. Remote Sens.; 2024; 210, pp. 260-281. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2024.03.015]
64. Camenzind, M.P.; Yu, K. Multi temporal multispectral UAV remote sensing allows for yield assessment across European wheat varieties already before flowering. Front. Plant Sci.; 2024; 14, 1214931. [DOI: https://dx.doi.org/10.3389/fpls.2023.1214931]
65. Eugenio, F.C.; Grohs, M.; Venancio, L.P.; Schuh, M.; Bottega, E.L.; Ruoso, R.; Schons, C.; Mallmann, C.L.; Badin, T.L.; Fernandes, P. Estimation of soybean yield from machine learning techniques and multispectral RPAS imagery. Remote Sens. Appl. Soc. Environ.; 2020; 20, 100397. [DOI: https://dx.doi.org/10.1016/j.rsase.2020.100397]
66. Teshome, F.T.; Bayabil, H.K.; Hoogenboom, G.; Schaffer, B.; Singh, A.; Ampatzidis, Y. Unmanned aerial vehicle (UAV) imaging and machine learning applications for plant phenotyping. Comput. Electron. Agric.; 2023; 212, 108064. [DOI: https://dx.doi.org/10.1016/j.compag.2023.108064]
67. Tao, H.; Feng, H.; Xu, L.; Miao, M.; Yang, G.; Yang, X.; Fan, L. Estimation of the Yield and Plant Height of Winter Wheat Using UAV-Based Hyperspectral Images. Sensors; 2020; 20, 1231. [DOI: https://dx.doi.org/10.3390/s20041231] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/32102358]
68. Fiorentini, M.; Schillaci, C.; Denora, M.; Zenobi, S.; Deligios, P.; Orsini, R.; Santilocchi, R.; Perniola, M.; Montanarella, L.; Ledda, L. A machine learning modeling framework for Triticum turgidum subsp. durum Desf. yield forecasting in Italy. Agron. J.; 2024; 116, pp. 1050-1070. [DOI: https://dx.doi.org/10.1002/agj2.21279]
69. Ramos, A.P.M.; Osco, L.P.; Furuya, D.E.G.; Gonçalves, W.N.; Santana, D.C.; Teodoro, L.P.R.; Junior, C.A.d.S.; Capristo-Silva, G.F.; Li, J.; Baio, F.H.R. et al. A random forest ranking approach to predict yield in maize with uav-based vegetation spectral indices. Comput. Electron. Agric.; 2020; 178, 105791. [DOI: https://dx.doi.org/10.1016/j.compag.2020.105791]
70. Fei, S.; Hassan, M.A.; He, Z.; Chen, Z.; Shu, M.; Wang, J.; Li, C.; Xiao, Y. Assessment of Ensemble Learning to Predict Wheat Grain Yield Based on UAV-Multispectral Reflectance. Remote Sens.; 2021; 13, 2338. [DOI: https://dx.doi.org/10.3390/rs13122338]
71. Yang, G.; Li, Y.; Yuan, S.; Zhou, C.; Xiang, H.; Zhao, Z.; Wei, Q.; Chen, Q.; Peng, S.; Xu, L. Enhancing direct-seeded rice yield prediction using UAV-derived features acquired during the reproductive phase. Precis. Agric.; 2023; 25, pp. 834-864. [DOI: https://dx.doi.org/10.1007/s11119-023-10103-y]
72. Alzubaidi, L.; Zhang, J.; Humaidi, A.J.; Al-Dujaili, A.; Duan, Y.; Al-Shamma, O.; Santamaría, J.; Fadhel, M.A.; Al-Amidie, M.; Farhan, L. Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. J. Big Data; 2021; 8, 53. [DOI: https://dx.doi.org/10.1186/s40537-021-00444-8]
73. Liang, Y.; Li, H.; Wu, H.; Zhao, Y.; Liu, Z.; Liu, D.; Liu, Z.; Fan, G.; Pan, Z.; Shen, Z. et al. A rotated rice spike detection model and a crop yield estimation application based on UAV images. Comput. Electron. Agric.; 2024; 224, 109188. [DOI: https://dx.doi.org/10.1016/j.compag.2024.109188]
74. Bian, C.; Shi, H.; Wu, S.; Zhang, K.; Wei, M.; Zhao, Y.; Sun, Y.; Zhuang, H.; Zhang, X.; Chen, S. Prediction of Field-Scale Wheat Yield Using Machine Learning Method and Multi-Spectral UAV Data. Remote Sens.; 2022; 14, 1474. [DOI: https://dx.doi.org/10.3390/rs14061474]
75. Habibi, L.N.; Matsui, T.; Tanaka, T.S. Critical evaluation of the effects of a cross-validation strategy and machine learning optimization on the prediction accuracy and transferability of a soybean yield prediction model using UAV-based remote sensing. J. Agric. Food Res.; 2024; 16, 101096. [DOI: https://dx.doi.org/10.1016/j.jafr.2024.101096]
76. Mia, S.; Tanabe, R.; Habibi, L.N.; Hashimoto, N.; Homma, K.; Maki, M.; Matsui, T.; Tanaka, T.S.T. Multimodal Deep Learning for Rice Yield Prediction Using UAV-Based Multispectral Imagery and Weather Data. Remote Sens.; 2023; 15, 2511. [DOI: https://dx.doi.org/10.3390/rs15102511]
77. Pukrongta, N.; Taparugssanagorn, A.; Sangpradit, K. Enhancing Crop Yield Predictions with PEnsemble 4: IoT and ML-Driven for Precision Agriculture. Appl. Sci.; 2024; 14, 3313. [DOI: https://dx.doi.org/10.3390/app14083313]
78. Togninalli, M.; Wang, X.; Kucera, T.; Shrestha, S.; Juliana, P.; Mondal, S.; Pinto, F.; Govindan, V.; Crespo-Herrera, L.; Huerta-Espino, J. et al. Multi-modal deep learning improves grain yield prediction in wheat breeding by fusing genomics and phenomics. Bioinformatics; 2023; 39, btad336. [DOI: https://dx.doi.org/10.1093/bioinformatics/btad336]
79. Fei, S.; Hassan, M.A.; Xiao, Y.; Su, X.; Chen, Z.; Cheng, Q.; Duan, F.; Chen, R.; Ma, Y. UAV-based multi-sensor data fusion and machine learning algorithm for yield prediction in wheat. Precis. Agric.; 2023; 24, pp. 187-212. [DOI: https://dx.doi.org/10.1007/s11119-022-09938-8]
80. Yang, R.; Zhou, J.; Lu, X.; Shen, J.; Chen, H.; Chen, M.; He, Y.; Liu, F. A robust rice yield estimation framework developed by grading modeling and normalized weight decision-making strategy using UAV imaging technology. Comput. Electron. Agric.; 2023; 215, 108417. [DOI: https://dx.doi.org/10.1016/j.compag.2023.108417]
81. Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016.
82. Ma’sum, M.A.; Pratama, M.; Savitha, R.; Liu, L.; Kowalczyk, R. Unsupervised Few-Shot Continual Learning for Remote Sensing Image Scene Classification. IEEE Trans. Geosci. Remote Sens.; 2024; 62, 4707214. [DOI: https://dx.doi.org/10.1109/tgrs.2024.3445887]
83. Zhu, H.; Shen, C.; Wang, J.; Chen, B.; Wang, D. et al. Few-Shot Class-Incremental Learning with Adjustable Pseudo-Incremental Sessions for Bearing Fault Diagnosis. IEEE Sensors J.; 2024; 24, pp. 19543-19552. [DOI: https://dx.doi.org/10.1109/JSEN.2024.3395515]
84. Ahmed, M.; Mustafa, H.; Wu, M.; Babaei, M.; Kong, L.; Jeong, N.; Gan, Y. Few shot learning for avocado maturity determination from microwave images. J. Agric. Food Res.; 2024; 15, 100977. [DOI: https://dx.doi.org/10.1016/j.jafr.2024.100977]
85. Cai, Z.; He, M.; Li, C.; Qi, H.; Bai, R.; Yang, J.; Zhang, C. Identification of chrysanthemum using hyperspectral imaging based on few-shot class incremental learning. Comput. Electron. Agric.; 2023; 215, 108371. [DOI: https://dx.doi.org/10.1016/j.compag.2023.108371]
86. Sreejith, S.; Nehemiah, H.K.; Kannan, A. Clinical data classification using an enhanced SMOTE and chaotic evolutionary feature selection. Comput. Biol. Med.; 2020; 126, 103991. [DOI: https://dx.doi.org/10.1016/j.compbiomed.2020.103991]
87. Kourehpaz, P.; Hutt, C.M. Machine Learning for Enhanced Regional Seismic Risk Assessments. J. Struct. Eng.; 2022; 148, 04022126. [DOI: https://dx.doi.org/10.1061/(ASCE)ST.1943-541X.0003421]
88. Ke, H.; Gong, S.; He, J.; Zhang, L.; Mo, J. A hybrid XGBoost-SMOTE model for optimization of operational air quality numerical model forecasts. Front. Environ. Sci.; 2022; 10, 1007530. [DOI: https://dx.doi.org/10.3389/fenvs.2022.1007530]
89. Ren, P.; Li, H.; Han, S.; Chen, R.; Yang, G.; Yang, H.; Feng, H.; Zhao, C. Estimation of Soybean Yield by Combining Maturity Group Information and Unmanned Aerial Vehicle Multi-Sensor Data Using Machine Learning. Remote Sens.; 2023; 15, 4286. [DOI: https://dx.doi.org/10.3390/rs15174286]
90. Li, B.; Xu, X.; Zhang, L.; Han, J.; Bian, C.; Li, G.; Liu, J.; Jin, L. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens.; 2020; 162, pp. 161-172. [DOI: https://dx.doi.org/10.1016/j.isprsjprs.2020.02.013]
91. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ.; 2020; 237, 111599. [DOI: https://dx.doi.org/10.1016/j.rse.2019.111599]
92. Yang, Q.; Shi, L.; Han, J.; Zha, Y.; Zhu, P. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crop. Res.; 2019; 235, pp. 142-153. [DOI: https://dx.doi.org/10.1016/j.fcr.2019.02.022]
93. Thakur, P.S.; Sheorey, T.; Ojha, A. VGG-ICNN: A Lightweight CNN model for crop disease identification. Multimedia Tools Appl.; 2023; 82, pp. 497-520. [DOI: https://dx.doi.org/10.1007/s11042-022-13144-z]
94. Chen, W.; Chen, J.; Duan, R.; Fang, Y.; Ruan, Q.; Zhang, D. MS-DNet: A mobile neural network for plant disease identification. Comput. Electron. Agric.; 2022; 199, 107175. [DOI: https://dx.doi.org/10.1016/j.compag.2022.107175]
95. Liu, Y.; Zhang, X.; Gao, Y.; Qu, T.; Shi, Y. Improved CNN Method for Crop Pest Identification Based on Transfer Learning. Comput. Intell. Neurosci.; 2022; 2022, 9709648. [DOI: https://dx.doi.org/10.1155/2022/9709648]
96. Guruprakash, K.S.; Siva Karthik, P.; Ramachandran, A.; Gayathri, K. Crop pest identification using deep network based extracted features and MobileENet in smart agriculture. Land Degrad. Dev.; 2024; 35, pp. 3642-3652. [DOI: https://dx.doi.org/10.1002/ldr.5157]
97. Taylor, J.A.; McBratney, A.B.; Whelan, B.M. Establishing Management Classes for Broadacre Agricultural Production. Agron. J.; 2007; 99, pp. 1366-1376. [DOI: https://dx.doi.org/10.2134/agronj2007.0070]
98. Nevavuori, P.; Narra, N.; Lipping, T. Crop yield prediction with deep convolutional neural networks. Comput. Electron. Agric.; 2019; 163, 104859. [DOI: https://dx.doi.org/10.1016/j.compag.2019.104859]
99. Li, Z.; Chen, Z.; Cheng, Q.; Fei, S.; Zhou, X. Deep Learning Models Outperform Generalized Machine Learning Models in Predicting Winter Wheat Yield Based on Multispectral Data from Drones. Drones; 2023; 7, 505. [DOI: https://dx.doi.org/10.3390/drones7080505]
100. Cui, Y.; Ji, Y.; Liu, R.; Li, W.; Liu, Y.; Liu, Z.; Zong, X.; Yang, T. Faba Bean (Vicia faba L.) Yield Estimation Based on Dual-Sensor Data. Drones; 2023; 7, 378. [DOI: https://dx.doi.org/10.3390/drones7060378]
101. Han, J.; Zhang, Z.; Cao, J.; Luo, Y.; Zhang, L.; Li, Z.; Zhang, J. Prediction of Winter Wheat Yield Based on Multi-Source Data and Machine Learning in China. Remote Sens.; 2020; 12, 236. [DOI: https://dx.doi.org/10.3390/rs12020236]
102. Filippi, P.; Jones, E.J.; Wimalathunge, N.S.; Somarathna, P.D.S.N.; Pozza, L.E.; Ugbaje, S.U.; Jephcott, T.G.; Paterson, S.E.; Whelan, B.M.; Bishop, T.F.A. An approach to forecast grain crop yield using multi-layered, multi-farm data sets and machine learning. Precis. Agric.; 2019; 20, pp. 1015-1029. [DOI: https://dx.doi.org/10.1007/s11119-018-09628-4]
103. Filippi, P.; Whelan, B.M.; Vervoort, R.W.; Bishop, T.F. Mid-season empirical cotton yield forecasts at fine resolutions using large yield mapping datasets and diverse spatial covariates. Agric. Syst.; 2020; 184, 102894. [DOI: https://dx.doi.org/10.1016/j.agsy.2020.102894]
104. Nguyen, L.H.; Zhu, J.; Lin, Z.; Du, H.; Yang, Z.; Guo, W. Spatial-Temporal Multi-Task Learning for within-Field Cotton Yield Prediction. Advances in Knowledge Discovery and Data Mining, PAKDD 2019; Lecture Notes in Computer Science; Yang, Q.; Zhou, Z.H.; Gong, Z.; Zhang, M.L.; Huang, S.J., Eds.; Springer: Cham, Switzerland, 2019; Volume 11439, pp. 343-354.
105. Shi, J.; Zheng, X.; Li, Y.; Zhang, Q.; Ying, S. Multimodal Neuroimaging Feature Learning with Multimodal Stacked Deep Polynomial Networks for Diagnosis of Alzheimer’s Disease. IEEE J. Biomed. Health Inform.; 2018; 22, pp. 173-183. [DOI: https://dx.doi.org/10.1109/JBHI.2017.2655720]
106. Hua, Y.; Feng, Z.; Song, X.; Wu, X.-J.; Kittler, J. MMDG-DTI: Drug–target interaction prediction via multimodal feature fusion and domain generalization. Pattern Recognit.; 2025; 157, 110887. [DOI: https://dx.doi.org/10.1016/j.patcog.2024.110887]
107. Wang, Y.; Chen, X.; Li, J.; Lu, Z. Convolutional Block Attention Module–Multimodal Feature-Fusion Action Recognition: Enabling Miner Unsafe Action Recognition. Sensors; 2024; 24, 4557. [DOI: https://dx.doi.org/10.3390/s24144557] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/39065955]
108. Chen, L.; Li, M.; Wu, M.; Pedrycz, W.; Hirota, K. Coupled Multimodal Emotional Feature Analysis Based on Broad-Deep Fusion Networks in Human–Robot Interaction. IEEE Trans. Neural Netw. Learn. Syst.; 2024; 35, pp. 9663-9673. [DOI: https://dx.doi.org/10.1109/TNNLS.2023.3236320] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/37021991]
109. Li, D.; Miao, Y.; Gupta, S.K.; Rosen, C.J.; Yuan, F.; Wang, C.; Wang, L.; Huang, Y. Improving Potato Yield Prediction by Combining Cultivar Information and UAV Remote Sensing Data Using Machine Learning. Remote Sens.; 2021; 13, 3322. [DOI: https://dx.doi.org/10.3390/rs13163322]
110. Fu, Z.; Jiang, J.; Gao, Y.; Krienke, B.; Wang, M.; Zhong, K.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W. et al. Wheat Growth Monitoring and Yield Estimation based on Multi-Rotor Unmanned Aerial Vehicle. Remote Sens.; 2020; 12, 508. [DOI: https://dx.doi.org/10.3390/rs12030508]
111. Teodoro, P.E.; Teodoro, L.P.R.; Baio, F.H.R.; Junior, C.A.d.S.; dos Santos, R.G.; Ramos, A.P.M.; Pinheiro, M.M.F.; Osco, L.P.; Gonçalves, W.N.; Carneiro, A.M. et al. Predicting Days to Maturity, Plant Height, and Grain Yield in Soybean: A Machine and Deep Learning Approach Using Multispectral Data. Remote Sens.; 2021; 13, 4632. [DOI: https://dx.doi.org/10.3390/rs13224632]
112. Yuan, J.; Zheng, Z.; Chu, C.; Wang, W.; Guo, L. A Hybrid Synthetic Minority Oversampling Technique and Deep Neural Network Framework for Improving Rice Yield Estimation in an Open Environment. Agronomy; 2024; 14, 1890. [DOI: https://dx.doi.org/10.3390/agronomy14091890]
113. Feng, X.; Zhao, C.; Wang, C.; Wu, H.; Miao, Y.; Zhang, J. A Vegetable Leaf Disease Identification Model Based on Image-Text Cross-Modal Feature Fusion. Front. Plant Sci.; 2022; 13, 918940. [DOI: https://dx.doi.org/10.3389/fpls.2022.918940]
114. Guo, Y.; Wang, H.; Wu, Z.; Wang, S.; Sun, H.; Senthilnath, J.; Wang, J.; Bryant, C.R.; Fu, Y. Modified Red Blue Vegetation Index for Chlorophyll Estimation and Yield Prediction of Maize from Visible Images Captured by UAV. Sensors; 2020; 20, 5055. [DOI: https://dx.doi.org/10.3390/s20185055]
115. Shafiee, S.; Lied, L.M.; Burud, I.; Dieseth, J.A.; Alsheikh, M.; Lillemo, M. Sequential forward selection and support vector regression in comparison to LASSO regression for spring wheat yield prediction based on UAV imagery. Comput. Electron. Agric.; 2021; 183, 106036. [DOI: https://dx.doi.org/10.1016/j.compag.2021.106036]
116. Adak, A.; Murray, S.C.; Božinović, S.; Lindsey, R.; Nakasagga, S.; Chatterjee, S.; Anderson, S.L.; Wilde, S. Temporal Vegetation Indices and Plant Height from Remotely Sensed Imagery Can Predict Grain Yield and Flowering Time Breeding Value in Maize via Machine Learning Regression. Remote Sens.; 2021; 13, 2141. [DOI: https://dx.doi.org/10.3390/rs13112141]
117. Bellis, E.S.; Hashem, A.A.; Causey, J.L.; Runkle, B.R.K.; Moreno-García, B.; Burns, B.W.; Green, V.S.; Burcham, T.N.; Reba, M.L.; Huang, X. Detecting Intra-Field Variation in Rice Yield with Unmanned Aerial Vehicle Imagery and Deep Learning. Front. Plant Sci.; 2022; 13, 716506. [DOI: https://dx.doi.org/10.3389/fpls.2022.716506] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/35401643]
118. Fan, J.; Zhou, J.; Wang, B.; de Leon, N.; Kaeppler, S.M.; Lima, D.C.; Zhang, Z. Estimation of Maize Yield and Flowering Time Using Multi-Temporal UAV-Based Hyperspectral Data. Remote Sens.; 2022; 14, 3052. [DOI: https://dx.doi.org/10.3390/rs14133052]
119. Alabi, T.R.; Abebe, A.T.; Chigeza, G.; Fowobaje, K.R. Estimation of soybean grain yield from multispectral high-resolution UAV data with machine learning models in West Africa. Remote Sens. Appl. Soc. Environ.; 2022; 27, 100782. [DOI: https://dx.doi.org/10.1016/j.rsase.2022.100782]
120. Prey, L.; Hanemann, A.; Ramgraber, L.; Seidl-Schulz, J.; Noack, P.O. UAV-Based Estimation of Grain Yield for Plant Breeding: Applied Strategies for Optimizing the Use of Sensors, Vegetation Indices, Growth Stages, and Machine Learning Algorithms. Remote Sens.; 2022; 14, 6345. [DOI: https://dx.doi.org/10.3390/rs14246345]
121. Guo, Y.; Xiao, Y.; Hao, F.; Zhang, X.; Chen, J.; de Beurs, K.; He, Y.; Fu, Y.H. Comparison of different machine learning algorithms for predicting maize grain yield using UAV-based hyperspectral images. Int. J. Appl. Earth Obs. Geoinf.; 2023; 124, 103528. [DOI: https://dx.doi.org/10.1016/j.jag.2023.103528]
122. Chatterjee, S.; Adak, A.; Wilde, S.; Nakasagga, S.; Murray, S.C. Cumulative temporal vegetation indices from unoccupied aerial systems allow maize (Zea mays L.) hybrid yield to be estimated across environments with fewer flights. PLoS ONE; 2023; 18, e0277804. [DOI: https://dx.doi.org/10.1371/journal.pone.0277804]
123. Ji, Y.; Liu, R.; Xiao, Y.; Cui, Y.; Chen, Z.; Zong, X.; Yang, T. Faba bean above-ground biomass and bean yield estimation based on consumer-grade unmanned aerial vehicle RGB images and ensemble learning. Precis. Agric.; 2023; 24, pp. 1439-1460. [DOI: https://dx.doi.org/10.1007/s11119-023-09997-5]
124. Zhou, H.; Yang, J.; Lou, W.; Sheng, L.; Li, D.; Hu, H. Improving grain yield prediction through fusion of multi-temporal spectral features and agronomic trait parameters derived from UAV imagery. Front. Plant Sci.; 2023; 14, 1217448. [DOI: https://dx.doi.org/10.3389/fpls.2023.1217448]
125. Baio, F.H.R.; Santana, D.C.; Teodoro, L.P.R.; de Oliveira, I.C.; Gava, R.; de Oliveira, J.L.G.; Junior, C.A.d.S.; Teodoro, P.E.; Shiratsuchi, L.S. Maize Yield Prediction with Machine Learning, Spectral Variables and Irrigation Management. Remote Sens.; 2023; 15, 79. [DOI: https://dx.doi.org/10.3390/rs15010079]
126. Kumar, C.; Mubvumba, P.; Huang, Y.; Dhillon, J.; Reddy, K. Multi-Stage Corn Yield Prediction Using High-Resolution UAV Multispectral Data and Machine Learning Models. Agronomy; 2023; 13, 1277. [DOI: https://dx.doi.org/10.3390/agronomy13051277]
127. Shafi, U.; Mumtaz, R.; Anwar, Z.; Ajmal, M.M.; Khan, M.A.; Mahmood, Z.; Qamar, M.; Jhanzab, H.M. Tackling Food Insecurity Using Remote Sensing and Machine Learning-Based Crop Yield Prediction. IEEE Access; 2023; 11, pp. 108640-108657. [DOI: https://dx.doi.org/10.1109/ACCESS.2023.3321020]
128. Avneri, A.; Aharon, S.; Brook, A.; Atsmon, G.; Smirnov, E.; Sadeh, R.; Abbo, S.; Peleg, Z.; Herrmann, I.; Bonfil, D.J. et al. UAS-based imaging for prediction of chickpea crop biophysical parameters and yield. Comput. Electron. Agric.; 2023; 205, 107581. [DOI: https://dx.doi.org/10.1016/j.compag.2022.107581]
129. Wei, L.; Yang, H.; Niu, Y.; Zhang, Y.; Xu, L.; Chai, X. Wheat biomass, yield, and straw-grain ratio estimation from multi-temporal UAV-based RGB and multispectral images. Biosyst. Eng.; 2023; 234, pp. 187-205. [DOI: https://dx.doi.org/10.1016/j.biosystemseng.2023.08.002]
130. Ma, J.; Wu, Y.; Liu, B.; Zhang, W.; Wang, B.; Chen, Z.; Wang, G.; Guo, A. Wheat Yield Prediction Using Unmanned Aerial Vehicle RGB-Imagery-Based Convolutional Neural Network and Limited Training Samples. Remote Sens.; 2023; 15, 5444. [DOI: https://dx.doi.org/10.3390/rs15235444]
131. Li, Y.; Zhao, B.; Wang, J.; Li, Y.; Yuan, Y. Winter Wheat Yield Estimation Based on Multi-Temporal and Multi-Sensor Remote Sensing Data Fusion. Agriculture; 2023; 13, 2190. [DOI: https://dx.doi.org/10.3390/agriculture13122190]
132. de Sa Leitão, D.A.H.; Sharma, A.K.; Singh, A.; Sharma, L.K. Yield and plant height predictions of irrigated maize through unmanned aerial vehicle in North Florida. Comput. Electron. Agric.; 2023; 215, 108374. [DOI: https://dx.doi.org/10.1016/j.compag.2023.108374]
133. Alam Shammi, S.; Huang, Y.; Feng, G.; Tewolde, H.; Zhang, X.; Jenkins, J.; Shankle, M. Application of UAV Multispectral Imaging to Monitor Soybean Growth with Yield Prediction through Machine Learning. Agronomy; 2024; 14, 672. [DOI: https://dx.doi.org/10.3390/agronomy14040672]
134. Shen, Y.; Yan, Z.; Yang, Y.; Tang, W.; Sun, J.; Zhang, Y. Application of UAV-Borne Visible-Infared Pushbroom Imaging Hyperspectral for Rice Yield Estimation Using Feature Selection Regression Methods. Sustainability; 2024; 16, 632. [DOI: https://dx.doi.org/10.3390/su16020632]
135. Killeen, P.; Kiringa, I.; Yeap, T.; Branco, P. Corn Grain Yield Prediction Using UAV-Based High Spatiotemporal Resolution Imagery, Machine Learning, and Spatial Cross-Validation. Remote Sens.; 2024; 16, 683. [DOI: https://dx.doi.org/10.3390/rs16040683]
136. Liu, Z.; Ji, Y.; Ya, X.; Liu, R.; Liu, Z.; Zong, X.; Yang, T. Ensemble Learning for Pea Yield Estimation Using Unmanned Aerial Vehicles, Red Green Blue, and Multispectral Imagery. Drones; 2024; 8, 227. [DOI: https://dx.doi.org/10.3390/drones8060227]
137. Ali, N.; Mohammed, A.; Bais, A.; Berraies, S.; Ruan, Y.; Cuthbert, R.D.; Sangha, J.S. Field Scale Precision: Predicting Grain Yield of Diverse Wheat Breeding Lines Using High-Throughput UAV Multispectral Imaging. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.; 2024; 17, pp. 11419-11433. [DOI: https://dx.doi.org/10.1109/JSTARS.2024.3411994]
138. Szigeti, N.; Sulyán, P.G.; Labus, B.; Földi, M.; Hunyadi, É.; Mikó, P.; Milibák, F.; Drexler, D. Limitations and solutions for developing a grain yield and protein content forecasting model based on vegetation indices in organic wheat production—On-farm experimentation. Biol. Agric. Hortic.; 2024; 40, pp. 190-204. [DOI: https://dx.doi.org/10.1080/01448765.2024.2364304]
139. Sun, X.; Zhang, P.; Wang, Z.; Wang, Y. Potential of multi-seasonal vegetation indices to predict rice yield from UAV multispectral observations. Precis. Agric.; 2024; 25, pp. 1235-1261. [DOI: https://dx.doi.org/10.1007/s11119-023-10109-6]
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
Preharvest crop yield estimation is crucial for achieving food security and managing crop growth. Unmanned aerial vehicles (UAVs) can quickly and accurately acquire field crop growth data and are an important platform for collecting agricultural remote sensing data. With the rapid development of machine learning, especially deep learning, research on yield estimation based on UAV remote sensing data and machine learning has achieved excellent results. This paper systematically reviews current research on yield estimation based on UAV remote sensing and machine learning through a search of 76 articles, covering the grain crops studied, research questions, data collection, feature selection, optimal yield estimation models, and optimal growth periods for yield estimation. Through visual and narrative analysis, the conclusions address all of the proposed research questions. Wheat, corn, rice, and soybeans are the main research objects, and the effects of nitrogen fertilizer application, irrigation, crop variety diversity, and genetic diversity have received widespread attention. In the modeling process, feature selection is the key to improving the robustness and accuracy of the model. Whether yield estimation is based on single-modal or multimodal features, multispectral images are the main source of feature information. The optimal yield estimation model may vary with the selected features and the period of data collection, but random forest and convolutional neural networks still perform best in most cases. Finally, this study examines the challenges currently faced in terms of data volume, feature selection and optimization, determining the optimal growth period, algorithm selection and application, and the limitations of UAVs. Further research is needed in areas such as data augmentation, feature engineering, algorithm improvement, and real-time yield estimation.
Details

1 College of Information Science & Technology, Hebei Agricultural University, Baoding 071001, China; Academy of National Food and Strategic Reserves Administration, Beijing 100037, China
2 Agricultural Information Institute, Chinese Academy of Agricultural Sciences, Beijing 100086, China
3 College of Information Science & Technology, Hebei Agricultural University, Baoding 071001, China
4 College of Information Science & Technology, Hebei Agricultural University, Baoding 071001, China; Big Data Development Center, Ministry of Agriculture and Rural Affairs, Beijing 100125, China
5 College of Information Science & Technology, Hebei Agricultural University, Baoding 071001, China; Agricultural Information Institute, Chinese Academy of Agricultural Sciences, Beijing 100086, China