1. Introduction
Sweetpotato (Ipomoea batatas (L.) Lam.), a herbaceous plant of the genus Ipomoea in the family Convolvulaceae, is an important food crop that can also be used as an industrial raw material and as fodder. It is not only easy to cultivate and versatile, but also highly productive. Indeed, globally, it ranks as the seventh most important staple food; in developing countries, it ranks fifth, after rice, wheat, maize, and cassava. As with other crops, diseases and pests are among the most important factors jeopardizing sweetpotato production, and virus diseases are among the most damaging. Sweetpotato virus disease (SPVD), caused by the co-infection of sweetpotato chlorotic stunt virus (SPCSV) and sweetpotato feathery mottle virus (SPFMV), can lead to an over 90% reduction in sweetpotato yield, and even to crop failure in severe cases [1]. In addition, the prevalence of virus diseases is an important factor in the decline of sweetpotato quality and germplasm degradation [2]. Currently, no high-quality virus-resistant varieties exist, and effective control agents are lacking. Therefore, early diagnosis and warning of SPVD, the timely removal of virus-carrying plants, and early avoidance of virus transmission are the most effective prevention and control measures [3].
Traditional SPVD detection methods mainly include serological detection [4], indicator plant detection [5], and molecular biology detection [6]. Although these methods are highly sensitive, they require specialized equipment and trained personnel, are costly and inefficient, and struggle to meet the demands of large-scale industrial services; moreover, their accuracy is affected by operator subjectivity, which can easily cause the optimal window for prevention and control to be missed [7].
Hyperspectral remote sensing captures subtle spectral variations in plant leaves across the visible, near-infrared, and short-wave infrared regions by collecting hundreds of continuous, narrow-band spectral data points [8]. This high-resolution spectral information reveals minute, disease-induced changes in chlorophyll content, moisture levels, cellular structure, and other physiological and biochemical parameters, providing a theoretical foundation and data support for early disease diagnosis [9,10].
Unmanned Aerial Vehicles (UAVs) equipped with hyperspectral cameras offer key advantages, including high spatial resolution, flexible deployment, and rapid data acquisition over large areas under natural lighting conditions. By conducting air-to-ground remote sensing, UAVs can collect canopy-level spectral data that reflect crop structural attributes (e.g., leaf area index (LAI), biomass) [11], spatial heterogeneity (e.g., pest and disease patches, fertility gradients), and interactions between plants and their environment [12].
Several studies have demonstrated the effectiveness of UAV hyperspectral sensing in crop disease detection. For example, after identifying relevant spectral bands associated with changes in pigment concentration and leaf structure caused by sugarcane mosaic virus, researchers developed the anthocyanin red-edge index, which effectively distinguishes virus-affected areas in sugarcane [13]. Shi et al. proposed an end-to-end deep learning model, CropdocNet, which combines multi-level spectral–spatial features for accurate and automated diagnosis of crop diseases and pests, including sugarcane mosaic virus and potato late blight, using UAV-based hyperspectral imagery [14]. Similarly, Mickey et al. analyzed two grapevine viruses—grapevine leafroll disease (GLD) and Shiraz disease (SD)—in Australian vineyards, identifying their unique spectral signatures and optimal detection windows during the growing season to support targeted disease management [15].
Currently, hyperspectral disease identification techniques have been primarily applied to major crops such as rice [16], maize [17], and wheat [18]. However, there is currently a limited body of research and insufficient mechanistic understanding regarding hyperspectral responses to SPVD. Therefore, investigating UAV-based hyperspectral remote sensing for characterizing the spectral response patterns of SPVD holds significant potential for improving early detection and precise disease monitoring in sweetpotato cultivation.
In recent years, Convolutional Neural Networks (CNNs) have been increasingly applied in hyperspectral crop classification tasks [19,20]. CNNs are capable of learning underlying patterns in data without requiring knowledge of its statistical distribution, and they can extract both linear and nonlinear features without relying on prior domain-specific knowledge [21,22,23]. For instance, Trivedi et al. employed deep-learning-based CNN models to classify normal and abnormal potato leaves affected by fungal infections (e.g., early and late blight) into multiple categories [24]. Zeng et al. developed a multiscale selective-attention CNN (MSA-CNN) model for the early detection of powdery mildew in rubber trees [25]. Similarly, Bhatti et al. introduced a mobile application powered by a CNN model that can identify plant diseases and offer management recommendations, enabling farmers and agricultural practitioners to diagnose crop diseases quickly and accurately [26].
Despite the advantages of deep learning (DL) methods in disease detection, most existing approaches primarily focus on pixel- or leaf-level classification tasks [27,28]. While these methods have shown promising performance under controlled laboratory conditions, they often face significant limitations in real-world field environments. Challenges such as illumination variation between plants, strong background interference, and inconsistent predictions across different regions of the same plant canopy reduce the spatial consistency and reliability of classification results. Compared with existing convolutional neural network frameworks (such as ResNet, DenseNet, and U-Net), the PLCNet framework proposed in this study not only provides higher classification accuracy, but also achieves stronger spatial consistency and enhanced biological interpretability, making it particularly suitable for efficient disease detection in complex field environments.
To address the limitations of existing methods, this study proposes PLCNet (Plant-Level Classification Network), a plant-level SPVD recognition framework for UAV-based hyperspectral imaging, built upon a 3D Convolutional Neural Network (3D-CNN) (Figure 1). Unlike traditional approaches that classify each pixel independently, PLCNet utilizes high-resolution hyperspectral images captured by UAV-mounted sensors and employs the Random Forest (RF) algorithm for optimal spectral band selection. To address multicollinearity, Variance Inflation Factor (VIF) analysis is performed on the combined RF-selected bands and vegetation indices, ensuring the selection of SPVD-sensitive, non-redundant features. These filtered feature bands are then input into a 3D-CNN, which extracts deep spectral–spatial features for robust and discriminative classification.
To enhance spatial consistency in the resulting classification maps, PLCNet integrates a post-classification refinement module that applies connected component analysis and majority voting, ensuring that each individual plant is assigned a consistent and biologically meaningful label. This comprehensive framework significantly improves the classification accuracy, spatial coherence, and biological interpretability of the method, offering a scalable and practical solution for the early detection of SPVD and high-throughput field-level monitoring.
2. Materials and Methods
2.1. Study Area and Sample Collection
The experimental site for this study was located at the base of the Xuzhou Academy of Agricultural Sciences in the Xuzhou Economic Development Zone, Xuzhou City, Jiangsu Province, China (N34°16′57″–N34°16′59″, E117°17′25″–E117°17′27″, elevation: 36 m). A sweetpotato field within this area was selected for hyperspectral remote sensing data acquisition. Xuzhou’s topography is primarily composed of plains, and it lies within the warm-temperate monsoon climate zone. Spanning east to west, the region exhibits climatic variation influenced by proximity to the ocean. The eastern part of Xuzhou experiences a warm-temperate humid monsoon climate, while the western part is characterized by a warm-temperate semi-humid climate. With four distinct seasons, abundant sunlight, moderate rainfall, and a synchrony of heat and precipitation, the region provides favorable climatic conditions for crop cultivation.
Three widely promoted and representative sweetpotato varieties were selected for the study. Xu Zishu No. 8 (designated FR-1) exhibits mostly crested or shallowly lobed leaves that are dark green in color with a purple halo and purplish-red veins. Xu Shu 37 (FR-2) has predominantly heart-shaped or shallowly lobed leaves that are dark green in color. Shangshu 19 (FR-3) features heart-shaped or shallowly lobed leaves with single notches that are also dark green in appearance. The sweetpotato seedlings used in the experiment were obtained from the Key Laboratory of Sweetpotato Biology and Genetic Breeding of the Ministry of Agriculture and Rural Affairs [29]. The healthy group consisted of virus-free seedlings verified through virus detection. Correspondingly, the infected group included seedlings that only tested positive for the following two viruses: sweetpotato feathery mottle virus (SPFMV) and sweetpotato chlorotic stunt virus (SPCSV). The infected variants were designated as SPVD-1, SPVD-2, and SPVD-3, corresponding to FR-1, FR-2, and FR-3, respectively.
Virus detection was conducted using RT-PCR (Eppendorf AG, Hamburg, Germany), with a TaKaRa MiniBEST Plant RNA Extraction Kit (TaKaRa Bio Dalian Co., Ltd., Dalian, China) [30]. A total of six experimental plots were established (Figure 2), each measuring 10 m × 8 m. Rows were spaced 1 m apart, with a 1 m distance between individual plants, resulting in 80 plants per plot. For each variety, virus-free (healthy) seedlings were planted on the left side of the plot, and SPVD-infected seedlings were planted on the right. To prevent cross-contamination, a 2 m-high insect-proof net and a protective buffer zone were installed between healthy and infected plots. All plots were managed under uniform irrigation and fertilization protocols.
After the acquisition of hyperspectral images via unmanned aerial vehicle (UAV) flights, chlorophyll content was measured using a SPAD-502 portable chlorophyll meter (Konica Minolta, Inc., Tokyo, Japan) to further assess the health and disease status of the sweetpotato plants. Within each plot, three rows of representative sweetpotato plants were selected for measurement. For each plant, chlorophyll readings were taken at three upper leaf positions, avoiding the main veins. Each leaf was measured three times, and the average value was recorded to ensure accuracy. To maintain statistical reliability, a minimum of six valid chlorophyll measurements were collected from each row.
2.2. UAV Data Collection and Preprocessing
The DJI Matrice 300 RTK (DJI Innovation, Shenzhen, China) used for airborne hyperspectral image acquisition in this study primarily consisted of a flight platform, flight control system, motorized gimbal, ground station control system, and data export module. The hyperspectral data acquisition system was a Cubert S185 imaging spectrometer (Cubert GmbH, Ulm, Germany).
Hyperspectral data were acquired under optimal environmental conditions: clear skies, no cloud cover, and minimal wind. Data collection was conducted between 10:00 a.m. and 12:00 p.m. during the early growth stage of sweetpotato (on 4 August 2024) to minimize spectral interference and maximize disease detection sensitivity. The UAV was flown at an altitude of 12 m, with an 80% image overlap rate, a flight speed of 1 m·s⁻¹, and a resulting spatial resolution of 0.005 m. An additional UAV hyperspectral image was acquired on 17 August 2024 for model validation. Early-stage image acquisition was prioritized because of the advantages of lower background interference and reduced leaf overlap, which are essential for enhancing the detectability of early disease symptoms and facilitating timely intervention.
During data collection, hyperspectral signals can be affected by external environmental factors or intrinsic sensor noise, often manifesting as fluctuations or spikes in the spectral curve. To improve spectral smoothness, enhance the signal-to-noise ratio, and increase the accuracy of information extraction, spatial-domain smoothing was applied. The Savitzky–Golay (S-G) filter, a low-pass smoothing filter based on local polynomial least-squares fitting, was used for this purpose. The S-G filter is particularly effective in preserving the original shape (e.g., peaks and troughs) and width characteristics of spectral signals, while reducing high-frequency noise. In this study, a window size of 13 and a polynomial order of five were selected for convolutional smoothing, which significantly enhanced the spectral quality [31].
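As a minimal illustration of the smoothing step, the S-G filter with the stated window size of 13 and polynomial order of 5 can be applied along the spectral axis using SciPy; the cube dimensions below are placeholders rather than the actual S185 data shape.

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_spectra(cube, window=13, polyorder=5):
    """Savitzky-Golay smoothing along the last (spectral) axis of a
    hyperspectral cube of shape (rows, cols, bands)."""
    return savgol_filter(cube, window_length=window, polyorder=polyorder, axis=-1)

# Example: smooth a synthetic noisy spectrum (band count is illustrative)
noisy = np.sin(np.linspace(0, 4 * np.pi, 125)) + 0.05 * np.random.randn(125)
smoothed = smooth_spectra(noisy[None, None, :])[0, 0]
```

Because the fit is a local least-squares polynomial, peak and trough positions in the spectrum are largely preserved while high-frequency noise is attenuated.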
To enable accurate identification of SPVD, regions of interest (ROIs) were manually annotated using ENVI 5.3 (L3Harris Geospatial, Broomfield, CO, USA), guided by field-based ground truth observations and expert knowledge of crop health conditions. The dataset was classified into three categories as follows: healthy plants, diseased plants, and others. Approximately 50,000 pixels were selected per category to ensure balanced class representation and minimize training bias. These ROIs were subsequently converted into labeled maps (ground truth images), where each pixel was assigned a category label corresponding to its class. This labeling process ensured spatial alignment with the original hyperspectral imagery and provided high-quality, pixel-level annotations for supervised learning.
The resulting dataset served as the foundation for the subsequent steps, including sample extraction, feature selection, and model training. By integrating expert-guided annotations with precise UAV hyperspectral imagery, the study established a robust and representative dataset to support plant-level classification of SPVD.
2.3. Feature Selection
Given the high dimensionality and strong inter-band correlations in hyperspectral data, it is crucial to perform feature selection to improve model accuracy and eliminate redundant information. In this study, a two-step feature selection strategy was adopted to effectively reduce data dimensionality and enhance model performance.
In the first step, three widely used feature selection methods—Local Covariance Matrix (LCM), Minimum Redundancy Maximum Relevance (mRMR), and Random Forest (RF)—were employed to evaluate and extract the most informative hyperspectral bands from different perspectives.
(1). LCM evaluates the discriminative power of each spectral band by calculating the local variance within target regions of interest, involving steps such as local domain construction, covariance matrix computation, and feature screening based on local variability [32].
(2). mRMR selects features by maximizing relevance with the target variable while minimizing redundancy among features, using an objective function and an incremental search strategy to iteratively select the most informative bands [33].
(3). RF constructs an ensemble of decision trees and ranks spectral bands based on their contribution to model accuracy. Feature importance scores are calculated, and the top-k-ranked bands are selected as the optimal feature subset [34].
To ensure a fair comparison, the number of selected features was fixed at 30 for each method, and ten-fold cross-validation was used to validate the stability and reliability of the selected features.
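A minimal sketch of the RF-based band ranking described in step (3), using scikit-learn's impurity-based feature importances; the estimator settings here are illustrative assumptions, not the exact configuration used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def rank_bands_rf(X, y, top_k=30, n_estimators=200, seed=0):
    """Rank spectral bands by RF impurity importance and return the
    indices of the top-k bands. X: (n_samples, n_bands); y: class labels."""
    rf = RandomForestClassifier(n_estimators=n_estimators, random_state=seed)
    rf.fit(X, y)
    order = np.argsort(rf.feature_importances_)[::-1]  # most important first
    return order[:top_k]
```

Fixing `top_k` at 30 for every method, as described above, keeps the comparison between LCM, mRMR, and RF fair.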
In the second step, Variance Inflation Factor (VIF) analysis was applied to the feature subsets obtained from the first stage to eliminate highly collinear bands and ensure the stability of subsequent model parameters [35]. This two-step process not only reduces the initial feature space, thereby simplifying VIF computation, but also addresses the problem of multicollinearity among features.
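The VIF screening in this second step can be sketched in plain NumPy as follows; the cutoff of 10 is a common rule of thumb assumed here for illustration, not a threshold reported by the study.

```python
import numpy as np

def vif(X, j):
    """Variance inflation factor of column j: 1 / (1 - R^2) from regressing
    column j on the remaining columns (with intercept)."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([others, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 / max(1.0 - r2, 1e-12)

def drop_collinear(X, names, threshold=10.0):
    """Iteratively drop the feature with the largest VIF until all
    remaining VIFs fall below the threshold."""
    X = X.copy()
    names = list(names)
    while X.shape[1] > 1:
        vifs = [vif(X, j) for j in range(X.shape[1])]
        worst = int(np.argmax(vifs))
        if vifs[worst] < threshold:
            break
        X = np.delete(X, worst, axis=1)
        names.pop(worst)
    return X, names
```

Running this on the 30 pre-selected bands (and later on the vegetation indices) removes near-duplicate features while keeping the most independent ones.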
Additionally, 24 vegetation indices, grouped into three categories based on their biophysical and biochemical significance, were incorporated into the analysis (Table 1). VIF was used to identify and retain vegetation indices that are most sensitive to SPVD, ensuring the robustness of disease detection models. Through this comprehensive feature selection process, the dimensionality of the hyperspectral dataset is significantly reduced, leading to improved generalization ability, enhanced computational efficiency, and the more stable performance of subsequent regression or classification models.
2.4. Modeling Methods
In this study, four hyperspectral disease classification methods—Support Vector Machine (SVM), Gradient Boosting Decision Tree (GBDT), Residual Network (ResNet), and 3D Convolutional Neural Network (3D-CNN)—were employed for comparative analysis.
SVM is one of the most widely used classification models in remote sensing applications [52]. Its core principle involves projecting low-dimensional feature variables into a higher-dimensional space using kernel functions (e.g., linear, Gaussian), thereby constructing an optimal decision hyperplane that maximizes the margin between different classes.
GBDT is an ensemble learning algorithm based on the Boosting framework, which incrementally improves model performance by combining multiple weak learners. It has shown strong capabilities in handling complex, nonlinear relationships in hyperspectral data [53].
CNNs are a class of deep neural networks involving convolutional operations within a feedforward architecture. They are capable of automatically learning both shallow and deep discriminative features from data [54]. ResNet employs a deep architecture with residual connections, and has been widely used in tasks such as plant disease identification, object detection, and semantic segmentation [55]. By simultaneously processing spatial and spectral dimensions, 3D-CNNs further extend this capability, enabling the extraction of rich spectral–spatial features. To reduce computational complexity while maintaining high classification accuracy, depthwise separable convolutions are used, which significantly reduce the parameter count and training time [56,57,58,59].
In this study, a tailored 3D-CNN architecture was designed with three 3D convolutional layers to increase the number of spatial–spectral feature maps. This ensures that spatial information within different spectral bands is effectively captured without information loss. Each convolutional layer is followed by batch normalization and a ReLU activation function, enhancing training stability and nonlinearity. A hyperbolic tangent activation is applied in the fully connected layer to perform final feature mapping (Figure 1). For model optimization, the Adam optimizer with cosine annealing learning rate scheduling was adopted. The initial learning rate was set to 0.001, with a decay rate of 3% per epoch. An early stopping mechanism was also introduced to prevent overfitting; training was halted if the validation loss did not decrease for five consecutive epochs.
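A PyTorch sketch consistent with this description (three Conv3d–BatchNorm3d–ReLU blocks, a tanh fully connected stage, and Adam with cosine annealing); the channel widths, patch size, hidden dimension, and schedule length are assumptions for illustration, not the exact architecture used in the study.

```python
import torch
import torch.nn as nn

class Spectral3DCNN(nn.Module):
    """Three Conv3d-BatchNorm3d-ReLU blocks followed by a tanh FC stage.
    Input patches have shape (N, 1, bands, patch, patch)."""
    def __init__(self, n_bands, patch=5, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, 3, padding=1), nn.BatchNorm3d(8), nn.ReLU(),
            nn.Conv3d(8, 16, 3, padding=1), nn.BatchNorm3d(16), nn.ReLU(),
            nn.Conv3d(16, 32, 3, padding=1), nn.BatchNorm3d(32), nn.ReLU(),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * n_bands * patch * patch, 128), nn.Tanh(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = Spectral3DCNN(n_bands=10)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=50)
```

Early stopping would sit in the training loop: track the validation loss each epoch and halt once it fails to decrease for five consecutive epochs, as described above.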
Finally, we also implemented the CropdocNet model as a direct pixel-level baseline. CropdocNet consists of a spectral encoder, with two consecutive Conv3D–BatchNorm3D–ReLU layers (channels 1→8→16, kernel 3 × 3 × 3); a spectral–spatial encoder, with one Conv3D–BatchNorm3D–ReLU layer (16→32, kernel 3 × 3 × 3); adaptive pooling, comprising an AdaptiveAvgPool3d to reshape features to (bands, patch_size, patch_size); and a classifier, composed of a 512-unit fully connected layer and a final output layer (num_classes). It was trained on the same 70/30 split with patch_size = 5, batch_size = 128, Adam (lr = 1 × 10⁻³), and 20 epochs, without any post-processing. This allows for a fair comparison of raw voxel-level performance against our two-stage PLCNet pipeline.
2.5. Post-Processing Module
To further improve the spatial consistency of model prediction results and reduce the salt-and-pepper noise caused by local misclassifications, this study introduces a post-processing strategy for whole-plant-level classification, integrating connected component analysis with a majority voting mechanism. This strategy is grounded in the biological reality that virus infections in sweetpotato typically affect the entire plant. As such, the goal is to assign a consistent category label to each individual plant in the classification map, thereby enhancing both the biological validity and the practical reliability of the results.
The post-processing pipeline consists of the following three steps: (1). Plant Mask Extraction (PME):
A binary plant mask is generated from the initial classification map by removing non-plant regions, such as soil and shadows. This step ensures that subsequent analysis is restricted to valid plant areas only.
(2). Connected-Components Labeling:
Within the extracted plant mask, spatially connected regions are identified using 8-neighborhood connectivity analysis. Each connected component corresponds to a single sweetpotato plant or a tightly grouped cluster of plants in the field [60].
(3). Plant-Level Majority Voting (PLMV):
For each connected region, the predicted category of all pixels is aggregated, and a majority voting rule is applied to determine the dominant class. This class is then assigned uniformly to all pixels within the connected component, ensuring that each plant receives a consistent label [61,62].
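The three steps above can be sketched with `scipy.ndimage`; the class codes and background label used here are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def plant_level_refine(pred, background=0):
    """Assign every 8-connected plant region its majority-vote class.
    pred: 2-D integer class map; background marks non-plant pixels."""
    mask = pred != background                                # (1) plant mask extraction
    labels, n = ndimage.label(mask, structure=np.ones((3, 3), int))  # (2) 8-connectivity
    out = pred.copy()
    for region in range(1, n + 1):                           # (3) plant-level majority voting
        region_mask = labels == region
        vals, counts = np.unique(pred[region_mask], return_counts=True)
        out[region_mask] = vals[np.argmax(counts)]
    return out
```

Passing `structure=np.ones((3, 3))` to `ndimage.label` enforces the 8-neighborhood connectivity described in step (2); the default would be 4-connectivity.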
By applying this three-step refinement process, the inconsistencies typically observed along plant edges in initial classification maps are significantly reduced. The method improves spatial smoothness, enhances regional coherence, and strengthens the biological interpretability of the model output. Consequently, this approach is highly suited to real-world field applications, enabling more accurate, robust, and scalable identification of SPVD at the plant level.
2.6. Model Performance Evaluation
In this study, the dataset was divided into a training set and a testing set at a ratio of 7:3, ensuring that the class distribution remained consistent across both sets. To comprehensively evaluate the performance of the model in hyperspectral remote sensing classification tasks, four commonly used classification performance metrics were employed as follows:
Overall Accuracy (OA): OA refers to the proportion of correctly classified samples among all test samples. It is the most straightforward indicator for assessing the overall classification performance of a model. The calculation formula is

\mathrm{OA} = \frac{N_c}{N} \quad (1)

where $N_c$ is the number of correctly classified samples, and $N$ is the total number of samples.

F1-Score: the F1-score is the harmonic mean of Precision and Recall, balancing both completeness and exactness. It is particularly suitable for situations where there is a class imbalance:

F_1 = \frac{2 \times \mathrm{Precision} \times \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}} \quad (2)

Macro F1-Score: the Macro F1-Score is the arithmetic mean of the F1-scores across all C classes:

\mathrm{Macro\ F1} = \frac{1}{C} \sum_{i=1}^{C} F_{1,i} \quad (3)

where $F_{1,i}$ is the F1-score for class $i$, computed by treating class $i$ as the positive class and all other classes as negative. This macro-averaging ensures that each class contributes equally to the final metric, which is particularly useful in imbalanced class scenarios.

Mean User’s Accuracy (UA_mean): UA_mean is the average of the user’s accuracy (precision) calculated for each class:

\mathrm{UA_{mean}} = \frac{1}{C} \sum_{i=1}^{C} \frac{TP_i}{TP_i + FP_i} \quad (4)

where $TP_i$ and $FP_i$ represent the true positives and false positives for class $i$, respectively.

Mean Producer’s Accuracy (PA_mean): PA_mean is the average of the producer’s accuracy (recall) calculated for each class:

\mathrm{PA_{mean}} = \frac{1}{C} \sum_{i=1}^{C} \frac{TP_i}{TP_i + FN_i} \quad (5)

where $FN_i$ is the number of false negatives for class $i$.

In this study, the Kappa coefficient was not used due to its known limitations in remote sensing accuracy assessment [63]. Instead, UA_mean and PA_mean were included as they provide more interpretable, class-specific accuracy information that complements OA and F1.
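Metrics (1)–(5) can all be derived from a single confusion matrix, as sketched below; this is a generic NumPy implementation, not the evaluation code used in the study.

```python
import numpy as np

def classification_metrics(y_true, y_pred, n_classes):
    """Compute OA, Macro F1, UA_mean (mean precision), and PA_mean
    (mean recall) from integer class labels."""
    cm = np.zeros((n_classes, n_classes), dtype=int)  # rows: true, cols: predicted
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    tp = np.diag(cm).astype(float)
    ua = tp / np.maximum(cm.sum(axis=0), 1)  # user's accuracy = precision per class
    pa = tp / np.maximum(cm.sum(axis=1), 1)  # producer's accuracy = recall per class
    f1 = 2 * ua * pa / np.maximum(ua + pa, 1e-12)
    return {
        "OA": tp.sum() / cm.sum(),
        "MacroF1": f1.mean(),
        "UA_mean": ua.mean(),
        "PA_mean": pa.mean(),
    }
```

Because UA and PA are computed per class and then averaged, each class contributes equally regardless of its pixel count, which matches the macro-averaging rationale above.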
3. Results
3.1. Spectral Characteristics and SPAD of Healthy and Diseased Sweetpotato
To compare the spectral differences between healthy and diseased sweetpotato plants, the normalized average spectra of diseased pixels relative to healthy pixels were plotted for each plot (Figure 3). Generally, a distinct reflectance peak was observed at around 550 nm, followed by a plateau beginning near 770 nm. In the visible region (450–690 nm), the spectral reflectance of diseased sweetpotato leaves across all three varieties was consistently higher than that of healthy leaves. This region exhibited a characteristic pattern of increasing and then decreasing reflectance, with a pronounced “green peak” near 550 nm and a “red valley” around 690 nm.
In the near-infrared region (700–980 nm), a sharp rise in reflectance was observed. Healthy sweetpotato plants of all three varieties showed a more pronounced increase in reflectance compared to SPVD-infected plants. Notably, after 730 nm, the spectral reflectance of diseased leaves was consistently lower than that of healthy leaves, indicating a clear spectral distinction associated with disease presence.
This box plot (Figure 4) illustrates the distribution of leaf chlorophyll content (SPAD value) across three sweetpotato varieties (FR-1, FR-2, FR-3) under healthy and SPVD-infected conditions. The key finding is that SPVD induces significant chlorophyll loss. Both the median and mean SPAD values of all infected groups (SPVD-1/2/3) are substantially lower than those of their corresponding healthy groups (FR-1/2/3), confirming severe pigment degradation following viral infection. There were varietal differences in disease resistance; for FR-1 (susceptible), SPVD-1 shows the greatest decline in chlorophyll, with a median SPAD of ~28. For FR-2 (moderately resistant), SPVD-2 has a median SPAD of ~32. For FR-3 (highly resistant), even when infected (SPVD-3), the median SPAD remains at ~35, near the level of healthy FR-1, while healthy FR-3 exhibits the highest median (~38), underscoring its intrinsic physiological advantage. These results not only demonstrate SPVD’s detrimental impact on chlorophyll content, but also provide a strong rationale for integrating chlorophyll-related vegetation indices into hyperspectral models to enhance their accuracy and interpretability.
3.2. Feature Selection for SPVD
To identify the most informative spectral bands for SPVD classification, we compared three widely used feature selection methods—Local Covariance Matrix (LCM), Minimum Redundancy–Maximum Relevance (mRMR), and Random Forest (RF). As summarized in Table 1, each method produced a distinct subset of wavelength variables.
LCM was concentrated in the 790–926 nm region, which is sensitive to plant structural and physiological changes. mRMR yielded a broader distribution spanning both visible and near-infrared bands, notably around 674, 678, and 686 nm, displaying key chlorophyll absorption features. RF selected bands across the full spectrum, with emphasis on the 650–750 nm red-edge region and other chlorophyll-sensitive wavelengths.
To reduce multicollinearity and enhance robustness, we applied Variance Inflation Factor (VIF) analysis to each subset, removing highly collinear bands and yielding the following stable sets (Table 2).
These filtered wavelengths minimize redundancy, improving model interpretability and stability. Among the three approaches, RF’s impurity-based ranking criterion delivered the best model performance (OA = 91.36%); therefore, its selected bands were adopted for all subsequent model training (Table 3).
To further enhance the biological interpretability and classification performance of the model, 24 vegetation indices commonly associated with chlorophyll content, carotenoids, and plant stress were initially evaluated. VIF analysis was then applied to remove redundant and highly collinear indices, resulting in a subset of five optimal indices closely related to SPVD (Table 4).
Among the selected indices, three—PSSRb, DATT, and MDATT—are associated with chlorophyll content, reflecting the decline in chlorophyll commonly observed in virus-infected leaves. One index, the PRI (Photochemical Reflectance Index), is indicative of carotenoid activity and photosynthetic efficiency, often altered under stress conditions. The final index, ND800,530, is a normalized difference index that combines the near-infrared and visible bands to capture structural and pigment-related changes in vegetation.
These SPVD-sensitive indices were subsequently incorporated into the feature sets of the LCM, mRMR, and RF selection methods to build classification models. The integration of spectral features with biologically relevant vegetation indices is expected to improve the robustness and physiological interpretability of the disease classification models.
3.3. Evaluating the Effect of Feature Selection Methods and Classifiers on Recognition Performance
To evaluate the impact of different feature selection methods on classification performance, we tested three FS + Vis combinations (LCM + Vis, mRMR + Vis, RF + Vis) across the following four classifiers: SVM, GBDT, ResNet, and 3D-CNN. The results are summarized in Table 5.
Among the four classifiers, 3D-CNN consistently achieved the highest accuracy, with the best configuration—RF + Vis + 3D-CNN—reaching OA = 96.55%, F1 = 95.36%, UA_mean = 0.9498, and PA_mean = 0.9504. ResNet outperformed the two traditional models but fell slightly short of the 3D-CNN, achieving OA = 93.67%, F1 = 92.96%, UA_mean = 0.9351, and PA_mean = 0.9303 under RF + Vis. Both SVM and GBDT yielded similar results with LCM + Vis or mRMR + Vis (OA ≈ 91.3–91.5%, F1 ≈ 90.7–91.1%, UA_mean ≈ 0.9133–0.9333, PA_mean ≈ 0.9093–0.9299), demonstrating that while LCM and mRMR extract informative features, they cannot capture complex nonlinear spectral–spatial relationships as effectively as deep networks.
Notably, RF-based feature selection consistently outperformed LCM and mRMR for all classifiers—especially when paired with deep backbones—indicating that RF is more adept at identifying SPVD-sensitive bands and indices.
Overall, these findings highlight that (1) feature selection quality critically influences downstream performance; (2) deep learning models—particularly our 3D-CNN—excel at hyperspectral representation; and (3) the combination of RF-selected features with a deep 3D-CNN backbone offers the most robust and generalizable pipeline for SPVD detection.
3.4. Classification Performance of Hyperspectral Images of SPVD
To assess the spatial distribution of sweetpotato virus infections within the study area, selected characteristic spectral bands were used as input features for classification using the 3D-CNN model. The resulting classification maps of SPVD, based on hyperspectral imagery, are shown in Figure 5a–f. Overall, the healthy sweetpotato plants from all three varieties were accurately identified, with classification results largely aligning with ground truth observations. However, a few instances of misclassification were observed, particularly in Figure 5d, which may be attributed to spectral interference from non-plant elements such as shadows, senescent (yellow) leaves, or overlapping vegetation, thereby affecting the classification accuracy.
Figure 5g–l presents the classification results for virus-infected sweetpotato plants located within the central “protected rows.” The majority of these areas were accurately identified, and notably, the healthy plants in the protected rows were not misclassified, despite their proximity to infected plants. This suggests that the model effectively distinguished between diseased and healthy plants under spatially constrained conditions. Nonetheless, minor misclassifications occurred in a few plots, specifically in Figure 5k,l, which may be attributed to the low infection severity in certain diseased plants. Such cases can result in spectral signatures that closely resemble those of healthy plants, thus reducing the model’s discriminative capability.
Figure 6 qualitatively compares the pixel-level CropdocNet output with the plant-level PLCNet result on both a healthy FR-2 patch and an SPVD-3 infected patch.
Healthy FR-2 (Figure 6a–f): CropdocNet (Figure 6a–c) correctly identifies the majority of the healthy canopy, but leaves small holes within the mask and produces isolated green speckles on the bare soil. PLCNet (Figure 6d–f), after connected-component filtering and majority voting, yields a continuous, gap-free canopy outline that faithfully matches the true plant boundary and virtually eliminates stray soil pixels.
Infected SPVD-3 (Figure 6g–l): CropdocNet (Figure 6g–i) correctly delineates the general infection area but leaves small “holes” within lesions—misclassifying a few diseased pixels as healthy—and produces fragmented gaps inside individual infected plants. PLCNet (Figure 6j–l) fills these gaps via connected-component analysis and majority voting, resulting in solid, contiguous infection masks that better reflect the true extent of disease.
Overall, PLCNet’s post-processing markedly enhances spatial coherence, removing “salt-and-pepper” noise and consolidating fragmented predictions while preserving the accurate delineation of both healthy and diseased tissue in field conditions.
To further evaluate model robustness over time, hyperspectral images were acquired again 13 days later, and the trained model was used to generate a new classification map, as shown in Figure 7. The results indicate the high temporal stability of the model, with only two instances of misclassification in the healthy FR-1 plots (Figure 7f), which were potentially caused by weeds or fallen leaves being incorrectly identified as diseased sweetpotato. However, the model showed a slight increase in false negatives during this later growth stage. This may be explained by tissue regeneration and new biomass accumulation in plants that were initially infected but showed reduced visible symptoms, causing attenuated spectral signals and making detection more difficult (Figure 7k,l).
4. Discussion
4.1. Challenges in Hyperspectral Remote Sensing Diagnosis of SPVD
Remote-sensing-based diagnosis of SPVD faces significant technical and theoretical challenges, arising both from inherent complexities in virus detection and from the unique agronomic characteristics of sweetpotato crops. Unlike diseases caused by fungi or bacteria, plant viral diseases often involve simultaneous infection by multiple virus species, resulting in heterogeneous spectral responses that complicate the extraction of stable and discriminative spectral features [64,65]. Direct quantification of virus loads under field conditions also remains challenging, limiting the ability to establish precise quantitative relationships between remote sensing signals and viral infection severity. Additionally, observed spectral variations in infected plants largely reflect indirect physiological responses, such as reduced chlorophyll content, which can also be influenced by non-viral environmental stressors, reducing the specificity of spectral-based disease models [66,67,68].
These challenges are particularly pronounced under the dense planting conditions typical of sweetpotato cultivation. The multi-layered canopy structure can induce significant spectral mixing effects, diminishing the sensitivity of disease-specific spectral bands. Furthermore, early-stage SPVD symptoms are often subtle and obscured within the canopy, restricting the reliable detection of initial infection stages. Compared to structurally simpler crops like rice or wheat, SPVD diagnosis necessitates enhanced spatial–spectral coupling across multiple scales and requires robust classification models capable of mitigating background interference.
The transient masking of SPVD symptoms can be further understood by considering the vascular-confined spread of SPFMV and SPCSV, and sweetpotato’s inherent source–sink transport dynamics [2,65]. Both viruses colonize the plant’s phloem sieve tubes and move systemically through the nutrient-transport network. Older leaves and stem tissues, which act as primary “source” organs, typically accumulate higher viral titers earlier, displaying chlorosis, leaf curling, and mosaic symptoms. Conversely, newly emerging “sink” tissues (such as apical meristems and unfolding young leaves) initially remain less exposed to high viral concentrations due to their role in drawing photoassimilates rather than exporting nutrients. During periods of vigorous plant growth, these sink tissues may develop with relatively low viral loads and exhibit nearly normal spectral signatures, temporarily masking symptoms and generating false negatives in hyperspectral classifications. Over time, as the virus progressively invades these sink tissues, symptoms become apparent again, but the initial growth phase offers a critical window during which symptom expression is significantly attenuated.
To effectively overcome these challenges, hyperspectral remote-sensing-based detection of SPVD should integrate synergistic optimization across algorithm development, targeted data acquisition strategies, and deeper elucidation of underlying phenotypic mechanisms. This comprehensive, multidisciplinary approach is essential for substantially improving the accuracy, stability, and generalization capabilities of SPVD identification models in practical agricultural settings.
4.2. Spectral Response Mechanisms of SPVD-Infected Sweetpotato Leaves
Common symptoms of SPVD infection, such as yellowing, vein clearing, leaf curling, and plant dwarfing (Figure 4), are typically accompanied by a reduction in chlorophyll content. The spectral characteristics observed in Figure 3 are highly consistent with findings from previous hyperspectral near-surface remote sensing studies [15,25,69]. This consistency underscores the reliability of low-altitude UAV-based hyperspectral remote sensing in detecting sweetpotato viral diseases. It also suggests that SPVD-infected and healthy plants exhibit significant spectral differences across multiple wavelength regions, which can be leveraged to differentiate between healthy and diseased sweetpotato plants.
Notably, across all SPVD-infected samples, higher reflectance was observed in the red-edge region (700–740 nm), a spectral range known to be highly sensitive to plant stress responses [70,71]. In this study, the elevated red-edge reflectance is likely attributable to physiological stress caused by viral infection. In contrast, spectral variations in other regions were more variety-specific. In the visible region (450–700 nm), the selected wavelengths were strongly correlated with the primary absorption bands of chlorophyll a and b. SPVD-induced chlorophyll degradation resulted in noticeable increases in reflectance, particularly in the red region (approximately 660–690 nm), where weakened absorption indicates reduced photosynthetic efficiency in diseased plants [2]. These reflectance changes serve as sensitive indicators of virus-induced photosynthetic inhibition.
The red-edge region, representing the transition from red to near-infrared reflectance, is particularly sensitive to chlorophyll concentration and cellular structural changes. Alterations in the slope of the red-edge curve—or the observed “blue shift” of its position—reflect leaf health deterioration, making it an essential spectral feature for early-stage viral disease detection [72,73,74,75]. In this study, red-edge wavelengths were repeatedly selected by multiple feature selection methods, highlighting their robustness and strong biological interpretability, which serve as a solid foundation for subsequent modeling and variable integration.
In the near-infrared region (750–950 nm), reflectance is primarily influenced by leaf cellular structure, tissue density, and water content. Following SPVD infection, tissue degradation and moisture loss led to a general decline in reflectance within this region, providing additional evidence of physiological stress and deterioration.
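The band behaviors described above translate directly into the scalar indices used in this study. A minimal sketch (nearest-band lookup on a toy, vegetation-like reflectance curve; the values printed are illustrative, not measured):

```python
import numpy as np

wavelengths = np.arange(450, 1002, 4)                               # band centers, nm
reflectance = 0.05 + 0.5 / (1 + np.exp(-(wavelengths - 715) / 15))  # toy red-edge curve

def R(nm):
    """Reflectance at the band center closest to the requested wavelength."""
    return reflectance[np.abs(wavelengths - nm).argmin()]

ndvi  = (R(800) - R(670)) / (R(800) + R(670))   # red vs. NIR contrast
mndvi = (R(750) - R(705)) / (R(750) + R(705))   # red-edge NDVI
pri   = (R(531) - R(570)) / (R(531) + R(570))   # photochemical reflectance index
datt  = (R(850) - R(710)) / (R(850) - R(680))   # chlorophyll-sensitive DATT

print(round(ndvi, 3), round(mndvi, 3), round(pri, 3), round(datt, 3))
```

Because SPVD raises red and red-edge reflectance while lowering NIR reflectance, indices contrasting these regions (NDVI, mNDVI, DATT) shift measurably between healthy and infected canopies.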
Collectively, these spectral bands reflect the distinct physiological and biochemical changes between healthy and SPVD-infected sweetpotato plants. The spectral response patterns observed are largely consistent with prior research on hyperspectral reflectance in plant disease detection [50,51]. These findings support the conclusion that hyperspectral technology can effectively enable the indirect, yet efficient, identification of SPVD by capturing photochemical signals indicative of plant health status.
4.3. From Pixel to Plant: A Post-Processing Strategy for Practical SPVD Monitoring
UAV-based hyperspectral screening offers non-destructive, high-throughput, and cost-effective advantages over PCR for large-scale SPVD surveillance. Our model achieves OA = 96.55% and Macro F1 = 95.36%, yet field-scale imagery still produces “salt-and-pepper” noise due to canopy shading, leaf-edge regions, and soil background.
To illustrate this issue, we compared the pixel-level CropdocNet with our plant-level PLCNet. CropdocNet’s dual-branch design fuses spectral and spatial features only at a shallow layer and is limited in depth, so it often leaves isolated false positives and small gaps within true canopy regions. In contrast, PLCNet’s deep 3D-CNN backbone extracts multi-layer spectral–spatial representations, and its connected-component post-processing removes scattered noise and enforces whole-plant consistency.
To mitigate residual misclassifications, we employed the following two-step strategy:
Pixel-level deep feature extraction: a volumetric 3D-CNN captures localized disease signals (e.g., red-edge shift, NIR suppression).
Plant-level post-processing: connected-component analysis and majority voting aggregate voxel predictions into coherent plant-level labels.
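A minimal sketch of the second step (connected components via scipy.ndimage; the pixel-wise label map is a hypothetical stand-in for the 3D-CNN output, and the size threshold is illustrative):

```python
import numpy as np
from scipy import ndimage

# Hypothetical pixel-wise prediction map: 0 = background, 1 = healthy, 2 = infected.
pred = np.zeros((10, 10), dtype=int)
pred[2:7, 2:7] = 2          # one infected plant...
pred[4, 4] = 1              # ...with a single "healthy" hole inside it
pred[0, 9] = 2              # isolated one-pixel speckle on bare soil

# Connected-component analysis over the vegetation mask (healthy OR infected).
labels, n = ndimage.label(pred > 0)

cleaned = np.zeros_like(pred)
for comp in range(1, n + 1):
    mask = labels == comp
    if mask.sum() < 4:       # drop salt-and-pepper speckles below a size threshold
        continue
    # Majority voting: assign the component's dominant class to every pixel in it.
    votes = np.bincount(pred[mask])
    cleaned[mask] = votes.argmax()

print(cleaned)
```

In this toy example, the soil speckle is removed and the hole inside the infected plant is filled with the plant's majority label, producing the contiguous, whole-plant masks described above.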
This hybrid pipeline balances model complexity with field practicality, yielding cleaner, biologically plausible infection maps that support early warning, high-risk area identification, and targeted sampling for optimized disease management.
In this study, the Kappa coefficient was excluded due to its well-documented limitations in remote sensing accuracy assessment, particularly its potential to misrepresent classification performance in imbalanced datasets [63]. Instead, UA_mean and PA_mean were included as they provide more interpretable, class-specific accuracy information, enabling a more comprehensive evaluation of classification reliability across all classes.
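The reported accuracy metrics follow directly from the confusion matrix; a minimal sketch (the matrix entries are illustrative, not the paper's results):

```python
import numpy as np

# Rows = reference (ground truth) classes, columns = predicted classes.
cm = np.array([[48,  2],
               [ 3, 47]], dtype=float)

oa = np.trace(cm) / cm.sum()           # overall accuracy
pa = np.diag(cm) / cm.sum(axis=1)      # producer's accuracy (recall) per class
ua = np.diag(cm) / cm.sum(axis=0)      # user's accuracy (precision) per class
f1 = 2 * ua * pa / (ua + pa)           # per-class F1

pa_mean, ua_mean = pa.mean(), ua.mean()  # macro-averaged PA and UA
macro_f1 = f1.mean()
print(round(oa, 4), round(ua_mean, 4), round(pa_mean, 4), round(macro_f1, 4))
```

Unlike Kappa, each of these quantities has a direct per-class reading: PA answers "how much of the true class was found", UA answers "how much of the predicted class was correct".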
4.4. Toward More Robust and Scalable SPVD Detection
We will partner with agricultural institutions across multiple regions to build a standardized, open-access SPVD hyperspectral dataset covering diverse infection stages, virus combinations, and environmental gradients. By adopting uniform sampling protocols and metadata standards and including additional varieties beyond the three leaf-morphology/susceptibility types already studied, this multi-center resource will support cross-domain transfer and mitigate geographic bias.
To improve both accuracy and interpretability, we will incorporate ground-truth physiological measurements (chlorophyll, leaf nitrogen, water content) as auxiliary input channels or regularization terms. Simultaneously, we will deploy a two-stage classification pipeline (“coarse segmentation → fine refinement”) augmented by ensemble voting and uncertainty quantification, and leverage semi-supervised techniques (consistency regularization, pseudo-labeling) to exploit unlabeled field data and reduce annotation costs.
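As one concrete form the pseudo-labeling step could take, a minimal sketch (thresholded softmax confidence; the function name, probabilities, and the 0.9 threshold are illustrative assumptions, not the planned implementation):

```python
import numpy as np

def pseudo_label(probs, threshold=0.9):
    """Keep only unlabeled samples whose max class probability clears the threshold.

    probs: (n_samples, n_classes) softmax outputs from the current model.
    Returns (indices of confident samples, their pseudo-labels).
    """
    conf = probs.max(axis=1)
    keep = np.flatnonzero(conf >= threshold)
    return keep, probs[keep].argmax(axis=1)

# Hypothetical model outputs for 4 unlabeled pixels; classes: 0 = healthy, 1 = infected.
probs = np.array([[0.97, 0.03],
                  [0.55, 0.45],    # ambiguous -> excluded
                  [0.08, 0.92],
                  [0.60, 0.40]])   # ambiguous -> excluded

keep, labels = pseudo_label(probs)
print(keep, labels)
```

Confident predictions are added to the training pool and the model is retrained, exploiting unlabeled field imagery while the threshold guards against propagating uncertain labels.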
Building on PLCNet’s demonstrated efficiency, we will systematically evaluate deeper residual backbones (e.g., ResNet3D-50) and transformer-based models (hyperspectral ViT) under appropriate data and computation conditions to assess their spectral–spatial representation potential and trade-offs between complexity and performance.
5. Conclusions
This study introduced PLCNet, an innovative framework designed for the rapid, non-destructive identification of SPVD using UAV-acquired hyperspectral imagery. PLCNet uniquely addresses SPVD detection by integrating optimized spectral feature selection, deep learning, and a plant-level post-processing pipeline. Unlike conventional pixel-level methods, PLCNet treats each sweetpotato plant as a cohesive spatial unit, effectively capturing whole-plant disease symptomology.
Benchmark evaluations showed that PLCNet achieved superior classification performance (OA = 96.55%, Macro F1 = 95.36%, UA_mean = 0.9498, PA_mean = 0.9504), outperforming traditional classifiers (SVM, GBDT) and simpler CNN architectures (ResNet, CropdocNet). The inclusion of connected-component analysis and a majority voting post-processing module significantly reduced classification noise, improving the spatial coherence and biological interpretability of classification maps.
The demonstrated effectiveness, high throughput, and operational practicality of PLCNet underline its substantial potential for the large-scale precision monitoring and management of SPVD. By combining deep spectral–spatial feature extraction with robust plant-level refinement, the proposed framework provides a scalable, cost-effective, and accurate solution suitable for deployment across diverse agricultural production regions. Future research directions include expanding multi-regional datasets, incorporating physiological parameters, exploring advanced model architectures, and integrating semi-supervised techniques to further enhance the robustness and generalization capabilities of SPVD detection systems.
Conceptualization, Q.Z., W.W. and G.Y.; Data Collection, Q.Z., W.W., H.S., X.G. and H.H.; Methodology, H.S., G.Y. and Q.Z.; Validation, G.Y., W.W. and Q.Z.; Writing—original draft, Q.Z. and Z.X.; Writing—review and editing, Q.Z., Q.C., G.Y., H.S., Z.X. and J.X.; Funding Acquisition, Z.X. and Q.Z. All authors have read and agreed to the published version of the manuscript.
All datasets presented in this study are available upon request from the corresponding author.
The authors would like to thank Chen Li and Yuan Yi for supporting the field campaigns. The authors would also like to thank the reviewers, whose comments and suggestions were helpful in improving the quality of this manuscript.
The authors declare no conflict of interest.
Footnotes
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Figure 1 Technical process of PLCNet.
Figure 2 Location of the sweetpotato experimental field in the study area.
Figure 3 Comparison of the average reflectance curves of different experimental plots. (a) The average spectral curve of the disease from 450 to 998 nm; (b) green peak; (c) red valley; (d) NIR.
Figure 4 Boxplot of leaf SPAD values by group (SPVD-1, FR-1, SPVD-2, FR-2, SPVD-3 and FR-3).
Figure 5 Spatial distribution of healthy and diseased sweetpotato in the study area (scale bar = 0.7 m). (a–c) RGB mosaics of healthy plants FR-1, FR-2, and FR-3; (d–f) corresponding PLCNet classification maps; (g–i) RGB mosaics of infected plants SPVD-1, SPVD-2, and SPVD-3; (j–l) corresponding PLCNet classification maps. (green = healthy, orange = infected, white = background).
Figure 6 Comparison of pixel- (CropdocNet) and plant-level (PLCNet) classifications on a healthy (FR-2; (a–f)) and an infected (SPVD-3; (g–l)) sweetpotato plot. (a,d,g,j) RGB; (b,e,h,k) classification (green = healthy, orange = infected, white = background); (c,f,i,l) overlay.
Figure 7 Validation of PLCNet on an independent dataset (scale bar = 0.7 m; green = healthy; orange = infected; white = background). (a–c) UAV RGB mosaics of healthy plots FR-1, FR-2, and FR-3; (d–f) corresponding PLCNet classification maps; (g–i) UAV RGB mosaics of infected plots SPVD-1, SPVD-2, and SPVD-3; (j–l) corresponding PLCNet classification maps.
Spectral indices used in this study.
| Type | Spectral Index | Short | Formulation | Reference |
|---|---|---|---|---|
| Chlorophyll | Pigment-Specific Simple Ratio | PSSRa | R800/R675 | [ |
| | Pigment-Specific Simple Ratio | PSSRb | R800/R650 | [ |
| | Ratio Analysis of Reflectance Spectra | RARSa | R675/R700 | [ |
| | Ratio Analysis of Reflectance Spectra | RARSb | R675/(R650 × R700) | |
| | Normalized Difference Vegetation Index | NDVI | (RNIR − RR)/(RNIR + RR) | [ |
| | Red-Edge NDVI | mNDVI | (R750 − R705)/(R750 + R705) | [ |
| | Green NDVI | gNDVI | (R750 − RG)/(R750 + RG) | [ |
| | Macc01 | Macc01 | (R780 − R710)/(R780 − R680) | [ |
| | DATT | DATT | (R850 − R710)/(R850 − R680) | [ |
| | Modified DATT | MDATT | (R721 − R744)/(R721 − R714) | [ |
| | Red-Edge Chlorophyll Index | CI | R750/R710 | [ |
| | Chl_red edge | Chl_red | RNIR/Rred edge − 1 | [ |
| Carotenoid | Photochemical Reflectance Index | PRI | (R531 − R570)/(R531 + R570) | [ |
| | Carotenoid Reflectance Index | CRI550 | (1/R510) − (1/R550) | [ |
| | Carotenoid Reflectance Index | CRI700 | (1/R510) − (1/R700) | |
| | Carotenoid Reflectance Index | CRI515,550 | (1/R515) − (1/R550) | |
| | Carotenoid Reflectance Index | CRI515,700 | (1/R515) − (1/R700) | |
| | RI530,800 | RI530,800 | R530/R800 | |
| | ND800,530 | ND800,530 | (R800 − R530)/(R800 + R530) | |
| Plant Stress | Health Index (534, 698, 704) | HI_2013 | (R534 − R698)/(R534 + R698) − 0.5 × R704 | [ |
| | Plant Senescence Reflectance Index | PSRI | (R680 − R500)/R750 | [ |
| | Simple Ratio | RR | R695/R670 | [ |
| | Simple Ratio | RR | R695/R760 | |
| | Simple Ratio | RR | R710/R760 | |
Optimal wavelength variable selection results.
| Method | Wavelength/nm | VIF (nm) |
|---|---|---|
| LCM | 794, 798, 802, 806, 810, 814, 818, 822, 826, 830, 834, 854, 858, 862, 866, 870, 874, 878, 882, 886, 890, 894, 898, 902, 906, 910, 914, 918, 922, 926 | 794, 926 |
| mRMR | 674, 678, 682, 686, 770, 774, 778, 782, 786, 790, 794, 798, 802, 806, 810, 814, 818, 822, 826, 830, 834, 838, 842, 846, 906, 914, 922, 926, 930, 934 | 674, 686, 906 |
| RF | 450, 454, 458, 646, 650, 662, 666, 670, 674, 678, 682, 686, 690, 694, 698, 702, 706, 710, 714, 718, 722, 790, 794, 802, 806, 810, 814, 818, 822, 834 | 458, 650, 706, 714, 914 |
Comparison of the performance of different feature selection methods for SPVD identification using the SVM classifier (%). (OA: Overall Accuracy; F1: Macro F1-Score).
| Method | OA/% | F1/% |
|---|---|---|
| LCM | 76.75 | 75.37 |
| mRMR | 89.15 | 87.88 |
| RF | 91.36 | 90.41 |
Optimal vegetation indices selection results.
| Type | Chlorophyll | Carotenoid | |||
|---|---|---|---|---|---|
| Index | PSSRb | DATT | MDATT | PRI | ND800,530 |
Performance comparison of feature selection methods combined with vegetation indices and classifiers for SPVD identification. OA and F1 are expressed in %; UA_mean and PA_mean are macro-averaged user’s and producer’s accuracies expressed as proportions.
| Method | SVM | GBDT | ResNet | 3D-CNN | ||||||||||||
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| OA/% | F1/% | UA_mean | PA_mean | OA/% | F1/% | UA_mean | PA_mean | OA/% | F1/% | UA_mean | PA_mean | OA/% | F1/% | UA_mean | PA_mean | |
| LCM + Vis | 91.33 | 90.81 | 0.9332 | 0.9299 | 91.50 | 91.07 | 0.9136 | 0.9098 | 93.07 | 92.41 | 0.9238 | 0.9257 | 93.86 | 93.66 | 0.9394 | 0.9369 |
| mRMR + Vis | 91.18 | 90.65 | 0.9333 | 0.9297 | 91.46 | 91.02 | 0.9133 | 0.9093 | 93.37 | 92.67 | 0.9294 | 0.9257 | 94.90 | 94.68 | 0.9303 | 0.9280 |
| RF + Vis | 91.49 | 91.00 | 0.9343 | 0.9311 | 91.90 | 91.49 | 0.9422 | 0.9411 | 93.67 | 92.96 | 0.9351 | 0.9303 | 96.55 | 95.36 | 0.9498 | 0.9504 |
Note: the highest values in each column are highlighted in bold.
1. Adero, J.; Wokorach, G.; Stomeo, F.; Yao, N.; Machuka, E.; Njuguna, J.; Byarugaba, D.K.; Kreuze, J.; Yencho, G.C.; Otema, M.A.
2. Zhang, K.; Lu, H.; Wan, C.; Tang, D.; Zhao, Y.; Luo, K.; Li, S.; Wang, J. The Spread and Transmission of Sweet Potato Virus Disease (SPVD) and Its Effect on the Gene Expression Profile in Sweet Potato. Plants; 2020; 9, 492. [DOI: https://dx.doi.org/10.3390/plants9040492]
3. Fang, D.; Fan, Z.C. Research Progress and Prospects on Control Measures of Sweet Potato Virus Diseases. Crops; 2016; 3, pp. 6-11. [DOI: https://dx.doi.org/10.16035/j.issn.1001-7283.2016.03.002]
4. Karyeija, R.F.; Kreuze, J.F.; Gibson, R.W.; Valkonen, J.P.T. Two Serotypes of Sweetpotato Feathery Mottle Virus in Uganda and Their Interaction with Resistant Sweetpotato Cultivars. Phytopathology®; 2000; 90, pp. 1250-1255. [DOI: https://dx.doi.org/10.1094/PHYTO.2000.90.11.1250]
5. He, Y.; Chen, Z.; Li, Y.; He, M.; Zhang, X.; Zhi, S.; Shen, W.; Qin, S.; Zhang, K.; Ni, Q. Research Progress on Virus Elimination Techniques for Sweet Potato. J. Chang. Veg.; 2018; 8, pp. 36-39.
6. Sun, Z.; Gong, Y.; Zhao, L.; Shi, J.; Mao, B. Advances in Researches on Molecular Biology of SPVD. J. Nucl. Agric. Sci.; 2020; 34, pp. 71-77. [DOI: https://dx.doi.org/10.11869/j.issn.100-8551.2020.01.0071]
7. Zeng, F.; Ding, Z.; Song, Q.; Xiao, J.; Zheng, J.; Li, H.; Luo, Z.; Wang, Z.; Yue, X.; Huang, L. Feasibility of Detecting Sweet Potato (Ipomoea Batatas) Virus Disease from High-Resolution Imagery in the Field Using a Deep Learning Framework. Agronomy; 2023; 13, 2801. [DOI: https://dx.doi.org/10.3390/agronomy13112801]
8. Sarkar, A.; Nandi, U.; Kumar Sarkar, N.; Changdar, C.; Paul, B. Deep Learning Based Hyperspectral Image Classification: A Review For Future Enhancement. Int. J. Comput. Digit. Syst.; 2024; 15, pp. 419-435. [DOI: https://dx.doi.org/10.12785/ijcds/160133] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/28140390]
9. Wan, L.; Li, H.; Li, C.; Wang, A.; Yang, Y.; Wang, P. Hyperspectral Sensing of Plant Diseases: Principle and Methods. Agronomy; 2022; 12, 1451. [DOI: https://dx.doi.org/10.3390/agronomy12061451]
10. Lowe, A.; Harrison, N.; French, A.P. Hyperspectral Image Analysis Techniques for the Detection and Classification of the Early Onset of Plant Disease and Stress. Plant Methods; 2017; 13, 80. [DOI: https://dx.doi.org/10.1186/s13007-017-0233-z]
11. Zhang, J.; Cheng, T.; Guo, W.; Xu, X.; Qiao, H.; Xie, Y.; Ma, X. Leaf Area Index Estimation Model for UAV Image Hyperspectral Data Based on Wavelength Variable Selection and Machine Learning Methods. Plant Methods; 2021; 17, 49. [DOI: https://dx.doi.org/10.1186/s13007-021-00750-5]
12. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J. Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sens.; 2017; 9, 1110. [DOI: https://dx.doi.org/10.3390/rs9111110]
13. Moriya, É.A.S.; Imai, N.N.; Tommaselli, A.M.G.; Honkavaara, E.; Rosalen, D.L. Design of Vegetation Index for Identifying the Mosaic Virus in Sugarcane Plantation: A Brazilian Case Study. Agronomy; 2023; 13, 1542. [DOI: https://dx.doi.org/10.3390/agronomy13061542]
14. Shi, Y.; Han, L.; Kleerekoper, A.; Chang, S.; Hu, T. Novel CropdocNet Model for Automated Potato Late Blight Disease Detection from Unmanned Aerial Vehicle-Based Hyperspectral Imagery. Remote Sens.; 2022; 14, 396. [DOI: https://dx.doi.org/10.3390/rs14020396]
15. Mickey Wang, Y.; Ostendorf, B.; Pagay, V. Evaluating the Potential of High-Resolution Hyperspectral UAV Imagery for Grapevine Viral Disease Detection in Australian Vineyards. Int. J. Appl. Earth Obs. Geoinf.; 2024; 130, 103876. [DOI: https://dx.doi.org/10.1016/j.jag.2024.103876]
16. Wang, Y.; Xing, M.; Zhang, H.; He, B.; Zhang, Y. Rice False Smut Monitoring Based on Band Selection of UAV Hyperspectral Data. Remote Sens.; 2023; 15, 2961. [DOI: https://dx.doi.org/10.3390/rs15122961]
17. Gao, J.; Ding, M.; Sun, Q.; Dong, J.; Wang, H.; Ma, Z. Classification of Southern Corn Rust Severity Based on Leaf-Level Hyperspectral Data Collected under Solar Illumination. Remote Sens.; 2022; 14, 2551. [DOI: https://dx.doi.org/10.3390/rs14112551]
18. Deng, J.; Zhang, X.; Yang, Z.; Zhou, C.; Wang, R.; Zhang, K.; Lv, X.; Yang, L.; Wang, Z.; Li, P.
19. Zhang, E.; Zhang, J.; Bai, J.; Bian, J.; Fang, S.; Zhan, T.; Feng, M. Attention-Embedded Triple-Fusion Branch CNN for Hyperspectral Image Classification. Remote Sens.; 2023; 15, 2150. [DOI: https://dx.doi.org/10.3390/rs15082150]
20. Datta, D.; Mallick, P.K.; Gupta, D.; Chae, G.-S. Hyperspectral Image Classification Based on Novel Hybridization of Spatial-Spectral-Superpixelwise Principal Component Analysis and Dense 2D-3D Convolutional Neural Network Fusion Architecture. Can. J. Remote Sens.; 2022; 48, pp. 663-680. [DOI: https://dx.doi.org/10.1080/07038992.2022.2114440]
21. Gao, H.; Chen, Z.; Li, C. Sandwich Convolutional Neural Network for Hyperspectral Image Classification Using Spectral Feature Enhancement. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.; 2021; 14, pp. 3006-3015. [DOI: https://dx.doi.org/10.1109/JSTARS.2021.3062872]
22. Xue, Z.; Yu, X.; Liu, B.; Tan, X.; Wei, X. HResNetAM: Hierarchical Residual Network With Attention Mechanism for Hyperspectral Image Classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.; 2021; 14, pp. 3566-3580. [DOI: https://dx.doi.org/10.1109/JSTARS.2021.3065987]
23. Mu, Q.; Kang, Z.; Guo, Y.; Chen, L.; Wang, S.; Zhao, Y. Hyperspectral Image Classification of Wolfberry with Different Geographical Origins Based on Three-Dimensional Convolutional Neural Network. Int. J. Food Prop.; 2021; 24, pp. 1705-1721. [DOI: https://dx.doi.org/10.1080/10942912.2021.1987457]
24. Trivedi, A.K.; Mahajan, T.; Maheshwari, T.; Mehta, R.; Tiwari, S. Leveraging Feature Fusion Ensemble of VGG16 and ResNet-50 for Automated Potato Leaf Abnormality Detection in Precision Agriculture. Soft Comput.; 2025; 29, pp. 2263-2277. [DOI: https://dx.doi.org/10.1007/s00500-025-10523-0]
25. Zeng, T.; Wang, Y.; Yang, Y.; Liang, Q.; Fang, J.; Li, Y.; Zhang, H.; Fu, W.; Wang, J.; Zhang, X. Early Detection of Rubber Tree Powdery Mildew Using UAV-Based Hyperspectral Imagery and Deep Learning. Comput. Electron. Agric.; 2024; 220, 108909. [DOI: https://dx.doi.org/10.1016/j.compag.2024.108909]
26. Bhatti, U.A.; Bazai, S.U.; Hussain, S.; Fakhar, S.; Ku, C.S.; Marjan, S.; Yee, P.L.; Jing, L. Deep Learning-Based Trees Disease Recognition and Classification Using Hyperspectral Data. Comput. Mater. Contin.; 2023; 77, pp. 681-697. [DOI: https://dx.doi.org/10.32604/cmc.2023.037958]
27. Zheng, J.; Sun, C.; Zhao, S.; Hu, M.; Zhang, S.; Li, J. Classification of Salt Marsh Vegetation in the Yangtze River Delta of China Using the Pixel-Level Time-Series and XGBoost Algorithm. J. Remote Sens.; 2023; 3, 0036. [DOI: https://dx.doi.org/10.34133/remotesensing.0036]
28. Yang, R.; Kan, J. Classification of Tree Species at the Leaf Level Based on Hyperspectral Imaging Technology. J. Appl. Spectrosc.; 2020; 87, pp. 184-193. [DOI: https://dx.doi.org/10.1007/s10812-020-00981-9]
29. Zhang, C.L.; Sun, H.J.; Yang, D.J.; Ma, J.K.; Xie, Y.P. Effects of Leaf Curl Virus on Growth Characteristic and Yield of Sweet Potato. J. North. Agric.; 2020; 48, pp. 94-99. [DOI: https://dx.doi.org/10.12190/j.issn.2096-1197.2020.04.15]
30. Ping, Y. Lipid Metabolism Patterns in SPVD-Infected Sweet Potato Leaves Under Different Temperature Regimes; Jiangsu Normal University: Xuzhou, China, 2018.
31. Wei, X.; Johnson, M.A.; Langston, D.B.; Mehl, H.L.; Li, S. Identifying Optimal Wavelengths as Disease Signatures Using Hyperspectral Sensor and Machine Learning. Remote Sens.; 2021; 13, 2833. [DOI: https://dx.doi.org/10.3390/rs13142833]
32. Fang, L.; He, N.; Li, S.; Plaza, A.J.; Plaza, J. A New Spatial–Spectral Feature Extraction Method for Hyperspectral Images Using Local Covariance Matrix Representation. IEEE Trans. Geosci. Remote Sens.; 2018; 56, pp. 3534-3546. [DOI: https://dx.doi.org/10.1109/TGRS.2018.2801387]
33. Jiang, Y.; Li, C. mRMR-Based Feature Selection for Classification of Cotton Foreign Matter Using Hyperspectral Imaging. Comput. Electron. Agric.; 2015; 119, pp. 191-200. [DOI: https://dx.doi.org/10.1016/j.compag.2015.10.017]
34. Wang, Z.; Yuan, F.; Li, R.; Zhang, M.; Luo, X. Hidden AS Link Prediction Based on Random Forest Feature Selection and GWO-XGBoost Model. Comput. Netw.; 2025; 262, 111164. [DOI: https://dx.doi.org/10.1016/j.comnet.2025.111164]
35. Allouis, T.; Durrieu, S.; Vega, C.; Couteron, P. Stem Volume and Above-Ground Biomass Estimation of Individual Pine Trees From LiDAR Data: Contribution of Full-Waveform Signals. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.; 2013; 6, pp. 924-934. [DOI: https://dx.doi.org/10.1109/JSTARS.2012.2211863]
36. Blackburn, G.A. Spectral Indices for Estimating Photosynthetic Pigment Concentrations: A Test Using Senescent Tree Leaves. Int. J. Remote Sens.; 1998; 19, pp. 657-675. [DOI: https://dx.doi.org/10.1080/014311698215919]
37. Blackburn, G.A. Relationships between Spectral Reflectance and Pigment Concentrations in Stacks of Deciduous Broadleaves. Remote Sens. Environ.; 1999; 70, pp. 224-237. [DOI: https://dx.doi.org/10.1016/S0034-4257(99)00048-6]
38. Chappelle, E.W.; Kim, M.S.; Iii, M.M. Ratio Analysis of Reflectance Spectra (RARS): An Algorithm for the Remote Estimation of the Concentrations of Chlorophyll A, Chlorophyll B, and Carotenoids in Soybean Leaves. Remote Sens. Environ.; 1992; 39, pp. 239-247. [DOI: https://dx.doi.org/10.1016/0034-4257(92)90089-3]
39. Becker, F.; Choudhury, B.J. Relative sensitivity of normalized difference vegetation Index (NDVI) and microwave polarization difference Index (MPDI) for vegetation and desertification monitoring. Remote Sens. Environ.; 1988; 24, pp. 297-311. [DOI: https://dx.doi.org/10.1016/0034-4257(88)90031-4]
40. Sims, D.A.; Gamon, J.A. Relationships between Leaf Pigment Content and Spectral Reflectance across a Wide Range of Species, Leaf Structures and Developmental Stages. Remote Sens. Environ.; 2002; 81, pp. 337-354. [DOI: https://dx.doi.org/10.1016/S0034-4257(02)00010-X]
41. Gitelson, A.A.; Merzlyak, M.N. Signature Analysis of Leaf Reflectance Spectra: Algorithm Development for Remote Sensing of Chlorophyll. J. Plant Physiol.; 1996; 148, pp. 494-500. [DOI: https://dx.doi.org/10.1016/S0176-1617(96)80284-7]
42. Maccioni, A.; Agati, G.; Mazzinghi, P. New Vegetation Indices for Remote Measurement of Chlorophylls Based on Leaf Directional Reflectance Spectra. J. Photochem. Photobiol. B; 2001; 61, pp. 52-61. [DOI: https://dx.doi.org/10.1016/S1011-1344(01)00145-2]
43. Datt, B. A New Reflectance Index for Remote Sensing of Chlorophyll Content in Higher Plants: Tests Using Eucalyptus Leaves. J. Plant Physiol.; 1999; 154, pp. 30-36. [DOI: https://dx.doi.org/10.1016/S0176-1617(99)80314-9]
44. Lu, S.; Lu, F.; You, W.; Wang, Z.; Liu, Y.; Omasa, K. A Robust Vegetation Index for Remotely Assessing Chlorophyll Content of Dorsiventral Leaves across Several Species in Different Seasons. Plant Methods; 2018; 14, 15. [DOI: https://dx.doi.org/10.1186/s13007-018-0281-z]
45. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated Narrow-Band Vegetation Indices for Prediction of Crop Chlorophyll Content for Application to Precision Agriculture. Remote Sens. Environ.; 2002; 81, pp. 416-426. [DOI: https://dx.doi.org/10.1016/S0034-4257(02)00018-4]
46. Gitelson, A.A.; Keydan, G.P.; Merzlyak, M.N. Three-band Model for Noninvasive Estimation of Chlorophyll, Carotenoids, and Anthocyanin Contents in Higher Plant Leaves. Geophys. Res. Lett.; 2006; 33, pp. 431-433. [DOI: https://dx.doi.org/10.1029/2006GL026457]
47. Gamon, J.A.; Peñuelas, J.; Field, C.B. A Narrow-Waveband Spectral Index That Tracks Diurnal Changes in Photosynthetic Efficiency. Remote Sens. Environ.; 1992; 41, pp. 35-44. [DOI: https://dx.doi.org/10.1016/0034-4257(92)90059-S]
48. Gitelson, A.A.; Yoav, Z.; Olga, B. Assessing Carotenoid Content in Plant Leaves with Reflectance Spectroscopy. Photochem. Photobiol.; 2002; 75, pp. 272-281. [DOI: https://dx.doi.org/10.1562/0031-8655(2002)075<0272:ACCIPL>2.0.CO;2] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/11950093]
49. Mahlein, A.K.; Rumpf, T.; Welke, P. Development of Spectral Indices for Detecting and Identifying Plant Diseases. Remote Sens. Environ.; 2013; 128, pp. 21-30. [DOI: https://dx.doi.org/10.1016/j.rse.2012.09.019]
50. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-destructive Optical Detection of Pigment Changes during Leaf Senescence and Fruit Ripening. Physiol. Plant.; 1999; 106, pp. 135-141. [DOI: https://dx.doi.org/10.1034/j.1399-3054.1999.106119.x]
51. Carter, G.A. Ratios of Leaf Reflectances in Narrow Wavebands as Indicators of Plant Stress. Int. J. Remote Sens.; 1994; 15, pp. 697-703. [DOI: https://dx.doi.org/10.1080/01431169408954109]
52. Tarabalka, Y.; Fauvel, M.; Chanussot, J.; Benediktsson, J.A. SVM- and MRF-Based Method for Accurate Classification of Hyperspectral Images. IEEE Geosci. Remote Sens. Lett.; 2010; 7, pp. 736-740. [DOI: https://dx.doi.org/10.1109/LGRS.2010.2047711]
53. Li, S.; Sun, L.; Tian, Y.; Lu, X.; Fu, Z.; Lv, G.; Zhang, L.; Xu, Y.; Che, W. Research on Non-Destructive Identification Technology of Rice Varieties Based on HSI and GBDT. Infrared Phys. Technol.; 2024; 142, 105511. [DOI: https://dx.doi.org/10.1016/j.infrared.2024.105511]
54. Roy, S.K.; Krishna, G.; Dubey, S.R.; Chaudhuri, B.B. HybridSN: Exploring 3D-2D CNN Feature Hierarchy for Hyperspectral Image Classification. IEEE Geosci. Remote Sens. Lett.; 2020; 17, pp. 277-281. [DOI: https://dx.doi.org/10.1109/LGRS.2019.2918719]
55. Kalaivani, S.; Tharini, C.; Viswa, T.M.S.; Sara, K.Z.F.; Abinaya, S.T. ResNet-Based Classification for Leaf Disease Detection. J. Inst. Eng. India Ser. B; 2025; 106, pp. 1-14. [DOI: https://dx.doi.org/10.1007/s40031-024-01062-7]
56. Zhang, C.; Bengio, S.; Hardt, M.; Recht, B.; Vinyals, O. Understanding Deep Learning (Still) Requires Rethinking Generalization. Commun. ACM; 2021; 64, pp. 107-115. [DOI: https://dx.doi.org/10.1145/3446776]
57. Smirnov, E.A.; Timoshenko, D.M.; Andrianov, S.N. Comparison of Regularization Methods for ImageNet Classification with Deep Convolutional Neural Networks. AASRI Procedia; 2014; 6, pp. 89-94. [DOI: https://dx.doi.org/10.1016/j.aasri.2014.05.013]
58. Chen, S.; Jin, M.; Ding, J. Hyperspectral Remote Sensing Image Classification Based on Dense Residual Three-Dimensional Convolutional Neural Network. Multimed. Tools Appl.; 2021; 80, pp. 1859-1882. [DOI: https://dx.doi.org/10.1007/s11042-020-09480-7]
59. Ahmad, M.; Khan, A.M.; Mazzara, M.; Distefano, S.; Ali, M.; Sarfraz, M.S. A Fast and Compact 3-D CNN for Hyperspectral Image Classification. IEEE Geosci. Remote Sens. Lett.; 2022; 19, pp. 1-5. [DOI: https://dx.doi.org/10.1109/LGRS.2020.3043710]
60. Zhang, D.; Ma, H.; Pan, L. A Gamma-Signal-Regulated Connected Components Labeling Algorithm. Pattern Recognit.; 2019; 91, pp. 281-290. [DOI: https://dx.doi.org/10.1016/j.patcog.2019.02.022]
61. Lam, L.; Suen, S.Y. Application of Majority Voting to Pattern Recognition: An Analysis of Its Behavior and Performance. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum.; 1997; 27, pp. 553-568. [DOI: https://dx.doi.org/10.1109/3468.618255]
62. Gao, J.; Westergaard, J.C.; Sundmark, E.H.R.; Bagge, M.; Liljeroth, E.; Alexandersson, E. Automatic Late Blight Lesion Recognition and Severity Quantification Based on Field Imagery of Diverse Potato Genotypes by Deep Learning. Knowl.-Based Syst.; 2021; 214, 106723. [DOI: https://dx.doi.org/10.1016/j.knosys.2020.106723]
63. Foody, G.M. Explaining the Unsuitability of the Kappa Coefficient in the Assessment and Comparison of the Accuracy of Thematic Maps Obtained by Image Classification. Remote Sens. Environ.; 2020; 239, 111630. [DOI: https://dx.doi.org/10.1016/j.rse.2019.111630]
64. Untiveros, M.; Fuentes, S.; Salazar, L.F. Synergistic Interaction of Sweet Potato Chlorotic Stunt Virus (Crinivirus) with Carla-, Cucumo-, Ipomo-, and Potyviruses Infecting Sweet Potato. Plant Dis.; 2007; 91, pp. 669-676. [DOI: https://dx.doi.org/10.1094/PDIS-91-6-0669]
65. Kokkinos, C.D.; Clark, C.A.; McGregor, C.E.; LaBonte, D.R. The Effect of Sweet Potato Virus Disease and Its Viral Components on Gene Expression Levels in Sweetpotato. J. Am. Soc. Hortic. Sci.; 2006; 131, pp. 657-666. [DOI: https://dx.doi.org/10.21273/JASHS.131.5.657]
66. Römer, C.; Wahabzada, M.; Ballvora, A.; Pinto, F.; Rossini, M.; Panigada, C.; Behmann, J.; Léon, J.; Thurau, C.; Bauckhage, C.
67. Grisham, M.P.; Johnson, R.M.; Zimba, P.V. Detecting Sugarcane Yellow Leaf Virus Infection in Asymptomatic Leaves with Hyperspectral Remote Sensing and Associated Leaf Pigment Changes. J. Virol. Methods; 2010; 167, pp. 140-145. [DOI: https://dx.doi.org/10.1016/j.jviromet.2010.03.024]
68. Chávez, P.; Zorogastúa, P.; Chuquillanqui, C.; Salazar, L.F.; Mares, V.; Quiroz, R. Assessing Potato Yellow Vein Virus (PYVV) Infection Using Remotely Sensed Data. Int. J. Pest Manag.; 2009; 55, pp. 251-256. [DOI: https://dx.doi.org/10.1080/09670870902862685]
69. Wang, Y.M.; Ostendorf, B.; Pagay, V. Detecting Grapevine Virus Infections in Red and White Winegrape Canopies Using Proximal Hyperspectral Sensing. Sensors; 2023; 23, 2851. [DOI: https://dx.doi.org/10.3390/s23052851]
70. Feng, L.; Zhang, Z.; Ma, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Alfalfa Yield Prediction Using UAV-Based Hyperspectral Imagery and Ensemble Learning. Remote Sens.; 2020; 12, 2028. [DOI: https://dx.doi.org/10.3390/rs12122028]
71. Dierssen, H.M.; Ackleson, S.G.; Joyce, K.E.; Hestir, E.L.; Castagna, A.; Lavender, S.; McManus, M.A. Living up to the Hype of Hyperspectral Aquatic Remote Sensing: Science, Resources and Outlook. Front. Environ. Sci.; 2021; 9, 649528. [DOI: https://dx.doi.org/10.3389/fenvs.2021.649528]
72. Zhang, M.; Qin, Z.; Liu, X.; Ustin, S.L. Detection of Stress in Tomatoes Induced by Late Blight Disease in California, USA, Using Hyperspectral Remote Sensing. Int. J. Appl. Earth Obs. Geoinf.; 2003; 4, pp. 295-310. [DOI: https://dx.doi.org/10.1016/S0303-2434(03)00008-4]
73. Ali, M.M.; Bachik, N.A.; Muhadi, N.A.; Tuan Yusof, T.N.; Gomes, C. Non-Destructive Techniques of Detecting Plant Diseases: A Review. Physiol. Mol. Plant Pathol.; 2019; 108, 101426. [DOI: https://dx.doi.org/10.1016/j.pmpp.2019.101426]
74. Larsolle, A.; Hamid Muhammed, H. Measuring Crop Status Using Multivariate Analysis of Hyperspectral Field Reflectance with Application to Disease Severity and Plant Density. Precis. Agric.; 2007; 8, pp. 37-47. [DOI: https://dx.doi.org/10.1007/s11119-006-9027-4]
75. Rumpf, T.; Mahlein, A.-K.; Steiner, U.; Oerke, E.-C.; Dehne, H.-W.; Plümer, L. Early Detection and Classification of Plant Diseases with Support Vector Machines Based on Hyperspectral Reflectance. Comput. Electron. Agric.; 2010; 74, pp. 91-99. [DOI: https://dx.doi.org/10.1016/j.compag.2010.06.009]
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
Sweetpotato virus disease (SPVD) poses a significant threat to global sweetpotato production; therefore, early, accurate field-scale detection is necessary. To address the limitations of the currently utilized assays, we propose PLCNet (Plant-Level Classification Network), a rapid, non-destructive SPVD identification framework using UAV-acquired hyperspectral imagery. High-resolution data from early sweetpotato growth stages were processed via three feature selection methods—Random Forest (RF), Minimum Redundancy Maximum Relevance (mRMR), and Local Covariance Matrix (LCM)—in combination with 24 vegetation indices. Variance Inflation Factor (VIF) analysis reduced multicollinearity, yielding an optimized SPVD-sensitive feature set. First, using the RF-selected bands and vegetation indices, we benchmarked four classifiers—Support Vector Machine (SVM), Gradient Boosting Decision Tree (GBDT), Residual Network (ResNet), and 3D Convolutional Neural Network (3D-CNN). Under identical inputs, the 3D-CNN achieved superior performance (OA = 96.55%, Macro F1 = 95.36%, UA_mean = 0.9498, PA_mean = 0.9504), outperforming SVM, GBDT, and ResNet. Second, with the same spectral–spatial features and 3D-CNN backbone, we compared a pixel-level baseline (CropdocNet) against our plant-level PLCNet. CropdocNet exhibited spatial fragmentation and isolated errors, whereas PLCNet’s two-stage pipeline—deep feature extraction followed by connected-component analysis and majority voting—aggregated voxel predictions into coherent whole-plant labels, substantially reducing noise and enhancing biological interpretability. By integrating optimized feature selection, deep learning, and plant-level post-processing, PLCNet delivers a scalable, high-throughput solution for precise SPVD monitoring in agricultural fields.
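The plant-level post-processing described above (connected-component analysis followed by majority voting over per-pixel predictions) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `plant_level_labels` and the background-class convention are assumptions.

```python
import numpy as np
from scipy import ndimage


def plant_level_labels(pixel_preds, background=0):
    """Aggregate per-pixel class predictions into whole-plant labels.

    pixel_preds: 2-D integer array of per-pixel class predictions,
    where `background` marks non-plant pixels. Each connected component
    of plant pixels is treated as one plant; all of its pixels are
    relabeled with the majority (most frequent) class in the component.
    """
    plant_mask = pixel_preds != background
    # Label 4-connected regions of plant pixels (scipy's default structure).
    components, n_plants = ndimage.label(plant_mask)
    out = np.full_like(pixel_preds, background)
    for comp_id in range(1, n_plants + 1):
        region = components == comp_id
        classes, counts = np.unique(pixel_preds[region], return_counts=True)
        out[region] = classes[np.argmax(counts)]  # majority vote per plant
    return out
```

Applied to a fragmented prediction map, isolated misclassified pixels inside a plant are overwritten by the dominant class of that plant's connected region, which is the noise-suppression effect the abstract attributes to PLCNet's second stage.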
Details
; Wang, Wei 1; Han, Su 1; Yang, Gaoxiang 2; Xue, Jiawen 1; Hou, Hui 1; Geng, Xiaoyue 1; Cao, Qinghe 1; Xu, Zhen 1
1 Xuzhou Institute of Agricultural Sciences in Jiangsu Xuhuai District, Xuzhou 221131, China; [email protected] (Q.Z.); [email protected] (J.X.); [email protected] (Q.C.)
2 College of Agronomy, Henan Agricultural University, Zhengzhou 450046, China; [email protected]