1. Introduction
Evidence suggests that human–plant interactions play a crucial role in promoting both physiological well-being and environmental aesthetics, with implications for public health and urban planning [1,2]. Within this context, people primarily gain awareness of the environment through visual perception [3], and the photographic camera is an instrument that can be considered to extend this fundamental human sense [4,5,6]. Modern cameras have evolved to offer robust optical detection, available in consumer-grade products such as the GoPro Hero 12 (GoPro Corporation, San Mateo, CA, USA) or Sony Alpha 9 (Sony Corporation, New York, NY, USA), among many others. This capability is largely attributable to the considerable and sustained investment in academic and commercial optical research that drives advancement in camera products at leading manufacturers [7] such as Canon (Canon Corporation, Melville, NY, USA) and Nikon (Nikon Corporation, Melville, NY, USA). As a result, optoelectronic camera technology provides one of the most sophisticated consumer-available sensor products [8,9,10,11].
Modern digital cameras efficiently record spectral reflectance data as two-dimensional perspective maps that accurately represent real-world objects. The subjects of photography may encompass vegetation, where Red–Green–Blue color (RGB) foliar reflectance is linked to both subjective visual quality and the underlying productive photosystem pigmentation [12,13,14]. Therefore, optical cameras can be expected to provide wide utility in the measurement of plant canopy status and change. They can detect characteristics such as the area of green biomass cover and can resolve subtle color differences presented by plant tissues [15,16,17]. However, the acquisition of scientific data is more widely utilized [18,19,20,21] when it is not only of adequate descriptive quality but also inexpensive and easy to produce, with understandable controls that involve generic parts.
Cameras have been utilized in a wide range of applications across various fields, from industrial fabrication to diverse agricultural and environmental assessments. In industry, they have been employed to measure parts [22]. In agriculture and environmental science, cameras have been utilized to assess soil moisture [23] and for plant phenotyping [24,25,26,27,28]. They have been instrumental in measuring plant diseases [29], predation effects [30], senescence [31], salt tolerance [32], and drought response [33]. In crop management, cameras have been used to determine percent green cover [34,35,36,37,38], make fertilizer recommendations [39], and measure grass-legume overyield [40]. Additionally, cameras have been applied in fruit analysis [41] and in genetic studies to link phenotype to genotype [42,43,44,45]. Most researchers in the turfgrass community follow the work of Richardson and Karcher [35] and use low-cost image methods. In contrast, premier plant phenotyping research often employs expensive sensing equipment.
Imagery is commonly used for plant phenotyping because it can increase data precision, speed, and consistency over traditional manual plant assessments [51]. Additionally, it can offer increased spatial and statistical power over single electronic point measurement collections [52,53]. Leaders in high-throughput plant phenotyping frequently employ advanced engineering and sensing technologies, generating complex datasets that necessitate sophisticated processing, correction, and analysis techniques [54,55]. This approach is effective in addressing research questions, but it usually has a limited scope of application and may necessitate a team of experts or expensive equipment acquisitions [56,57,58,59]. High-throughput plant phenotyping can be complemented by a simplified approach utilizing consumer-grade cameras with basic controls. The inherent sophistication of current consumer camera technology offers a promising opportunity for data generation. This approach can effectively address research questions, albeit with some trade-offs, because it often operates on a smaller scale and with reduced throughput. However, it also comes with reduced costs and lower technical requirements, potentially making it useful across many applications. The modality of uncomplicated image-based proximal phenotyping is most suitable for students, commercial managers, or professional researchers conducting experiments of limited size. The use of modern cameras with simple controls enables detection of plant cover area and color qualities and, although outside the scope of this paper, opens opportunities for further spatial analysis using shape and texture features.
Grasses comprise a significant component of the Earth’s flora [60,61,62], and turfgrasses provide the primary vegetative cover in many places where humans interact [63,64]. They cover approximately two percent of the continental United States [65], or, for instance, thirty percent of metropolitan Albuquerque, New Mexico [66]. The assessment of turfgrass canopy is an essential aspect of the relationship between humans and plants for effectively gaining ecosystem services through sustainable resource utilization.
The visual quality rating of turfgrass is a traditional procedure that follows standardized guidelines.
This paper builds on related work that shows how imaging is useful in turfgrass research to detect the area of green cover and change in plant establishment over time [87]. This is often achieved by counting green classified pixels and calculating a percentage of green presence [88,89,90,91]. The process is enabled by computers and can be partially automated using programming scripts or recorded macros [35]. Richardson and Karcher [92,93], and Zhang [94] have presented the common approach of using a lightbox for imaging turfgrass, while recent results show how low-cost imaging outside the lightbox can resolve genetic performance in the greenhouse [95]. Once data are digitized in silico, many additional and more advanced opportunities for processing and analysis become available [49,96,97,98,99,100].
To enable the accurate interpretation of visual information, a primary prerequisite is establishing relationships between image-based data and reliable biological metrics. Vegetation metrics calculated from a single set of image data are automatically dependent to some extent, and without a reference they do not provide a measure of alignment with the underlying biological phenomena. Therefore, the isolated analysis of an individual plant image collection may not account for biases induced by the camera, photographic exposure, environmental variations, or computational process. However, once the verification of a specific camera profile response to the plant phenotype trait is obtained, subsequent image data may hold increased independent value and no longer require related attribution. This idea extends to the image correction process implemented post-collection and includes the image file format. Understanding how a specific lossy file compression like JPEG (Joint Photographic Experts Group), image geometric lens distortion correction (LC), and color correction (CC) may affect phenotypic detection ability relative to various image-based calculations ensures experimental control and improves discovery potential. Our literature review did not reveal studies that directly examined the effects of image corrections on image-based calculations in relation to turfgrass phenotyping metrics.
This research aims to validate the efficacy of modern consumer-grade cameras, when used with basic image controls, for plant phenotyping applications. We affirm that such imagery can provide data representative of conventional phenotyping metrics, such as human-assessed visual quality (VQ), and we investigate the representation of biomass productivity, water use, and independently measured multi-spectral reflectance, using expert-assessed turfgrass VQ and NDVI as benchmarks [101,102,103,104]. Additionally, we demonstrate the potential for deriving novel relationships from image-based metrics. By comparing camera-derived information with established metrics, this study seeks to demonstrate the viability of accessible imaging technology for quantitative plant assessment. Furthermore, we evaluate the impact of two typical image file formats and two common corrections on the results.
This work seeks to advance current image analysis methods and provide practical guidance for image-based plant phenotyping. It presents two new image calculations, demonstrates the importance of yellow fraction classification, and illustrates the effects of two image formats and two corrections using reproducible, cost-free software.
Main novelty and contribution list:
Guidance is given for simple image-based phenotyping measurements;
Two new image metrics, HSVi and the BA ratio, are presented;
The importance of a yellow color classification is demonstrated;
Analysis of image corrections highlights need for standardization.
2. Materials and Methods
An economical (less than USD 1000) Nikon 1 AW1 digital interchangeable-lens action camera with an 11–27.5 mm f/3.5–5.6 1 NIKKOR lens and a custom-made LED lightbox were used to investigate 72 custom PVC lysimeters (15.2 cm diameter × 30.5 cm depth) growing ‘TifTuf’ bermudagrass (Cynodon dactylon × C. transvaalensis Burtt Davy). The turfgrass was grown concurrently at the U.S. Arid-Land Agricultural Research Center in Maricopa, AZ, USA, as two sets of 36 lysimeters inside two greenhouses during the autumn season of 2023. Additional management information is in Appendix A. The lysimeters were treated with four mowing heights (2.5, 5.0, 7.5, and 10.0 cm) and three water application levels (100%, 65%, and 30% of actual evapotranspiration, ETa) and evaluated for visual quality, clipping productivity (biomass), and water use of the 100% ETa treatment (gravimetric loss). The experiment is described in detail in Hejl et al. [105]. Appendix A contains descriptions of the plant growth parameters and environmental measurement protocols.
The camera image exposure was controlled by locking manual settings, as described below. Image file format, lens distortion correction, and color correction were tested. The Holland Scientific RapidScan CS-45 (Holland Scientific, Lincoln, NE, USA) was used to acquire RED, Red-Edge (RE), and NIR spectral components at 670, 730, and 780 nm wavelengths (10 nm bandwidth), respectively. The active reflectance values were collected concurrently with the images to provide a camera-independent measurement of the plants’ optical signature. The collection of three spectral bands enabled the calculation of both the Normalized Difference Vegetation Index (NDVI) and the Normalized Difference Red-Edge index (NDRE). While these indices are related, NDVI is more commonly used to assess canopy cover. However, NDVI can saturate at higher leaf area index values, potentially limiting its effectiveness in dense vegetation. NDRE is considered more sensitive to plant nitrogen and chlorophyll status because it incorporates information from the spectral inflection point along the RE region [106,107,108].
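Both indices share the same normalized-difference form; as a minimal illustration (the function names and reflectance values below are ours, for demonstration only):

```python
# Minimal sketch: normalized-difference indices from the three RapidScan
# bands; reflectance inputs and values are illustrative.
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - RED) / (NIR + RED)."""
    return (nir - red) / (nir + red)

def ndre(nir: float, red_edge: float) -> float:
    """Normalized Difference Red-Edge: (NIR - RE) / (NIR + RE)."""
    return (nir - red_edge) / (nir + red_edge)

print(ndvi(0.45, 0.08))  # ~0.70, a dense green canopy
print(ndre(0.45, 0.30))  # ~0.20
```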
Spreadsheet data were managed using MS Excel Version 2401 (Microsoft Corporation, Redmond, WA, USA), and statistical analysis was conducted in JMP 15.2.0 (SAS Institute, Cary, NC, USA) software, with a permutation ANOVA technique [109] implemented in the R programming language (R Core Team 2021) [110] as aovperm from the permuco 1.1.3 package [111].
The custom lightbox design (Figure 1) was based on a modification of the foundational 2001 approach of Richardson and Karcher [92,93] and the 2016 design of Zhang et al. [94] that incorporates LED lights, each of which employs a portable photographic studio for digital image-based phenotyping of turfgrass. The rudimentary prototype lightbox used in this paper measured 57 (L) × 40 (W) × 74 (H) cm. It was constructed from 0.5 cm thick repurposed cardboard, assembled with tape, and equipped with an internal horizontal image backdrop that fit flush around the 16 cm diameter lysimeter top (182.4 cm2). The camera was positioned with the lens protruding into a 24 (L) × 24 (W) × 14 (H) cm upper box containing LEDs. The camera lens was 27 cm distant from the lysimeter and pointed downward from the top of the box through an 11 (L) × 20 (W) × 6 (H) cm polystyrene camera cradle with a 7 cm hole, 6 cm deep, that fit the form of the lens. White printer poster paper was used as a generic, hue-neutral light reflector to cover the inside surfaces of the box. The box was wide enough that the camera view of the lysimeter top and background would not extend to its sides; therefore, only the lysimeter top and white backdrop were in the camera view. For each image sample, the box and one lysimeter were placed on a 60 (L) × 55.5 (W) cm light-colored paper poster board, which was positioned on the greenhouse bench to provide a consistent material foundation.
Photographic illumination was provided by an inexpensive generic 12-volt light-emitting diode (LED) strip (Desvorry brand) consisting of 300 individual 6000 K white lights. The light strip was affixed to the top of the box inside the perimeter, around and just above the camera lens. Many LED lights were used to support light diffusion and reduce shadows. Light emission was measured by a pyranometer positioned at the height of the lysimeter top inside the lightbox. The low-cost light source proved inadequate for sustained use: it decreased in intensity from 16 to 8 W m−2 over 360 s, as described by Equation (1),
(1)
(coefficient of determination, R2 = 0.95) with a steady state of 8 W m−2 after 360 s (standard deviation, SD = 0.537). Changes in lighting during LED activation were likely due to temperature effects as the LEDs warmed rapidly. Therefore, the LEDs were only activated immediately before each lysimeter image sample and then turned off to mitigate fluctuations in brightness. This approach reduced measured illumination variance, resulting in an average of 16 W m−2 (SD = 0.482) across twelve test runs. Each test run consisted of 7 to 15 s of illumination with 60 s of de-energized time in between. This scenario was representative of the typical timing for lysimeter data collection of just over one minute per lysimeter.

The camera was set to “manual” mode for both acquisition and exposure, with no exposure bias, no flash, and “pattern” focus metering. “Standard color” was selected with no “Active D-Lighting”. The camera exposure was set to ISO 160 with an aperture of f/3.5 and a shutter speed of 1/320 s. An 11 mm focal length (equivalent to 29 mm in 35 mm-format terms) was used, and the adjustable zoom lens was taped in position. A custom white balance was sampled and recorded on the camera using the white poster paper placed inside the box and illuminated by the LEDs. The camera viewfinder displayed a circular graphic that was about the same size as the lysimeter top when visualized on the screen. This feature was used to verify the straightness of the camera angle and support geometric consistency between images. The camera display helped verify each time an image was taken and ensured that no settings were accidentally changed between image collections. A custom remote trigger was used to manually activate automatic focus and acquire images. High-resolution 4608 × 3072-pixel JPEG and 4548 × 3012-pixel Nikon NEF (RAW) image files were obtained (approximately 5.9 and 12.3 MB in size, respectively). This equated to a measurement resolution close to 11 pixels per mm. Three 24-bit sRGB standard color space image replicates and one RapidScan composite reflectance measurement were taken consecutively for each lysimeter sample. Average values obtained from the imagery were used for comparison with the individual VQ and NDVI lysimeter point measurements. Lysimeters were manually imaged in the morning, approximately every other week for 8 weeks. Chemical ice packs wrapped in microfiber cloth were employed to cool the camera between pictures and avoid overheating. Care was taken to position grass leaves above the imaging backdrop and in view of the camera, and any loose leaves were removed from the imaging backdrop when changing lysimeters.

Image processing was conducted on a Dell Latitude 5420 laptop with an i7-1185G7 1.8 GHz CPU and 16 GB of RAM (Dell Technologies Inc., Round Rock, TX, USA) running the Microsoft Windows 11 Pro 22H2 64-bit OS (Microsoft Corporation, Redmond, WA, USA). This computer represents a typical consumer-grade product, running Python 3.12 (Python Software Foundation, Beaverton, OR, USA) [112], which is commonly used in academic settings. Jupyter Lab and Notebook application tools (Project Jupyter, New York, NY, USA) were run using an Anaconda 2.5.0 installation (Anaconda, Inc., Austin, TX, USA) and included the Pandas, NumPy, SciPy, OpenCV, PIL, Scikit-Image, Matplotlib, and Seaborn libraries.
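As a small example of this stack in use, the illumination warm-up decay described by Equation (1) can be fitted with SciPy. The exponential model form and the readings below are assumptions for illustration, not the study's fitted equation or data:

```python
# Sketch: fitting an assumed exponential decay to pyranometer readings.
# The model I(t) = I_ss + dI * exp(-k t) is an assumption; the paper
# reports only the 16 -> 8 W m^-2 decline and R^2 = 0.95.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, i_ss, i_delta, k):
    return i_ss + i_delta * np.exp(-k * t)

t = np.array([0, 60, 120, 180, 240, 300, 360])        # s (hypothetical)
i = np.array([16.0, 12.8, 10.9, 9.7, 8.9, 8.4, 8.1])  # W m^-2 (hypothetical)

popt, _ = curve_fit(decay, t, i, p0=(8.0, 8.0, 0.01))
residuals = i - decay(t, *popt)
r2 = 1 - np.sum(residuals**2) / np.sum((i - i.mean())**2)
print(popt, r2)
```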
When RAW imagery was used in the Python process, files were initially converted to TIFF (Tagged Image File Format) without compression using Irfan Skiljan’s IrfanView 4.59 (IrfanView, Lower Austria, Austria), which resulted in a file size of 40.1 MB. All imagery was converted to an 8-bit-per-channel memory depth when processing in Python to reduce computational overhead. An 8-bit-per-channel image can still represent up to 16.7 million colors using 0–255 value tuples. It took approximately 60 s to process a single image file on the small laptop computer, regardless of the file type.
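For readers reproducing the pipeline, a sketch of the bit-depth reduction step (the file name is illustrative, and the 16-bit branch assumes a 16-bit-per-channel TIFF was written by the converter):

```python
# Sketch: load a TIFF (from NEF conversion) and reduce to 8 bits per
# channel before color-space operations; the file name is illustrative.
import cv2
import numpy as np

img16 = cv2.imread("lysimeter_001.tif", cv2.IMREAD_UNCHANGED)
if img16.dtype == np.uint16:
    img8 = (img16 / 257).astype(np.uint8)  # 65535 / 255 = 257
else:
    img8 = img16.astype(np.uint8)
print(img8.shape, img8.dtype)
```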
Color correction was used to ensure color accuracy [113] and was performed, before application of lens distortion correction, using the PerfectColor 1.0 program from Evan Lund of Evens Oddities.
Lens distortion correction was performed after color correction using the Python Discorpy 1.6.0 package, following its online tutorial.
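The Discorpy calibration workflow follows its own documentation and is not reproduced here; as a generic illustration of radial lens-distortion correction, an OpenCV-based sketch (the camera matrix and distortion coefficients are placeholders, not this study's calibration values):

```python
# Sketch: radial lens-distortion correction with OpenCV. This is an
# alternative illustration (the study used Discorpy); the camera matrix
# and coefficients below are placeholders.
import cv2
import numpy as np

img = cv2.imread("lysimeter_001_cc.tif")
h, w = img.shape[:2]
camera_matrix = np.array([[w, 0, w / 2],
                          [0, w, h / 2],
                          [0, 0, 1]], dtype=np.float64)
dist_coeffs = np.array([-0.05, 0.01, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3
undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)
cv2.imwrite("lysimeter_001_cc_lc.tif", undistorted)
```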
Color classification areas were determined with pixel-value-based binary masks used to segment regions of interest in the original image (Figure 2). The selected classification areas were analyzed as fractions of the total image area. They represented a chosen range for the total living plant material cover, the percentage of yellow senescent or chlorotic plant material, and a generic green color plant material range. Details are provided below and in Table 1. The bespoke living plant material cover segment was used to calculate discrete image color qualities for that fraction. Individual color scalar components were generated across several color spaces. Many calculations were made to evaluate vegetation indices from the existing literature, and custom relationships were also formulated based on previous linear correlation with NDVI and green cover area in separate imagery. Due to the large number of different calculated terms evaluated (280), only the most relevant metrics that best correlated with VQ and NDVI were selected for reporting in this paper. Although NDVI was collected at the same time as the image metrics, VQ was not. There was an average difference of 2.2 days (SD = 0.92) between the assessment of VQ and the collection of image metrics.
The RGB color space is common in imagery [115], where the R term indicates the intensity of red colors, the G term the intensity of green colors, and the B term the intensity of blue colors. The three terms added together indicate a brightness value. The Hue–Saturation–Value (HSV) color space is derived from RGB values. It separates the color components into a single angle of Hue (H) ranging from 0 to 360 degrees (or rescaled to other value ranges). Saturation (S) indicates the purity or intensity of the Hue color, and Value (V) indicates the linear brightness. Saturation and Value are typically measured on a scale of 0–100 (0–1, or 0–255 for 8-bit imagery). A Value of 0 represents black, while a Saturation of 0 indicates gray without any color. Percent living plant cover classification segmentation (%C) was derived using the HSV color space, where Hue (on a scale of 0–179) ranged from ≥27 to ≤90, Saturation (on a scale of 0–255) ranged from ≥60 to ≤255, and all brightness Values were included (on a scale of 0–255). The generic green fraction (%G) included Hue ≥ 30 and ≤ 60. Green fractional cover was previously reported by Casadesus to support wheat research [116]. The yellow area classification (%Y) involved Hue ≥ 16 to ≤ 26. Both %G and %Y used the same Saturation and brightness Value designations as %C (Table 1).
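A minimal sketch of this segmentation with OpenCV, using the Table 1 ranges (the area basis here is the whole frame, whereas the study scaled fractions to the lysimeter top area; the file name is illustrative):

```python
# Sketch: %C, %Y, and %G fractional cover from the Table 1 HSV ranges
# (OpenCV scaling: H 0-179, S and V 0-255).
import cv2
import numpy as np

img = cv2.imread("lysimeter_001.jpg")            # OpenCV loads as BGR
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

ranges = {
    "%C": ((27, 60, 1), (90, 255, 255)),
    "%Y": ((16, 60, 1), (26, 255, 255)),
    "%G": ((30, 60, 1), (60, 255, 255)),
}
total = img.shape[0] * img.shape[1]
for name, (lo, hi) in ranges.items():
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    print(name, 100 * cv2.countNonZero(mask) / total)
```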
The Python-based image analysis used in this paper was modeled on the approach exemplified in the TurfAnalyzer 1.0.4 (TA) software. TA is a freely available software tool developed by Douglas Karcher in collaboration with Pennington Seed, Inc. (Madison, GA, USA) and Nexgen Turf Research LLC (Albany, OR, USA) [117], based on the progressions of Karcher and Richardson [35,93], who originally used SigmaScan macros to measure turfgrass. It enables the analysis of JPEG images for percent green cover and the dark green color index (DGCI) [118] and has been used in many experiments [119,120,121,122,123,124,125]. The default color values for TA were included in the Python analysis. These values were determined by converting the full Hue range of 1–360 and the Saturation and brightness Value ranges of 0–100 in TA to a Hue range of 0–179 and Saturation and brightness Value ranges of 0–255 for the OpenCV commands, respectively. To better understand the computation process and verify data quality, calculated image values such as mean cover area classifications and DGCI were compared between the two software outputs and found to be equal. However, the more intricate Python approach was selected in this research for its enhanced process control, the capability to conduct full-resolution computations on TIFF format files, the option to export individual pixel values, the ability to export additional statistical values for each image, the need for additional calculations involving customized relationships between metrics, and the option to visualize and export individual values, histograms, or index calculations as separate charts.
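For reference, a sketch of a DGCI computation over the %C segment, using the commonly cited Karcher and Richardson formulation; whether this exact scaling matches TA's internals is an assumption:

```python
# Sketch: DGCI = [(Hue/60 - 1) + (1 - S) + (1 - V)] / 3, with Hue in
# degrees, over the %C living-cover segment.
import cv2

img = cv2.imread("lysimeter_001.jpg")
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (27, 60, 1), (90, 255, 255)) > 0  # %C segment

h = hsv[..., 0].astype(float) * 2.0   # OpenCV Hue (0-179) -> degrees
s = hsv[..., 1].astype(float) / 255.0
v = hsv[..., 2].astype(float) / 255.0

dgci = ((h / 60.0 - 1.0) + (1.0 - s) + (1.0 - v)) / 3.0
print(dgci[mask].mean())
```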
A comprehensive range of Hue angles was selected for the fraction of living cover area (%C) classification segment, to include lower quality yellow-green living plant material through the range of green color to higher-quality green living plant material and into any blue-green material (although no blue-green plant material was present). This broad range was chosen to be inclusive of all the living plant material. Selection of the generic green color area (%G) centered around hue angle 45 and was intended to select only healthy vegetation with high chlorophyll and nitrogen contents. The fraction of senesced yellow plant material (%Y) selection included color values below those of the %C, but also excluded the shades of red and brown colors which indicated soil (the threshold was positioned between the color overlap of brown leaves and brown soil). The three classified color fractions included all brightness Values but used a Saturation threshold of 24% to ensure that selected pixels contained a color intensity which excluded the white background and bright reflection points. These classification parameters were derived from an iterative process of manual selection and evaluation detailed in the Supplemental text.
Figure 2a displays the original 2D view image sample for a 5 cm mow height and low water treatment. The hue-neutral white background is adjacent to the lysimeter perimeter, with grass leaves extending outward. Figure 2b depicts the HSV-based primary color segmentation of all living plant cover area. This comprehensive range of yellow-green to blue-green %C classification accurately includes all illuminated living plant material and was used as the basis for determining color qualities in subsequent calculations. Figure 2c shows the %Y fractional area of senescence with plant relevant Hue angles that are below the %C parameter but above the soil. Figure 2d shows the %G cover segmentation of healthy plant material.
This paper evaluated Guijarro’s RGB-based COMB2 calculation from [126], where three vegetation indices were combined for a model intended to be more sensitive to vegetation changes and less affected by soil background. It uses the Excess Green Index, which is designed to enhance detection of green vegetation, Equation (2):
$$\mathrm{ExG} = 2g - r - b\tag{2}$$
from [127], where r, g, and b values are normalized red, green, and blue; the Color Index of Vegetation Extraction index is designed to enhance green color for vegetation segmentation, Equation (3):

$$\mathrm{CIVE} = 0.441\,r - 0.811\,g + 0.385\,b + 18.78745\tag{3}$$
from [128]; and the Vegetation Extraction Index is designed to emphasize green reflectance, Equation (4):

$$\mathrm{VEG} = \frac{g}{r^{a}\,b^{\,1-a}}\tag{4}$$
from [129], where r, g, and b values are normalized red, green, and blue, and the alpha value a = 0.667 was taken from [130] as a generic camera factor. These indices are combined to create Equation (5):

$$\mathrm{COMB2} = 0.36\,\mathrm{ExG} + 0.47\,\mathrm{CIVE} + 0.17\,\mathrm{VEG}\tag{5}$$
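The pipeline for Equations (2)–(5) can be sketched in Python; the chromatic normalization and the file name here are illustrative, and the CIVE constants and COMB2 weights are the commonly reported values for these indices:

```python
# Sketch of Equations (2)-(5) on chromatic (normalized) r, g, b planes.
import cv2
import numpy as np

img = cv2.imread("lysimeter_001.jpg").astype(float)
b, g, r = cv2.split(img)                    # OpenCV channel order is B, G, R
total = r + g + b + 1e-9                    # avoid division by zero
rn, gn, bn = r / total, g / total, b / total

exg = 2 * gn - rn - bn                                          # Eq. (2)
cive = 0.441 * rn - 0.811 * gn + 0.385 * bn + 18.78745          # Eq. (3)
veg = gn / (np.power(rn, 0.667) * np.power(bn, 0.333) + 1e-9)   # Eq. (4)
comb2 = 0.36 * exg + 0.47 * cive + 0.17 * veg                   # Eq. (5)
print(float(comb2.mean()))
```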
Color spaces beyond the standard RGB and HSV were used. CIE 1976 L*a*b* (CIELAB) values were calculated with OpenCV, and individual color and brightness scalars were evaluated separately, as well as in relation to each other using various equations. The CIELAB coordinate system was adopted by the International Commission on Illumination (CIE) in 1976 [131] as a color space with increased uniformity relative to human visual perception [132,133]. The CIELAB color space is useful in many applications [134,135,136,137,138], with colors that use the standard observer model for device independence. CIELAB achieves increased human perceptual uniformity because equal Euclidean distances in the color space correspond to approximately equally perceived color differences. The CIELAB L* term represents lightness, the a* term represents a green-to-red color dimension axis, and the b* term represents a blue-to-yellow color dimension axis. This paper uses the color opponent axis ratio of b* to a* (BA), intended to emphasize plant health indications of green and yellow pigment status; the authors are not aware of other phenotyping papers using this ratio, although Wanono [139] investigated the ratio of a* to b* to detect leaf nitrogen. A higher BA ratio could signify chlorosis, senescence, or water stress, while a lower value could indicate the presence of chlorophyll and nitrogen or increased water use efficiency.
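A sketch of the BA computation over the %C segment; because the paper does not specify whether per-pixel ratios or a ratio of segment means was used, segment means are shown as one plausible choice:

```python
# Sketch: CIELAB b*/a* (BA) over the %C segment. For 8-bit input, OpenCV
# stores a* and b* offset by 128.
import cv2

img = cv2.imread("lysimeter_001.jpg")
lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (27, 60, 1), (90, 255, 255)) > 0  # %C segment

a_star = lab[..., 1].astype(float) - 128.0
b_star = lab[..., 2].astype(float) - 128.0
ba = b_star[mask].mean() / a_star[mask].mean()
print(ba)
```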
An additional image-based equation was created to compare with standard DGCI. The unconventional Hue difference illumination ratio, or HSVi, is shown in Equation (6):
$$\mathrm{HSVi} = \frac{\left(H - \frac{S}{V}\right) + c_{1}}{c_{2}}\tag{6}$$
This equation utilizes the HSV color space, where the simple ratio of Saturation to Value is subtracted from Hue, and the result is shifted and scaled (by the constants c1 and c2) to fall roughly into a range of 0 to 1. By considering the Saturation-to-Value ratio, we capture a relative color intensity: as brightness increases, the Saturation impact is reduced. A higher ratio indicates more vivid color relative to brightness. This nuanced HSVi approach modulates the Hue angle as a function of color purity and attenuates the positive effect of Hue using lighting to discriminate minor plant features. Context is therefore important for interpretation, because a larger HSVi result may indicate healthy but less dense vegetation with higher reflectance, while lower values may indicate stress or the presence of very dark green vegetation. The HSVi approach offers a color evaluation related to DGCI, but by subtracting the color-purity-to-brightness ratio, it could also help resolve more subtle or mixed canopy features.
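A sketch of the HSVi idea follows; the shift and scale constants are placeholders, since the exact values used in the paper are not reproduced in this text:

```python
# Sketch of HSVi: Hue minus the Saturation/Value ratio, shifted and scaled
# toward a 0-1 range. C1 and C2 are placeholder constants, not the paper's.
import cv2
import numpy as np

img = cv2.imread("lysimeter_001.jpg")
hsv8 = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv8, (27, 60, 1), (90, 255, 255)) > 0  # %C segment

h = hsv8[..., 0].astype(float) / 179.0                 # Hue normalized 0-1
s = hsv8[..., 1].astype(float) / 255.0
v = np.maximum(hsv8[..., 2].astype(float) / 255.0, 1e-9)

C1, C2 = 1.0, 2.0                                      # placeholder constants
hsvi = (h - s / v + C1) / C2
print(hsvi[mask].mean())
```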
Calculated metrics were summarized in tables and related to the experimental class structure. To test the effects of experimental treatments, response variables were assessed using a robust permutational multivariate analysis of variance (permANOVA). Non-parametric permutation ANOVA models are useful in environmental analysis because they better handle complex data [140,141,142]. Neither the VQ nor the NDVI data in this paper rely on camera technology; therefore, the previously reported VQ data and the NDVI measurements were chosen as independent references for comparison with the image-based data [36,143,144]. An NDVI time series overview for the original experiment is presented for the first time in this paper, and time series data for two image-based color metrics are offered to show comparative differences in behavior over the course of the experiment.
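For readers without JMP or R, a simplified univariate analogue of a one-factor permutation F-test can be sketched in Python (toy data; the study's models used aovperm from permuco and included the full factorial structure):

```python
# Sketch: one-way permutation test of a treatment effect.
import numpy as np

rng = np.random.default_rng(0)

def f_stat(values, labels):
    """Classic one-way ANOVA F statistic."""
    groups = [values[labels == g] for g in np.unique(labels)]
    grand = values.mean()
    ssb = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ssw = sum(((g - g.mean()) ** 2).sum() for g in groups)
    dfb, dfw = len(groups) - 1, len(values) - len(groups)
    return (ssb / dfb) / (ssw / dfw)

values = rng.normal(size=36) + np.repeat([0.0, 0.3, 0.8], 12)  # toy data
labels = np.repeat([0, 1, 2], 12)
observed = f_stat(values, labels)
perms = [f_stat(rng.permutation(values), labels) for _ in range(5000)]
p = (np.sum(np.array(perms) >= observed) + 1) / (len(perms) + 1)
print(observed, p)
```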
Image-based metrics were computed and compared with reference measurements to test the effects of file format with standard JPEG compression (JPG), NEF (RAW) to lossless TIFF conversion (TIF), and image adjustments for geometric lens distortion correction (LC) and color correction (CC). Pearson’s correlation coefficient (r) and coefficient of determination (R2) values were calculated, and the image metrics that best corresponded to each reference measurement were selected for reporting. A sensitivity analysis of calculated metrics was conducted to assess the impact of file format and image corrections. Significant differences were tested, and treatment effects on image metrics, VQ, and NDVI were compared, using the non-parametric Kruskal–Wallis median test and Dunn’s ranked test with JPEG as the control. In a similar approach, the coefficient of variation was evaluated to investigate the effects of file format and image correction on replicate images. The authors are unaware of other published works detailing the sensitivity of file format and image corrections (JPEG, TIFF, LC, and CC) on phenotypic image calculations for turfgrass phenotyping.
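An analogous Kruskal–Wallis and Dunn's test workflow can be sketched in Python (the study used JMP and R; the scikit-posthocs package and the toy data frame here are illustrative):

```python
# Sketch: Kruskal-Wallis across format/correction groups, then Dunn's
# post hoc test; data values and group labels are hypothetical.
import pandas as pd
from scipy.stats import kruskal
import scikit_posthocs as sp

df = pd.DataFrame({
    "group": ["JPG"] * 3 + ["TIF"] * 3 + ["JPG LC CC"] * 3,
    "dgci":  [0.28, 0.29, 0.27, 0.35, 0.36, 0.34, 0.30, 0.31, 0.29],
})
samples = [g["dgci"].values for _, g in df.groupby("group")]
print(kruskal(*samples))
print(sp.posthoc_dunn(df, val_col="dgci", group_col="group"))
```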
3. Results
The %Y quantifies a pixel-based segment of percent cover across the lysimeter top area for yellow-colored senescent plant material. Lower values of yellow can indicate healthy and higher-quality turfgrass. The %G area pixel-based fraction indicates the presence of healthy plant biomass. DGCI describes the color of turfgrass, where higher values indicate a darker green turfgrass with increased chlorophyll pigments [145]. The HSVi calculation is akin to DGCI in that it also describes plant color. HSVi is nuanced, however, as higher values can indicate a higher quality of green for turfgrass, but context is important for interpretation since it is intended to capture subtle variations in vegetation density, stress levels, and chlorophyll content. The BA ratio of the b* and a* color scalars also represents a nuanced relationship, where higher values indicate more yellowness relative to redness and lower values indicate more blueness relative to greenness. Therefore, pigmentation composition or environmental response may be detected with BA. These image metrics were selected because they showed linear associations with reference measurements and are either traditional plant health indicators or new calculations which may offer additional insight into plant health.
3.1. Linear Correlation between Variables
Linear correlation and percent variance explanation were analyzed for the image calculations that related strongest to the reference measurements. This involved the continuous variables visual quality (VQ), grass clipping production (mg d−1), water use (mm d−1), NDVI, NDRE, NIR, RED, and RE (Table 2).
Results show that %Y cover area correlated most strongly with VQ, DGCI correlated most strongly with clipping production, COMB2 correlated most strongly with water use, %G correlated most strongly with NDVI as well as RED, BA correlated most strongly with NDRE and RE, and HSVi correlated most strongly with NIR. Correlation coefficient absolute values greater than 0.8, 0.6, and 0.3 indicate strong, moderate, and weak linear relationships, respectively, while coefficient of determination values above 0.8, 0.6, and 0.3 indicate a strong, moderate, and weak explanation of variance, respectively. The correlations presented illustrate how imagery data can be useful to resolve phenotypic responses [105]. However, there was no clear trend in the correlations regarding effects of the image corrections or file format.
Data suggest that selected image-based calculations correlate with VQ, clipping production (mg d−1), water use (mm d−1), and spectral reflectance values. However, the image format and correction process only sometimes improved the resultant correlations. The uncorrected TIF and the JPG LC for the %G classification area exhibited the strongest correlations with NDVI and RED, respectively, but the differences in R2 from the uncorrected JPG were a nominal 0.000 and 0.001, respectively. The relationships of BA with NDRE and RE showed the strongest correlations, but R2 improved by only 0.004 for NDRE and 0.003 for RE over the uncorrected JPG. %Y and VQ were similar: the JPG LC CC showed the strongest correlation, but the R2 was only 0.005 higher than that of the uncorrected JPG. DGCI TIF CC correlated best with productivity, with a 0.098 increase in R2 from the correction. Likewise, the HSVi TIF CC correlated most strongly with NIR, but the R2 increased by only 0.014. One metric, COMB2, showed a large improvement with TIF LC CC: it had the strongest correlation with water use, exhibiting an R2 value 0.310 higher than its uncorrected JPG counterpart. Therefore, it would be improbable to apply none, all, or any individual correction procedure and consistently achieve the highest correlation results across this variety of image-based calculations. When correction was optimized, %Y showed a 0.17 higher R2 value than DGCI for VQ, %G showed a 0.07 higher R2 value than DGCI for NDVI, HSVi showed a 0.09 higher R2 than DGCI for NIR, and BA showed a 0.06 higher R2 than DGCI for NDRE, suggesting that the DGCI calculation did not always explain the most variance in the reference measurements regardless of correction.
3.2. Effects of Image Corrections on Calculated Metrics
The TIFF file format and three correction types were compared against the uncorrected JPEG format to assess their impact on median values for six image-based metrics, as shown in Table 3. Results were mixed regarding the significant effects and the magnitude of the correlations with reference measurements. There was a significant difference among the median values of all calculated metrics except %Y, even though the JPG LC CC shifted the median value of %Y by 1.6%. The TIF format without correction shifted the %G median by 3.7%, but the shift was not significant. Similarly, JPG LC shifted the BA median by 0.9%, but this was not significant. However, TIF LC did significantly shift the medians (24%) of DGCI and HSVi, by 0.075 and 0.102, respectively. Most striking was the effect of corrections on COMB2, where TIF LC CC significantly shifted the median by −13.97 (or 133.7%). For four of the six metrics, the corrections that resulted in the highest or lowest median values did not correlate most strongly with their associated reference measurements. However, DGCI TIF CC resulted in the highest median value and correlated most strongly with productivity (Table 2), while COMB2 TIF LC CC resulted in the lowest median value and correlated most strongly with water use (Table 2). Median values did tend to improve along their relative scales with CC. However, there was no clear trend across all the metrics. Therefore, these data support the idea that neither the file format nor any specific image correction consistently improved model power for these image-based calculated metrics.
3.3. Detection of Experiment Treatment Effects
A robust non-parametric permutation-based ANOVA was chosen to handle the skewed and complex environmental variables. Results indicate the effect sizes and likelihood of significant treatment effects on image metrics and reference measurements. Mowing height had a significant impact on all variables and was primarily explained by VQ. The irrigation effect was significant for all variables and was also primarily explained by VQ. All variables were significantly affected by Date, where the HSVi and DGCI prominently explained the Date effect, followed by BA. A substantial portion of the overall effect size for NDVI, %G, and %Y was also attributed to the Date, indicating the detection of the autumn seasonal trend towards biological dormancy. The interaction between mowing height and water was significant for all variables except DGCI and was best explained by COMB2, although this interaction category had the most unexplained variance overall. The interaction effect of mowing height by date was significant for all variables except VQ and DGCI and was best explained by COMB2 with a reduced but still large effect size. The irrigation-by-date interaction was significant for all variables except DGCI and HSVi, where VQ provided the best explanation followed by %Y. The three-way interaction effect was significant for all variables except VQ and %G, but was explained most by COMB2, indicating the most sensitivity to experimental treatment interactions with this combination index variable.
These results suggest that the different metrics play distinct roles in resolving various aspects of the biological information signal. The mean values were significantly different for all treatment effects, suggesting potential utility for detecting variations in irrigation quantity, mowing height, changes over time, and their interactions. The COMB2 variable may have outperformed NDVI with higher categorical effects significance, but the ηp2 (and F-values) were highest for %Y. The reduced effect sizes for the interaction of mowing height and irrigation suggest that the dynamics of this treatment merit further resolution. Additional tables of the p-values and F-values used to create Table 4 are provided (in the Supplementary Materials) to offer more precise probability values of significant differences and additional model fit measures.
3.4. Effects of Image Corrections on Replicates
Three replicate images were initially taken in rapid succession for each of the 316 lysimeter samples. A coefficient of variation (CV) was calculated for each image metric and with each image correction, and the CV values were averaged across all collections for each file type and correction. The average values of CV were compared to the correlations (Table 2) and ANOVA (Table 3) results. The CV values are presented in Table 5.
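A sketch of this replicate-CV computation with pandas (the file and column names are illustrative):

```python
# Sketch: coefficient of variation across the three replicate images per
# lysimeter sample, averaged per file format/correction.
import pandas as pd

df = pd.read_csv("image_metrics.csv")  # columns: sample, correction, dgci, ...
cv = (df.groupby(["correction", "sample"])["dgci"]
        .agg(lambda x: 100 * x.std(ddof=1) / x.mean())  # CV, %
        .groupby("correction")
        .mean())
print(cv)
```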
Significant differences in image correction effects were observed in the average CV values between replicated images for all metrics except %Y and BA, using the Kruskal–Wallis test of medians and Dunn’s test with JPG as the control. The JPG LC CC %Y correlated most strongly with VQ (Table 2), even though TIF LC CC had the lowest CV. The %G TIF data correlated best with NDVI and RED reflectance (Table 2), even though TIF LC CC had the lowest CV and CC noticeably reduced the CV. The DGCI TIF CC correlated most strongly with productivity (Table 2), even though the uncorrected JPG and TIF formats showed lower CV values and LC visibly increased the CV. The HSVi calculation in TIF CC imagery correlated most strongly with NIR reflectance (Table 2), even though the TIF and JPG had lower CV values and CC generally increased the CV. The JPG LC BA calculation correlated most strongly with NDRE and RE reflectance (Table 2), even though the uncorrected JPG had the lowest CV. The COMB2 calculation of TIF LC CC imagery correlated most strongly with water use, even though the uncorrected TIF had the lowest CV and corrections generally produced a higher CV.
Image corrections significantly affected the median CV of image replicates in four out of the six metrics. However, the median CV of the image replicates for each calculated metric did not explain the subsequent model fit. Higher values of CV did not result in reduced explanatory capacity. Yet the uncorrected JPEG file format showed a CV lower than the overall average for each metric. These results demonstrate a weak relationship between the dispersion in replicates and subsequent correlation with reference measurements. Therefore, the median dispersion of image values from replicates did not indicate a better model fit or better explain the reference measurements for these six calculated metrics. The analysis of CV also did not consistently identify a specific image format or correction method that led to higher or lower data dispersion. This result supports the notion that the effects of the image corrections applied are present, but they are difficult to parameterize and explain across multiple and varied image-based calculations.
3.5. Time Series Charts for Three Metrics: NDVI, %Y, and COMB2
NDVI was not previously reported by Hejl [105] and is shown here (Figure 3) to illustrate the utility of this common measurement, which was used as a reference for evaluating the imagery data in this paper. Treatments are displayed in a time series throughout the autumn season experiment. A general downward trend in NDVI was evident, but several treatments exhibited increased NDVI over the first third of the experiment before decreasing thereafter. At the beginning of the experiment, the shorter mowing heights inherently had less green biomass present, resulting in lower NDVI values compared to taller heights. As the experiment progressed and the irrigation deficit treatment intensified, shorter mowing heights were able to sustain their green biomass presence longer. In the lowest irrigation treatment, the relative NDVI values ranged from largest to smallest, corresponding to the heights from shortest to tallest. The middle irrigation treatment also exhibited this effect, though less pronounced, and its NDVI values became comparable by the end of the experiment. The full irrigation treatment exhibited the highest NDVI values overall, and taller mowing heights were more effective in preserving green biomass when given adequate water. The NDVI values for the fully irrigated plants were generally arranged from taller to shorter heights, except for the shortest height, which had the second highest NDVI values and was comparable to the tallest full irrigation treatment. Results suggest that green biomass presence may be prolonged under reduced water availability with shorter mowing heights. Additionally, a moderate reduction in water availability may have a limited effect on NDVI regardless of mowing height. However, the largest green biomass as measured by NDVI can be expected with higher mowing heights and full demand replacement water applications.
VQ was previously reported by Hejl in 2024 [105]. Therefore, a time series of the senesced plant indication %Y area segment is shown as a proxy relating to the experimental treatments, as calculated from the JPG LC CC image set (Figure 4). The %Y had the highest correlation with VQ, with an R2 of 0.79 (Table 2), and demonstrated the highest effect size overall (Table 4). The %Y was scaled as a percentage of the top inside area of the lysimeter. Values could exceed 100% if yellow-colored leaves covered the lysimeter area and extended beyond the perimeter of the lysimeter top. The classification of the %Y showed a more consistent trend than the NDVI, increasing for all treatments across the experimental period. However, shorter mow heights generally resulted in less yellow presence. Unexpectedly, the 7.5 cm mow height showed more %Y than the 10 cm height for both the highest and middle irrigation levels, indicating a possible microclimate, root system, or nutrient allocation impact. Results suggest that the image-based classification of the fractional yellow area can be an effective metric for assessing visual quality in turfgrass. It can also differentiate the mow height effect when water is significantly reduced.
COMB2 data are presented in a time series (Figure 5) to show the different behavior of this color metric over time, as compared to NDVI and %Y, because it correlated most strongly with water use (Table 2) and showed significant differences for all experimental effects (Table 4). A complex and nuanced response is evident using this combination of vegetation indices. Higher COMB2 values suggest healthier vegetation and more biomass. This can be seen in the first half of the experiment, where values decline as plant quality tended to decrease over the experiment. However, these results are difficult to interpret because, unlike NDVI and %Y (Figure 3 and Figure 4), there are larger COMB2 initial treatment differences that converge as the experiment progressed. This may indicate the effect of season and the lowest irrigation, where the two taller mow heights increased in COMB2 value over the experiment even though their color quality decreased, perhaps as a response to change in canopy density. Likewise, the initial separation of mow height groupings that is evident across the irrigation levels may suggest distinct canopy structures. Another consideration is that the image correction effect was very large for this combined index metric (Table 3), where different corrections substantially changed the correlations with the different reference measurements (Table 2). Nevertheless, the statistical significance of the treatment effects suggests an information contribution from the COMB2 color term with TIF LC CC application, where the effect size for three of the four treatment interaction terms was largest with COMB2.
Additional time series charts for the %C, %G, DGCI, HSVi, NDRE, NIR, RED, and RE metrics are provided in the Supplementary Materials.
4. Discussion
To assist researchers, students, and managers, the authors propose a practical approach that includes accessible camera controls for collecting relevant image-based plant phenotyping data. Locked manual exposure settings that balance a lower ISO, a moderately higher f-stop, and, lastly, a higher shutter speed should be selected based on the lighting intensity of the target environment. It is important to measure and set a customized white balance for the specific spectral intensity of the ambient illumination rather than a pre-set color temperature. It may be acceptable to rely solely on modern camera technology with simple exposure controls and achieve sufficient image data quality to perform basic plant phenotyping for the advancement of scientific knowledge. However, in some instances, such as with the calculation of COMB2 in this paper, image corrections induced effects strong enough to change result interpretations. To overcome possible gaps in knowledge, transparent standardized protocols and detailed metadata are needed that include cross- or independent validations and robust statistical analysis utilizing sample replicates and controls.
Utilizing a simple camera-based plant phenotyping approach, the irrigation and mow height experiment measurements from [105] were used as references for image calculations. Correlations between image calculations and reference measurements verify that a consumer-grade camera with exposure control can resolve phenotypic responses for turfgrass in a greenhouse, and that new calculations with stronger correlations than standard metrics such as DGCI are possible. The highest R2 values were observed in the relationship between VQ and the %Y area classification, and between NDVI and the %G area classification (Table 2). Traditionally, the evaluation of turfgrass has focused only on green color areas. However, our findings reveal that the presence of yellow, which demonstrated congruence with human perception of quality and significant treatment effects, offers insight into assessing the relative presence of green biomass in turfgrass. Casadesus [146] introduced a “Green” and “Greener” segmentation in wheat, incorporating some yellow into the “Green” classification. By introducing a dedicated yellow cover class, such as the %Y metric employed in our study, the yellow stems and senesced plant material are captured and quantified more comprehensively. A more complete classification system that includes the readily observable visual cue of yellow may allow for greater precision in evaluating the composition and health of turfgrass, which could improve phenotyping analyses.
Further research could clarify the value of COMB2 in relation to plant biological status, because the significance of detected treatment effects with image corrections (Table 4) suggests a utility of this color calculation for understanding plant phenotypic traits, yet COMB2 behaved differently than traditional terms like NDVI (Figure 3). Although the COMB2 calculation may provide some phenotypic detection capability and correlated most strongly with water use, it was not powerful enough to predict water use. Similarly, although DGCI correlated most strongly with clipping production (Table 2), the relationship was not robust enough to explain most of the variation in that reference measurement. The BA ratio correlated most strongly with NDRE and RE reflectance, while the novel term HSVi correlated most strongly with the NIR spectral reflectance component (Table 2). Therefore, the calculation of BA or HSVi may be useful to roughly estimate the respective RE or NIR spectral reflectance when only color camera data are available. The BA ratio may serve to emphasize yellowness relative to redness and provide an indication of chlorophyll content or plant health, but the authors did not find reference to this ratio in the plant phenotyping literature, so additional research is warranted. HSVi showed a stronger correlation with NDVI than DGCI (Table 2). Therefore, variations in this illumination-modulated Hue angle determined by HSVi may help detect pigmentation status or tissue health across the biomass area or over time. However, because the authors are not aware of other research using the HSVi image calculation for plant phenotyping, additional research is needed to determine its biological significance.
This paper demonstrates how image acquisition using consumer cameras with simple controls enables not only traditional metrics like green area or DGCI but also facilitates additional calculations which may be valuable in the description of plant phenotypes, such as the bespoke yellow area, HSVi, or BA. Although it may be possible to forgo image corrections and still achieve acceptable results when simple exposure controls are used, the application of appropriate image corrections before the calculation of image metrics will likely generate the most useful information. The optimal combination of file format and image correction improved the correlations with reference measurements (Table 2) and the detection and size of significant treatment effects (Table 4). However, the varied and mixed correction-effect results in this paper suggest that further research is necessary to precisely identify correction effects on individual image calculations relative to phenotypic expression and to prescribe the best methods for maximizing the utility of corrections.
The findings in this paper show that the image corrections applied produced inconsistent outcomes. While correlations with reference measurements generally improved with image corrections, improvements were often minor and sometimes negative. Only %Y showed no significant effect from the TIFF file format or the image corrections, while CC did cause large significant differences in COMB2, BA, DGCI, and HSVi (Table 3). The mean coefficient of variation for replicates was significantly affected by corrections for %G, DGCI, HSVi, and COMB2 (Table 5). The optimal combination of file format and image corrections improved R2 values for five of the six image metrics, but by less than 2%; COMB2, however, improved by 31% (Table 2). This suggests that applying all standard image corrections may not automatically improve plant phenotyping results, and yet failing to implement the best file format and corrections could lead to the oversight of crucial relationships. An improved statistical evaluation using a larger and more diverse set of images and reference measurements, including observation of additional plants beyond TifTuf over a longer period, is needed to gain a deeper understanding of the effects of image adjustments and inform best practices. Furthermore, the inclusion of more robust phenotyping equipment providing thermal, hyperspectral, fluorescence, and structural information would better contextualize the potential of the traditional camera.
5. Conclusions
A consumer-grade camera and a custom-fit lightbox made from rudimentary materials were used to validate a cost-effective method for plant phenotyping. Image-based metrics were calculated using open-source software on a laptop computer and enabled the comparison of human-assessed VQ and active NDVI measurements with image data to determine phenotyping utility. Statistical analysis revealed the highest R2 values for VQ and NDVI with the image-based calculations of %Y and %G areas, respectively. A novel HSV color space calculation, termed HSVi, demonstrated the highest R2 with NIR, while the new CIELAB b* to a* ratio correlated most strongly with NDRE and RE reflectance. Even though DGCI exhibited the highest R2 value for biomass productivity, it did not explain most of the variation. The COMB2 calculation showed significant experimental treatment effects, superior to VQ and comparable to NDVI, but was difficult to interpret. COMB2 correlated most strongly with water consumption but was not adequate to serve as a proxy. %Y displayed increased overall treatment explanatory power for the experimental effects. Guidance on photographic exposure control was provided to improve the quality of image-based phenotyping data, along with a discussion on software options, processes, and limitations in the Supplemental text [147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,182,183,184].
This study indicates that modern camera technology with simple controls is robust enough to empower plant phenotyping for research and commercial applications that involve cover and color quantification. However, the imagery did not fully describe water use or clipping productivity. Despite testing color and lens corrections and the use of an uncompressed image format, no large and consistent improvement in the image-based calculations was observed. This suggests that lossless formats and corrections may not always be necessary for standard plant phenotyping when modern cameras, sufficient illumination, and straightforward exposure controls are utilized. A method to quantify the effects of image corrections relative to plant phenotype is needed to determine their appropriateness for resolving plant traits of interest with sufficient precision. Parameterizing the influence of corrections on image-based calculations for different phenotyping metrics would strengthen the connection between photographic digital information and plant biological processes, standardizing and enhancing phenotyping practices.
Mention of trade names or commercial products in this publication is solely for the purpose of providing specific information and does not imply recommendation or endorsement by the U.S. Department of Agriculture. USDA is an equal opportunity provider and employer.
Conceptualization, M.M.C. and R.W.H.; methodology, M.M.C. and R.W.H.; software, M.M.C.; validation, M.M.C.; investigation, M.M.C. and R.W.H.; resources, R.W.H. and C.F.W.; data curation, M.M.C.; writing original draft preparation, M.M.C.; writing—review and editing, R.W.H. and D.D.S.; visualization, M.M.C.; supervision, R.W.H.; project administration, R.W.H. and C.F.W.; funding acquisition, R.W.H. and C.F.W. All authors have read and agreed to the published version of the manuscript.
Not applicable.
Not applicable.
Datasets are supplied in the Supplementary Materials.
The authors thank Julia Stiles for process quality control; John Heun for engineering support in creating the remote camera trigger; Michael Roybal for information technology support and team leadership with lasting effect; Mitiku Mengistu for helpful discussions; and Sharette Rockholt for experiment technical assistance.
The authors declare no conflicts of interest.
Footnotes
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Figure 1. The lightbox is shown in greenhouse #1 (panel (a), left side) with the camera installed on top. The remote trigger with switch and the 12-volt power supply with 7.5 Ah SLA battery and wires are visible on the left and bottom left side. The lightbox diagram (panel (b), right side) illustrates the placement of LED lights and demonstrates how a lysimeter would be inserted into the box and photographed against the white background.
Figure 2. An example uncorrected lysimeter image and three masked views. The 30% water, 5.0 cm mowing height treatment is shown in an image taken on 10/26/2023 (Week 2), with an associated NDVI of 0.61 and VQ of 7.0 (panel (a), upper left); the segment with 97.8% of the lysimeter area covered in live green material (%C), which yields calculated values of 0.280 DGCI, 0.400 HSVi, and 7.010 COMB2 (panel (b), upper right); 31.1% yellow (%Y) plant cover (panel (c), lower left); and 59.0% green (%G) cover (panel (d), lower right).
Figure 3. NDVI time series chart with NDVI plotted on the Y-axis and date on the X-axis. The experimental treatments are labeled by their percentage of consumptive demand-based irrigation supplied (i = 100, 65, and 30) and their mowing heights (h = 10, 7.5, 5.0, and 2.5 cm). Each treatment is grouped by irrigation level and is uniquely colored, and the line pattern is based on mowing height. NDVI shows changes in time and differences with experimental treatment.
Figure 4. A %Y time series chart is presented, where the image-based yellow color classification segment is plotted on the inverted Y-axis and the date is on the X-axis. The experimental treatments are labeled by their percentage of consumptive demand-based irrigation supplied (i = 100, 65, and 30) and their mowing heights (h = 10, 7.5, 5.0, and 2.5 cm). Each treatment is grouped by irrigation level and uniquely colored, and the line pattern is based on mowing height. Results show change over time and increased treatment separation under the greatly reduced water treatment.
Figure 5. A COMB2 time series chart is presented, where the combination term is plotted on the Y-axis and date is on the X-axis. The experimental treatments are labeled by their percentage of consumptive demand-based irrigation supplied (i = 100, 65, and 30% actual evapotranspiration replacement) and their mowing heights (h = 10, 7.5, 5.0, and 2.5 cm). Each treatment is grouped by irrigation level and uniquely colored, and the line pattern is based on mowing height. Results show reduced change over time, but increased treatment separation when compared to NDVI and %Y (Figure 3 and Figure 4).
Color classification segmentation pixel value ranges used in the Python process.
 | Hue | Saturation | Value
---|---|---|---
%C | 27–90 | 60–255 | 1–255 |
%Y | 16–26 | 60–255 | 1–255 |
%G | 30–60 | 60–255 | 1–255 |
Hue is scaled to 0–179, and Saturation and Value are scaled to 0–255 for fractional living plant cover, yellow, and green as %C, %Y and %G, respectively.
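As a concrete example of how these thresholds are applied, the minimal Python sketch below (assuming OpenCV; file and function names are illustrative) classifies pixels with cv2.inRange and reports cover fractions. Note that the actual process masks the lysimeter area against the white background first, whereas this sketch simply normalizes by all image pixels.

```python
import cv2
import numpy as np

# HSV thresholds from the table above, in OpenCV scaling
# (Hue 0-179, Saturation and Value 0-255).
RANGES = {
    "%C": ((27, 60, 1), (90, 255, 255)),   # fractional living plant cover
    "%Y": ((16, 60, 1), (26, 255, 255)),   # yellow cover
    "%G": ((30, 60, 1), (60, 255, 255)),   # green cover
}

def cover_fractions(image_path):
    """Fraction of pixels falling in each HSV class of RANGES."""
    bgr = cv2.imread(image_path)                 # OpenCV reads BGR order
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    total = hsv.shape[0] * hsv.shape[1]
    fractions = {}
    for name, (lo, hi) in RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))
        fractions[name] = cv2.countNonZero(mask) / total
    return fractions

# e.g., cover_fractions("lysimeter_week2.jpg") -> {"%C": ..., "%Y": ..., "%G": ...}
```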
Pearson’s correlation coefficient (r) and coefficient of determination (R2) values for image metrics and associated reference measurements.
 | | VQ (316) | | mg (280) | | mm (108) | | NDVI (316) | | NDRE (316) | | NIR (316) | | RED (316) | | RE (316) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
 | | r | R2 | r | R2 | r | R2 | r | R2 | r | R2 | r | R2 | r | R2 | r | R2
%Y | JPG | −0.886 | 0.785 | −0.463 | 0.214 | −0.153 | 0.023 | −0.804 | 0.646 | −0.829 | 0.688 | −0.798 | 0.637 | 0.723 | 0.522 | 0.695 | 0.483 |
JPG CC | −0.889 | 0.790 | −0.476 | 0.227 | −0.138 | 0.019 | −0.808 | 0.652 | −0.836 | 0.699 | −0.806 | 0.650 | 0.725 | 0.525 | 0.699 | 0.489 | |
JPG LC | −0.887 | 0.787 | −0.463 | 0.214 | −0.156 | 0.024 | −0.804 | 0.646 | −0.829 | 0.688 | −0.798 | 0.636 | 0.723 | 0.522 | 0.695 | 0.483 | |
JPG LC CC | −0.889 | 0.790 | −0.477 | 0.227 | −0.143 | 0.020 | −0.810 | 0.656 | −0.838 | 0.702 | −0.808 | 0.652 | 0.727 | 0.528 | 0.701 | 0.491 | |
TIF | −0.887 | 0.786 | −0.464 | 0.215 | −0.153 | 0.023 | −0.804 | 0.646 | −0.830 | 0.688 | −0.798 | 0.637 | 0.723 | 0.522 | 0.695 | 0.483 | |
TIF CC | −0.889 | 0.790 | −0.476 | 0.227 | −0.133 | 0.018 | −0.805 | 0.648 | −0.836 | 0.698 | −0.805 | 0.649 | 0.722 | 0.521 | 0.698 | 0.488 | |
TIF LC | −0.887 | 0.787 | −0.471 | 0.222 | −0.155 | 0.024 | −0.810 | 0.656 | −0.831 | 0.691 | −0.801 | 0.642 | 0.731 | 0.535 | 0.696 | 0.484 | |
TIF LC CC | −0.885 | 0.783 | −0.493 | 0.243 | −0.131 | 0.017 | −0.812 | 0.659 | −0.833 | 0.694 | −0.807 | 0.651 | 0.734 | 0.539 | 0.694 | 0.482 | |
DGCI | JPG | 0.744 | 0.553 | 0.600 | 0.360 | −0.109 | 0.012 | 0.765 | 0.586 | 0.840 | 0.706 | 0.841 | 0.707 | −0.666 | 0.443 | −0.673 | 0.453 |
JPG CC | 0.785 | 0.616 | 0.608 | 0.369 | −0.068 | 0.005 | 0.804 | 0.646 | 0.850 | 0.723 | 0.849 | 0.721 | −0.717 | 0.515 | −0.686 | 0.471 | |
JPG LC | 0.757 | 0.574 | 0.581 | 0.338 | −0.114 | 0.013 | 0.768 | 0.590 | 0.865 | 0.748 | 0.861 | 0.741 | −0.655 | 0.429 | −0.699 | 0.488 | |
JPG LC CC | 0.793 | 0.629 | 0.604 | 0.364 | −0.069 | 0.005 | 0.813 | 0.661 | 0.873 | 0.761 | 0.870 | 0.756 | −0.718 | 0.515 | −0.706 | 0.498 | |
TIF | 0.738 | 0.545 | 0.599 | 0.359 | −0.108 | 0.012 | 0.765 | 0.585 | 0.839 | 0.704 | 0.840 | 0.706 | −0.666 | 0.444 | −0.671 | 0.451 | |
TIF CC | 0.786 | 0.618 | 0.608 | 0.370 | −0.070 | 0.005 | 0.802 | 0.644 | 0.849 | 0.721 | 0.848 | 0.719 | −0.716 | 0.512 | −0.686 | 0.470 | |
TIF LC | 0.508 | 0.258 | 0.419 | 0.176 | −0.203 | 0.041 | 0.455 | 0.207 | 0.521 | 0.271 | 0.529 | 0.280 | −0.387 | 0.149 | −0.414 | 0.171 | |
TIF LC CC | 0.753 | 0.567 | 0.591 | 0.350 | −0.116 | 0.013 | 0.743 | 0.552 | 0.795 | 0.633 | 0.797 | 0.634 | −0.658 | 0.433 | −0.642 | 0.412 | |
COMB2 | JPG | 0.613 | 0.375 | 0.471 | 0.222 | −0.214 | 0.046 | 0.510 | 0.261 | 0.513 | 0.264 | 0.515 | 0.265 | −0.469 | 0.220 | −0.433 | 0.187 |
JPG CC | 0.073 | 0.005 | 0.121 | 0.015 | −0.479 | 0.229 | −0.126 | 0.016 | −0.044 | 0.002 | −0.022 | 0.000 | 0.160 | 0.026 | 0.028 | 0.001 | |
JPG LC | 0.646 | 0.417 | 0.480 | 0.230 | −0.016 | 0.000 | 0.660 | 0.436 | 0.624 | 0.389 | 0.626 | 0.391 | −0.630 | 0.397 | −0.519 | 0.270 | |
JPG LC CC | 0.012 | 0.000 | 0.044 | 0.002 | −0.580 | 0.337 | −0.245 | 0.060 | −0.085 | 0.007 | −0.070 | 0.005 | 0.317 | 0.100 | 0.047 | 0.002 | |
TIF | 0.616 | 0.380 | 0.471 | 0.222 | −0.212 | 0.045 | 0.513 | 0.263 | 0.515 | 0.265 | 0.515 | 0.266 | −0.471 | 0.222 | −0.434 | 0.188 | |
TIF CC | 0.015 | 0.000 | 0.078 | 0.006 | −0.491 | 0.241 | −0.177 | 0.031 | −0.096 | 0.009 | −0.072 | 0.005 | 0.206 | 0.042 | 0.072 | 0.005 | |
TIF LC | 0.647 | 0.419 | 0.426 | 0.181 | −0.082 | 0.007 | 0.614 | 0.378 | 0.633 | 0.401 | 0.624 | 0.390 | −0.553 | 0.306 | −0.530 | 0.281 | |
TIF LC CC | −0.052 | 0.003 | −0.012 | 0.000 | −0.597 | 0.356 | −0.252 | 0.064 | −0.107 | 0.011 | −0.085 | 0.007 | 0.313 | 0.098 | 0.079 | 0.006 | |
%G | JPG | 0.739 | 0.546 | 0.460 | 0.211 | 0.401 | 0.161 | 0.911 | 0.829 | 0.863 | 0.745 | 0.849 | 0.720 | −0.868 | 0.753 | −0.705 | 0.498 |
JPG CC | 0.731 | 0.534 | 0.433 | 0.187 | 0.439 | 0.193 | 0.897 | 0.804 | 0.844 | 0.713 | 0.826 | 0.683 | −0.857 | 0.734 | −0.694 | 0.481 | |
JPG LC | 0.736 | 0.542 | 0.460 | 0.211 | 0.405 | 0.164 | 0.911 | 0.829 | 0.862 | 0.743 | 0.848 | 0.719 | −0.868 | 0.754 | −0.705 | 0.497 | |
JPG LC CC | 0.727 | 0.529 | 0.430 | 0.185 | 0.447 | 0.200 | 0.896 | 0.802 | 0.841 | 0.708 | 0.823 | 0.677 | −0.857 | 0.734 | −0.692 | 0.479 | |
TIF | 0.738 | 0.545 | 0.460 | 0.211 | 0.403 | 0.162 | 0.911 | 0.830 | 0.863 | 0.744 | 0.848 | 0.720 | −0.868 | 0.753 | −0.705 | 0.497 | |
TIF CC | 0.725 | 0.525 | 0.426 | 0.182 | 0.455 | 0.207 | 0.892 | 0.796 | 0.837 | 0.700 | 0.818 | 0.670 | −0.854 | 0.730 | −0.689 | 0.474 | |
TIF LC | 0.733 | 0.538 | 0.462 | 0.214 | 0.415 | 0.173 | 0.904 | 0.817 | 0.851 | 0.724 | 0.838 | 0.702 | −0.865 | 0.748 | −0.696 | 0.485 | |
TIF LC CC | 0.717 | 0.515 | 0.429 | 0.184 | 0.469 | 0.220 | 0.888 | 0.788 | 0.827 | 0.683 | 0.809 | 0.655 | −0.854 | 0.729 | −0.680 | 0.462 | |
BA | JPG | 0.829 | 0.688 | 0.554 | 0.307 | 0.084 | 0.007 | 0.862 | 0.742 | 0.902 | 0.813 | 0.888 | 0.788 | −0.772 | 0.595 | −0.734 | 0.539 |
JPG CC | 0.836 | 0.698 | 0.553 | 0.305 | 0.095 | 0.009 | 0.865 | 0.748 | 0.902 | 0.814 | 0.886 | 0.786 | −0.776 | 0.602 | −0.736 | 0.542 | |
JPG LC | 0.826 | 0.682 | 0.552 | 0.305 | 0.088 | 0.008 | 0.863 | 0.744 | 0.904 | 0.817 | 0.890 | 0.792 | −0.772 | 0.596 | −0.736 | 0.542 | |
JPG LC CC | 0.834 | 0.696 | 0.555 | 0.308 | 0.096 | 0.009 | 0.865 | 0.749 | 0.901 | 0.812 | 0.886 | 0.786 | −0.778 | 0.605 | −0.735 | 0.541 | |
TIF | 0.827 | 0.683 | 0.555 | 0.308 | 0.084 | 0.007 | 0.862 | 0.744 | 0.902 | 0.813 | 0.888 | 0.789 | −0.773 | 0.597 | −0.734 | 0.539 | |
TIF CC | 0.836 | 0.698 | 0.552 | 0.305 | 0.097 | 0.009 | 0.866 | 0.749 | 0.902 | 0.813 | 0.887 | 0.786 | −0.777 | 0.604 | −0.736 | 0.542 | |
TIF LC | 0.774 | 0.599 | 0.554 | 0.306 | 0.045 | 0.002 | 0.798 | 0.636 | 0.821 | 0.674 | 0.816 | 0.666 | −0.724 | 0.525 | −0.665 | 0.442 | |
TIF LC CC | 0.814 | 0.663 | 0.567 | 0.321 | 0.068 | 0.005 | 0.830 | 0.689 | 0.854 | 0.729 | 0.844 | 0.712 | −0.753 | 0.566 | −0.695 | 0.483 | |
HSVi | JPG | 0.747 | 0.558 | 0.514 | 0.265 | 0.192 | 0.037 | 0.881 | 0.776 | 0.898 | 0.807 | 0.891 | 0.795 | −0.805 | 0.648 | −0.716 | 0.513 |
JPG CC | 0.779 | 0.607 | 0.565 | 0.320 | 0.085 | 0.007 | 0.876 | 0.767 | 0.902 | 0.813 | 0.899 | 0.808 | −0.795 | 0.632 | −0.722 | 0.521 | |
JPG LC | 0.758 | 0.574 | 0.543 | 0.295 | 0.135 | 0.018 | 0.882 | 0.778 | 0.898 | 0.807 | 0.895 | 0.801 | −0.807 | 0.652 | −0.714 | 0.510 | |
JPG LC CC | 0.773 | 0.597 | 0.571 | 0.327 | 0.064 | 0.004 | 0.869 | 0.755 | 0.893 | 0.798 | 0.892 | 0.796 | −0.790 | 0.624 | −0.714 | 0.509 | |
TIF | 0.742 | 0.551 | 0.514 | 0.264 | 0.190 | 0.036 | 0.881 | 0.776 | 0.898 | 0.806 | 0.891 | 0.795 | −0.805 | 0.648 | −0.715 | 0.511 | |
TIF CC | 0.780 | 0.608 | 0.564 | 0.318 | 0.094 | 0.009 | 0.878 | 0.770 | 0.902 | 0.814 | 0.899 | 0.808 | −0.798 | 0.637 | −0.723 | 0.523 | |
TIF LC | 0.682 | 0.465 | 0.512 | 0.262 | 0.103 | 0.011 | 0.754 | 0.568 | 0.758 | 0.575 | 0.759 | 0.575 | −0.697 | 0.485 | −0.602 | 0.363 | |
TIF LC CC | 0.755 | 0.570 | 0.574 | 0.330 | 0.041 | 0.002 | 0.819 | 0.671 | 0.839 | 0.704 | 0.840 | 0.705 | −0.748 | 0.559 | −0.671 | 0.451 |
Bold font identifies the file format and correction with the highest correlation between each image calculation and its reference measurement. The number of observations is noted in parentheses beside the reference measurement in the top row. The file format and corrections applied are listed in the left column for each image metric.
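For readers reproducing the table, each r and R2 pair can be computed as in the minimal sketch below; the data file and column names are hypothetical, and scipy's pearsonr stands in for whatever routine was used in the original analysis.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical long-format file: one row per lysimeter image, pairing an
# image metric (e.g., %G from TIF images) with a reference measurement.
df = pd.read_csv("image_metrics.csv")

r, p = pearsonr(df["pct_G_TIF"], df["NDVI"])   # Pearson's r and its p-value
print(f"r = {r:.3f}, R2 = {r**2:.3f}, p = {p:.3g}")
```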
Medians and significance of differences for file format and correction effects.
 | %Y | %G | DGCI | HSVi | BA | ALI | COMB2
---|---|---|---|---|---|---|---
JPG | 0.2652 | 0.8965 | 0.2960 | 0.4331 | −3.5899 | 0.5614 | 10.4521 |
JPG CC | 0.2485 NS | 0.9616 *** | 0.3642 *** | 0.5293 *** | −2.7051 *** | 0.5637 *** | −2.5148 *** |
JPG LC | 0.2653 NS | 0.9305 *** | 0.2787 * | 0.4393 NS | −3.6054 NS | 0.5656 *** | 16.5374 *** |
JPG LC CC | 0.2494 NS | 0.9909 * | 0.3571 * | 0.5359 *** | −2.6843 *** | 0.5681 *** | −3.3534 *** |
TIF | 0.2722 NS | 0.9335 NS | 0.2917 NS | 0.4329 NS | −3.5832 NS | 0.5613 NS | 10.4991 NS |
TIF CC | 0.2568 NS | 0.9958 ** | 0.3667 *** | 0.5351 *** | −2.6672 *** | 0.3641 *** | −2.9138 *** |
TIF LC | 0.2788 NS | 0.9586 NS | 0.2872 NS | 0.4475 * | −3.4666 NS | 0.5676 *** | 12.7586 NS |
TIF LC CC | 0.2638 NS | 1.0121 *** | 0.3637 *** | 0.5503 *** | −2.5862 *** | 0.5676 *** | −3.5182 *** |
Kruskal-Wallis | NS | *** | *** | *** | *** | *** | *** |
Bold font indicates, for each metric variable, the file format and correction that correlated best with its associated reference measurement (see the correlation table above). Significance is indicated as NS, *, **, and *** for not significant, p ≤ 0.05, p ≤ 0.01, and p ≤ 0.001, respectively.
Partial eta-squared (ηp2) effect sizes and significance levels of permutational multivariate ANOVA.
 | | | JPG_LC_CC | TIF | TIF_CC | TIF_CC | JPG_LC | TIF_LC_CC
---|---|---|---|---|---|---|---|---
 | VQ | NDVI | %Y | %G | DGCI | HSVi | BA | COMB2
Mowing (M) | 0.09 *** | 0.02 *** | 0.05 *** | 0.04 *** | 0.05 *** | 0.00 * | 0.02 *** | 0.20 *** |
Irrigation (I) | 0.21 *** | 0.06 *** | 0.15 *** | 0.11 *** | 0.05 *** | 0.06 *** | 0.07 *** | 0.05 *** |
Date (D) | 0.41 *** | 0.66 *** | 0.52 *** | 0.57 *** | 0.71 *** | 0.72 *** | 0.69 *** | 0.15 *** |
M × I | 0.02 *** | 0.02 *** | 0.04 *** | 0.02 *** | 0.01 NS | 0.01 * | 0.01 ** | 0.06 *** |
M × D | 0.02 NS | 0.03 ** | 0.06 *** | 0.07 *** | 0.02 NS | 0.03 ** | 0.03 ** | 0.15 *** |
I × D | 0.08 *** | 0.03 *** | 0.07 *** | 0.04 *** | 0.01 NS | 0.01 NS | 0.02 * | 0.05 *** |
M × I × D | 0.04 NS | 0.06 *** | 0.03 * | 0.03 NS | 0.04 * | 0.05 ** | 0.04 * | 0.10 *** |
Results were generated using 100,000 permutation iterations for the experimental mowing height treatments (denoted Mowing or M), water supplied (denoted Irrigation or I), and date of measurement (denoted Date or D), together with their interactions, for VQ, NDVI, and the image metrics. The treatments included four mowing heights (2.5, 5.0, 7.5, and 10.0 cm), three water application levels (100%, 65%, and 30% replacement of actual evapotranspiration, ETa), and eight weekly data collections. The file format and image correction applied are noted above each variable name. ηp2 measures the proportion of variance accounted for by each treatment effect; values above 0.01, 0.06, and 0.14 indicate small, medium, and large effect sizes, respectively. Significant differences are indicated as NS, *, **, and *** for not significant, p ≤ 0.05, p ≤ 0.01, and p ≤ 0.001, respectively.
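The ηp2 column can be illustrated with a classical ANOVA analogue in Python; the original analysis used permutation tests via the permuco R package [109,111], but partial eta-squared is derived the same way from sums of squares, ηp2 = SS_effect / (SS_effect + SS_error). The data file, factor, and column names below are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical long-format data: one row per lysimeter and date, with
# factors M (mowing), I (irrigation), D (date) and a metric column.
df = pd.read_csv("metrics_long.csv")

model = ols("COMB2 ~ C(M) * C(I) * C(D)", data=df).fit()
aov = sm.stats.anova_lm(model, typ=2)          # Type II sums of squares

# Partial eta-squared: SS_effect / (SS_effect + SS_residual).
ss_resid = aov.loc["Residual", "sum_sq"]
aov["eta_p2"] = aov["sum_sq"] / (aov["sum_sq"] + ss_resid)
print(aov[["sum_sq", "PR(>F)", "eta_p2"]])
```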
Coefficient of variation (CV) for replicate images across file format and image corrections.
 | %Y | %G | DGCI | HSVi | BA | COMB2
---|---|---|---|---|---|---
JPG | 0.0154 | 0.0084 | 0.0033 | 0.0032 | −0.0059 | 0.0131 |
JPG CC | 0.0131 NS | 0.0058 * | 0.0039 NS | 0.0042 * | −0.0066 NS | 0.0206 NS |
JPG LC | 0.0162 NS | 0.0085 NS | 0.0060 *** | 0.0042 NS | −0.0063 NS | 0.0184 NS |
JPG LC CC | 0.0141 NS | 0.0059 * | 0.0057 *** | 0.0051 *** | −0.0070 NS | 0.0326 * |
TIF | 0.0162 NS | 0.0081 NS | 0.0033 NS | 0.0031 NS | −0.0059 NS | 0.0127 NS |
TIF CC | 0.0133 NS | 0.0053 ** | 0.0060 NS | 0.0039 NS | −0.0065 NS | 0.0195 NS |
TIF LC | 0.0154 NS | 0.0080 NS | 0.0054 *** | 0.0036 NS | −0.0063 NS | 0.0143 NS |
TIF LC CC | 0.0129 NS | 0.0053 *** | 0.0063 *** | 0.0048 *** | −0.0070 NS | 0.0354 NS |
Average | 0.0236 NS | 0.0118 *** | 0.0057 *** | 0.0052 *** | −0.0088 NS | 0.0142 ** |
Bold values note the image correction that resulted in the highest correlation between each image-based calculation and its associated reference measurement (see the correlation table above). Significance is indicated as NS, *, **, and *** for not significant, p ≤ 0.05, p ≤ 0.01, and p ≤ 0.001, respectively.
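A minimal sketch of the CV computation across replicate images follows (file and column names are hypothetical): for each lysimeter on each date, CV = sd / mean, summarized here by the median across groups.

```python
import pandas as pd

# Hypothetical table: one row per replicate image, with identifiers for
# the lysimeter and collection date plus the calculated metrics.
df = pd.read_csv("replicate_images.csv")

metrics = ["pct_Y", "pct_G", "DGCI", "HSVi", "BA", "COMB2"]
cv = (df.groupby(["lysimeter", "date"])[metrics]
        .agg(lambda s: s.std(ddof=1) / s.mean())  # CV = sd / mean per group
        .median())                                # median CV per metric
print(cv)

# Note: CV is negative when a metric's mean is negative, as for BA above.
```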
Supplementary Materials
The following supporting information can be downloaded at:
References
1. Beard, J.B.; Green, R.L. The Role of Turfgrasses in Environmental Protection and Their Benefits to Humans. J. Environ. Qual.; 1994; 23, pp. 452-460. [DOI: https://dx.doi.org/10.2134/jeq1994.00472425002300030007x]
2. Niazi, P.; Alimyar, O.; Azizi, A.; Monib, A.W.; Ozturk, H. People-plant Interaction: Plant Impact on Humans and Environment. J. Environ. Agric. Stud.; 2023; 4, pp. 01-07. [DOI: https://dx.doi.org/10.32996/jeas.2023.4.2.1]
3. Rehman, I.; Hazhirkarzar, B.; Patel, B.C. Anatomy, Head and Neck, Eye. StatPearls; StatPearls Publishing: Treasure Island, FL, USA, 2024; Available online: http://www.ncbi.nlm.nih.gov/books/NBK482428 (accessed on 14 June 2024).
4. Delgado, S. Dziga Vertov’s ‘Man with a Movie Camera’ and the Phenomenology of Perception. Film Crit.; 2009; 34, pp. 1-16. Available online: https://www.jstor.org/stable/24777403 (accessed on 14 June 2024).
5. Wrathall, M. Skillful Coping: Essays on the Phenomenology of Everyday Perception and Action; Oxford University Press: Oxford, UK, 2014.
6. Ekdahl, D. Review of Daniel O’Shiel, the Phenomenology of Virtual Technology: Perception and Imagination in a Digital Age, Dublin: Bloomsbury Academic, 2022. Phenomenol. Cogn. Sci.; 2023; [DOI: https://dx.doi.org/10.1007/s11097-023-09925-y]
7. Sesario, R.; Satmoko, N.D.; Margery, E.; Octavia, Y.F.; Tarigan, M.I. The Comparison Analysis of Brand Association, Brand Awareness, Brand Loyalty and Perceived Quality of Two Top of Mind Camera Products. JEMSI; 2023; 9, pp. 388-392. [DOI: https://dx.doi.org/10.35870/jemsi.v9i2.1058]
8. Goma, S.; Aleksic, M.; Georgiev, T. Camera Technology at the dawn of digital renascence era. Proceedings of the 2010 Conference Record of the Forty Fourth Asilomar Conference on Signals, Systems and Computers; Pacific Grove, CA, USA, 7–10 November 2010; pp. 847-850. [DOI: https://dx.doi.org/10.1109/ACSSC.2010.5757686]
9. Esposito, M.; Crimaldi, M.; Cirillo, V.; Sarghini, F.; Maggio, A. Drone and sensor technology for sustainable weed management: A review. Chem. Biol. Technol. Agric.; 2021; 8, 18. [DOI: https://dx.doi.org/10.1186/s40538-021-00217-8]
10. Edwards, C.; Nilchiani, R.; Ganguly, A.; Vierlboeck, M. Evaluating the Tipping Point of a Complex System: The Case of Disruptive Technology. Syst. Eng.; 2022; [DOI: https://dx.doi.org/10.1002/sys.21782]
11. Yue, X.; Fossum, E.R. Simulation and design of a burst mode 20Mfps global shutter high conversion gain CMOS image sensor in a standard 180nm CMOS image sensor process using sequential transfer gates. Electron. Imaging; 2023; 35, pp. 328-1-328-5. [DOI: https://dx.doi.org/10.2352/EI.2023.35.6.ISS-328]
12. Riccardi, M.; Mele, G.; Pulvento, C.; Lavini, A.; d’Andria, R.; Jacobsen, S.-E. Non-destructive evaluation of chlorophyll content in quinoa and amaranth leaves by simple and multiple regression analysis of RGB image components. Photosynth. Res.; 2014; 120, pp. 263-272. [DOI: https://dx.doi.org/10.1007/s11120-014-9970-2]
13. Chang, Y.; Moan, S.L.; Bailey, D. RGB Imaging Based Estimation of Leaf Chlorophyll Content. Proceedings of the 2019 International Conference on Image and Vision Computing New Zealand (IVCNZ); Dunedin, New Zealand, 2–4 December 2019; IEEE: New York, NY, USA, 2019; pp. 1-6. [DOI: https://dx.doi.org/10.1109/IVCNZ48456.2019.8961030]
14. Zhang, H.; Ge, Y.; Xie, X.; Atefi, A.; Wijewardane, N.K.; Thapa, S. High throughput analysis of leaf chlorophyll content in sorghum using RGB, hyperspectral, and fluorescence imaging and sensor fusion. Plant Methods; 2022; 18, 60. [DOI: https://dx.doi.org/10.1186/s13007-022-00892-0]
15. Majer, P.; Sass, L.; Horváth, G.V.; Hideg, É. Leaf hue measurements offer a fast, high-throughput initial screening of photosynthesis in leaves. J. Plant Physiol.; 2010; 167, pp. 74-76. [DOI: https://dx.doi.org/10.1016/j.jplph.2009.06.015]
16. Taj-Eddin, I.A.T.F.; Afifi, M.; Korashy, M.; Ahmed, A.H.; Ng, Y.C.; Hernandez, E.; Abdel-Latif, S.M. Can we see photosynthesis? Magnifying the tiny color changes of plant green leaves using Eulerian video magnification. J. Electron. Imaging; 2017; 26, 060501. [DOI: https://dx.doi.org/10.1117/1.JEI.26.6.060501]
17. Vasilev, M.; Stoykova, V.; Veleva, P.; Zlatev, Z. Non-Destructive Determination of Plant Pigments Based on Mobile Phone Data. TEM J.; 2023; 12, pp. 1430-1442. [DOI: https://dx.doi.org/10.18421/TEM123-23]
18. Kandel, S.; Heer, J.; Plaisant, C.; Kennedy, J.; van Ham, F.; Riche, N.H.; Weaver, C.; Lee, B.; Brodbeck, D.; Buono, P. Research directions in data wrangling: Visualizations and transformations for usable and credible data. Inf. Vis.; 2011; 10, pp. 271-288. [DOI: https://dx.doi.org/10.1177/1473871611415994]
19. White, E.P.; Baldridge, E.; Brym, Z.T.; Locey, K.J.; McGlinn, D.J.; Supp, S.R. Nine simple ways to make it easier to (re)use your data. Ideas Ecol. Evol.; 2013; 6, Available online: https://ojs.library.queensu.ca/index.php/IEE/article/view/4608 (accessed on 14 June 2024). [DOI: https://dx.doi.org/10.4033/iee.2013.6b.6.f]
20. Goodman, A.; Pepe, A.; Blocker, A.W.; Borgman, C.L.; Cranmer, K.; Crosas, M.; Di Stefano, R.; Gil, Y.; Groth, P.; Hedstrom, M. et al. Ten Simple Rules for the Care and Feeding of Scientific Data. PLoS Comput. Biol.; 2014; 10, e1003542. [DOI: https://dx.doi.org/10.1371/journal.pcbi.1003542]
21. Wall, T.U.; McNie, E.; Garfin, G.M. Use-inspired science: Making science usable by and useful to decision makers. Front. Ecol. Environ.; 2017; 15, pp. 551-559. [DOI: https://dx.doi.org/10.1002/fee.1735]
22. Gašparovič, D.; Žarnovský, J.; Beloev, H.; Kangalov, P. Evaluation of the Quality of the Photographic Process at the Components Dimensions Measurement. Agric. For. Transp. Mach. Technol.; 2015; II, Available online: https://aftmt.uni-ruse.bg/images/vol.2.1/AFTMT_V_II-1-2015-3.pdf (accessed on 14 June 2024).
23. Liu, G.; Tian, S.; Mo, Y.; Chen, R.; Zhao, Q. On the Acquisition of High-Quality Digital Images and Extraction of Effective Color Information for Soil Water Content Testing. Sensors; 2022; 22, 3130. [DOI: https://dx.doi.org/10.3390/s22093130]
24. Chen, D.; Neumann, K.; Friedel, S.; Kilian, B.; Chen, M.; Altmann, T.; Klukas, C. Dissecting the Phenotypic Components of Crop Plant Growth and Drought Responses Based on High-Throughput Image Analysis. Plant Cell; 2015; 26, pp. 4636-4655. [DOI: https://dx.doi.org/10.1105/tpc.114.129601]
25. Honsdorf, N.; March, T.J.; Berger, B.; Tester, M.; Pillen, K. High-Throughput Phenotyping to Detect Drought Tolerance QTL in Wild Barley Introgression Lines. PLoS ONE; 2014; 9, e97047. [DOI: https://dx.doi.org/10.1371/journal.pone.0097047]
26. Wang, Y.; Wang, D.; Shi, P.; Omasa, K. Estimating rice chlorophyll content and leaf nitrogen concentration with a digital still color camera under natural light. Plant Methods; 2014; 10, 36. [DOI: https://dx.doi.org/10.1186/1746-4811-10-36]
27. Veley, K.M.; Berry, J.C.; Fentress, S.J.; Schachtman, D.P.; Baxter, I.; Bart, R. High-throughput profiling and analysis of plant responses over time to abiotic stress. Plant Direct; 2017; 1, e00023. [DOI: https://dx.doi.org/10.1002/pld3.23]
28. Liang, Z.; Pandey, P.; Stoerger, V.; Xu, Y.; Qiu, Y.; Ge, Y.; Schnable, J.C. Conventional and hyperspectral time-series imaging of maize lines widely used in field trials. GigaScience; 2018; 7, gix117. [DOI: https://dx.doi.org/10.1093/gigascience/gix117]
29. Horvath, B.; Vargas, J., Jr. Analysis of Dollar Spot Disease Severity Using Digital Image Analysis. Int. Turfgrass Soc. Res. J.; 2005; 10, pp. 196-201. Available online: https://www.researchgate.net/publication/268359230_ANALYSIS_OF_DOLLAR_SPOT_DISEASE_SEVERITY_USING_DIGITAL_IMAGE_ANALYSIS (accessed on 14 June 2024).
30. Horgan, F.G.; Jauregui, A.; Cruz, A.P.; Martínez, E.C.; Bernal, C.C. Changes in reflectance of rice seedlings during planthopper feeding as detected by digital camera: Potential applications for high-throughput phenotyping. PLoS ONE; 2020; 15, e0238173. [DOI: https://dx.doi.org/10.1371/journal.pone.0238173]
31. Adamsen, F.J.; Pinter, P.J.; Barnes, E.M.; LaMorte, R.L.; Wall, G.W.; Leavitt, S.W.; Kimball, B.A. Measuring Wheat Senescence with a Digital Camera. Crop Sci.; 1999; 39, pp. 719-724. [DOI: https://dx.doi.org/10.2135/cropsci1999.0011183X003900030019x]
32. Friell, J.; Watkins, E.; Horgan, B. Salt Tolerance of 74 Turfgrass Cultivars in Nutrient Solution Culture. Crop Sci.; 2013; 53, pp. 1743-1749. [DOI: https://dx.doi.org/10.2135/cropsci2012.08.0476]
33. Zhang, J.; Poudel, B.; Kenworthy, K.; Unruh, J.B.; Rowland, D.; Erickson, J.E.; Kruse, J. Drought responses of above-ground and below-ground characteristics in warm-season turfgrass. J. Agron. Crop Sci.; 2019; 205, pp. 1-12. [DOI: https://dx.doi.org/10.1111/jac.12301]
34. Lukina, E.V.; Stone, M.L.; Raun, W.R. Estimating vegetation coverage in wheat using digital images. J. Plant Nutr.; 1999; 22, pp. 341-350. [DOI: https://dx.doi.org/10.1080/01904169909365631]
35. Karcher, D.E.; Richardson, M.D. Batch Analysis of Digital Images to Evaluate Turfgrass Characteristics. Crop Sci.; 2005; 45, pp. 1536-1539. [DOI: https://dx.doi.org/10.2135/cropsci2004.0562]
36. Bremer, D.J.; Lee, H.; Su, K.; Keeley, S.J. Relationships between Normalized Difference Vegetation Index and Visual Quality in Cool-Season Turfgrass: II. Factors Affecting NDVI and its Component Reflectances. Crop Sci.; 2011; 51, pp. 2219-2227. [DOI: https://dx.doi.org/10.2135/cropsci2010.12.0729]
37. Bushman, B.S.; Waldron, B.L.; Robins, J.G.; Bhattarai, K.; Johnson, P.G. Summer Percent Green Cover among Kentucky Bluegrass Cultivars, Accessions, and Other Poa Species Managed under Deficit Irrigation. Crop Sci.; 2012; 52, pp. 400-407. [DOI: https://dx.doi.org/10.2135/cropsci2011.06.0342]
38. Patrignani, A.; Ochsner, T.E. Canopeo: A Powerful New Tool for Measuring Fractional Green Canopy Cover. Agron. J.; 2015; 107, pp. 2312-2320. [DOI: https://dx.doi.org/10.2134/agronj15.0150]
39. Chung, S.O.; Kabir, M.S.N.; Kim, Y.J. Variable Fertilizer Recommendation by Image-based Grass Growth Status. IFAC-PapersOnLine; 2018; 51, pp. 10-13. [DOI: https://dx.doi.org/10.1016/j.ifacol.2018.08.053]
40. Ball, K.R.; Power, S.A.; Brien, C.; Woodin, S.; Jewell, N.; Berger, B.; Pendall, E. High-throughput, image-based phenotyping reveals nutrient-dependent growth facilitation in a grass-legume mixture. PLoS ONE; 2020; 15, e0239673. [DOI: https://dx.doi.org/10.1371/journal.pone.0239673]
41. Wright, H.C.; Lawrence, F.A.; Ryan, A.J.; Cameron, D.D. Free and open-source software for object detection, size, and colour determination for use in plant phenotyping. Plant Methods; 2023; 19, 126. [DOI: https://dx.doi.org/10.1186/s13007-023-01103-0]
42. Cobb, J.N.; DeClerck, G.; Greenberg, A.; Clark, R.; McCouch, S. Next-generation phenotyping: Requirements and strategies for enhancing our understanding of genotype–phenotype relationships and its relevance to crop improvement. Theor. Appl. Genet.; 2013; 126, pp. 867-887. [DOI: https://dx.doi.org/10.1007/s00122-013-2066-0]
43. Gouveia, B.T.; Rios, E.F.; Nunes, J.A.R.; Gezan, S.A.; Munoz, P.R.; Kenworthy, K.E.; Unruh, J.B.; Miller, G.L.; Milla-Lewis, S.R.; Schwartz, B.M. et al. Multispecies genotype × environment interaction for turfgrass quality in five turfgrass breeding programs in the southeastern United States. Crop Sci.; 2021; 61, pp. 3080-3096. [DOI: https://dx.doi.org/10.1002/csc2.20421]
44. McCabe, M.F.; Tester, M. Digital insights: Bridging the phenotype-to-genotype divide. J. Exp. Bot.; 2021; 72, pp. 2807-2810. [DOI: https://dx.doi.org/10.1093/jxb/erab108]
45. Danilevicz, M.F.; Gill, M.; Anderson, R.; Batley, J.; Bennamoun, M.; Bayer, P.E.; Edwards, D. Plant Genotype to Phenotype Prediction Using Machine Learning. Front. Genet.; 2022; 13, 822173. [DOI: https://dx.doi.org/10.3389/fgene.2022.822173]
46. Tsaftaris, S.A.; Minervini, M.; Scharr, H. Machine Learning for Plant Phenotyping Needs Image Processing. Trends Plant Sci.; 2016; 21, pp. 989-991. [DOI: https://dx.doi.org/10.1016/j.tplants.2016.10.002]
47. Lee, U.; Chang, S.; Putra, G.A.; Kim, H.; Kim, D.H. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis. PLoS ONE; 2018; 13, e0196615. [DOI: https://dx.doi.org/10.1371/journal.pone.0196615]
48. Koh, J.C.O.; Spangenberg, G.; Kant, S. Automated Machine Learning for High-Throughput Image-Based Plant Phenotyping. Remote Sens.; 2021; 13, 858. [DOI: https://dx.doi.org/10.3390/rs13050858]
49. Li, Z.; Guo, R.; Li, M.; Chen, Y.; Li, G. A review of computer vision technologies for plant phenotyping. Comput. Electron. Agric.; 2020; 176, 105672. [DOI: https://dx.doi.org/10.1016/j.compag.2020.105672]
50. Smith, D.T.; Potgieter, A.B.; Chapman, S.C. Scaling up high-throughput phenotyping for abiotic stress selection in the field. Theor. Appl. Genet.; 2021; 134, pp. 1845-1866. [DOI: https://dx.doi.org/10.1007/s00122-021-03864-5]
51. Li, L.; Zhang, Q.; Huang, D. A Review of Imaging Techniques for Plant Phenotyping. Sensors; 2014; 14, pp. 20078-20111. [DOI: https://dx.doi.org/10.3390/s141120078]
52. Rousseau, D.; Dee, H.; Pridmore, T. Imaging Methods for Phenotyping of Plant Traits. Phenomics in Crop Plants: Trends, Options and Limitations; Kumar, J.; Pratap, A.; Kumar, S. Springer: New Delhi, India, 2015; pp. 61-74. [DOI: https://dx.doi.org/10.1007/978-81-322-2226-2_5]
53. Tariq, M.; Ahmed, M.; Iqbal, P.; Fatima, Z.; Ahmad, S. Crop Phenotyping. Systems Modeling; Ahmed, M. Springer: Singapore, 2020; pp. 45-60. [DOI: https://dx.doi.org/10.1007/978-981-15-4728-7_2]
54. Cardellicchio, A.; Solimani, F.; Dimauro, G.; Petrozza, A.; Summerer, S.; Cellini, F.; Renò, V. Detection of tomato plant phenotyping traits using YOLOv5-based single stage detectors. Comput. Electron. Agric.; 2023; 207, 107757. [DOI: https://dx.doi.org/10.1016/j.compag.2023.107757]
55. Harandi, N.; Vandenberghe, B.; Vankerschaver, J.; Depuydt, S.; Van Messem, A. How to make sense of 3D representations for plant phenotyping: A compendium of processing and analysis techniques. Plant Methods; 2023; 19, 60. [DOI: https://dx.doi.org/10.1186/s13007-023-01031-z]
56. Zhang, Y.; Zhang, N. Imaging technologies for plant high-throughput phenotyping: A review. Front. Agr. Sci. Eng.; 2018; 5, pp. 406-419. [DOI: https://dx.doi.org/10.15302/J-FASE-2018242]
57. Choudhury, S.D.; Samal, A.; Awada, T. Leveraging Image Analysis for High-Throughput Plant Phenotyping. Front. Plant Sci.; 2019; 10, 508. [DOI: https://dx.doi.org/10.3389/fpls.2019.00508]
58. Rani, K. Image Analysis Techniques on Phenotype for Plant System. Int. J. Eng. Adv. Technol.; 2019; 9, pp. 565-568. [DOI: https://dx.doi.org/10.35940/ijeat.A1125.1291S419]
59. Omari, M.K.; Lee, J.; Faqeerzada, M.A.; Joshi, R.; Park, E.; Cho, B.-K. Digital image-based plant phenotyping: A review. Korean J. Agric. Sci.; 2020; 47, pp. 119-130. Available online: https://www.researchgate.net/publication/340032840_Digital_image-based_plant_phenotyping_a_review (accessed on 14 June 2024). [DOI: https://dx.doi.org/10.7744/kjoas.2020004]
60. Shantz, H.L. The Place of Grasslands in the Earth’s Cover. Ecology; 1954; 35, pp. 143-145. [DOI: https://dx.doi.org/10.2307/1931110]
61. Jacobs, B.F.; Kingston, J.D.; Jacobs, L.L. The Origin of Grass-Dominated Ecosystems. Ann. Mo. Bot. Gard.; 1999; 86, pp. 590-643. [DOI: https://dx.doi.org/10.2307/2666186]
62. Strömberg, C.A.E. Evolution of Grasses and Grassland Ecosystems. Annu. Rev. Earth Planet. Sci.; 2011; 39, pp. 517-544. [DOI: https://dx.doi.org/10.1146/annurev-earth-040809-152402]
63. Chawla, S.L.; Agnihotri, M.A.R.; Sudha, P.; Shah, H.P. Turfgrass: A Billion Dollar Industry. In National Conference on Floriculture for Rural and Urban Prosperity in the Scenario of Climate Change. 2018; Available online: https://www.researchgate.net/publication/324483293_Turfgrass_A_Billion_Dollar_Industry (accessed on 14 June 2024).
64. Wu, J.; Bauer, M.E. Estimating Net Primary Production of Turfgrass in an Urban-Suburban Landscape with QuickBird Imagery. Remote Sens.; 2012; 4, pp. 849-866. [DOI: https://dx.doi.org/10.3390/rs4040849]
65. Milesi, C.; Running, S.W.; Elvidge, C.D.; Dietz, J.B.; Tuttle, B.T.; Nemani, R.R. Mapping and Modeling the Biogeochemical Cycling of Turf Grasses in the United States. Environ. Manag.; 2005; 36, pp. 426-438. [DOI: https://dx.doi.org/10.1007/s00267-004-0316-2]
66. Blanco-Montero, C.A.; Bennett, T.B.; Neville, P.; Crawford, C.S.; Milne, B.T.; Ward, C.R. Potential environmental and economic impacts of turfgrass in Albuquerque, New Mexico (USA). Landsc. Ecol.; 1995; 10, pp. 121-128. [DOI: https://dx.doi.org/10.1007/BF00153829]
67. Krans, J.V.; Morris, K. Determining a Profile of Protocols and Standards used in the Visual Field Assessment of Turfgrasses: A Survey of National Turfgrass Evaluation Program-Sponsored University Scientists. Appl. Turfgrass Sci.; 2007; 4, pp. 1-6. [DOI: https://dx.doi.org/10.1094/ATS-2007-1130-01-TT]
68. Beard, J.B. Turfgrass: Science and Culture; Prentice Hall: Englewood Cliffs, NJ, USA, 1973; Available online: https://catalogue.nla.gov.au/catalog/2595129 (accessed on 14 June 2024).
69. Calera, A.; Martínez, C.; Melia, J. A procedure for obtaining green plant cover: Relation to NDVI in a case study for barley. Int. J. Remote Sens.; 2001; 22, pp. 3357-3362. [DOI: https://dx.doi.org/10.1080/01431160010020100]
70. Cabrera-Bosquet, L.; Molero, G.; Stellacci, A.; Bort, J.; Nogués, S.; Araus, J. NDVI as a potential tool for predicting biomass, plant nitrogen content and growth in wheat genotypes subjected to different water and nitrogen conditions. Cereal Res. Commun.; 2011; 39, pp. 147-159. [DOI: https://dx.doi.org/10.1556/CRC.39.2011.1.15]
71. Rorie, R.L.; Purcell, L.C.; Mozaffari, M.; Karcher, D.E.; King, C.A.; Marsh, M.C.; Longer, D.E. Association of ‘Greenness’ in Corn with Yield and Leaf Nitrogen Concentration. Agron. J.; 2011; 103, pp. 529-535. [DOI: https://dx.doi.org/10.2134/agronj2010.0296]
72. Ozyavuz, M.; Bilgili, B.C.; Salici, S. Determination of Vegetation Changes with NDVI Method. J. Environ. Prot. Ecol.; 2015; 16, pp. 264-273. Available online: https://www.researchgate.net/publication/284981527_Determination_of_vegetation_changes_with_NDVI_method (accessed on 14 June 2024).
73. Huang, S.; Tang, L.; Hupy, J.P.; Wang, Y.; Shao, G. A commentary review on the use of normalized difference vegetation index (NDVI) in the era of popular remote sensing. J. For. Res.; 2021; 32, pp. 1-6. [DOI: https://dx.doi.org/10.1007/s11676-020-01155-1]
74. Xu, Y.; Yang, Y.; Chen, X.; Liu, Y. Bibliometric Analysis of Global NDVI Research Trends from 1985 to 2021. Remote Sens.; 2022; 14, 3967. [DOI: https://dx.doi.org/10.3390/rs14163967]
75. Carlson, T.N.; Ripley, D.A. On the relation between NDVI, fractional vegetation cover, and leaf area index. Remote Sens. Environ.; 1997; 62, pp. 241-252. [DOI: https://dx.doi.org/10.1016/S0034-4257(97)00104-1]
76. Tenreiro, T.R.; García-Vila, M.; Gómez, J.A.; Jiménez-Berni, J.A.; Fereres, E. Using NDVI for the assessment of canopy cover in agricultural crops within modelling research. Comput. Electron. Agric.; 2021; 182, 106038. [DOI: https://dx.doi.org/10.1016/j.compag.2021.106038]
77. Lykhovyd, P.V.; Vozhehova, R.A.; Lavrenko, S.O.; Lavrenko, N.M. The Study on the Relationship between Normalized Difference Vegetation Index and Fractional Green Canopy Cover in Five Selected Crops. Sci. World J.; 2022; 2022, pp. 1-6. [DOI: https://dx.doi.org/10.1155/2022/8479424]
78. Pagola, M.; Ortiz, R.; Irigoyen, I.; Bustince, H.; Barrenechea, E.; Aparicio-Tejo, P.; Lamsfus, C.; Lasa, B. New method to assess barley nitrogen nutrition status based on image colour analysis. Comput. Electron. Agric.; 2009; 65, pp. 213-218. [DOI: https://dx.doi.org/10.1016/j.compag.2008.10.003]
79. Straw, C.M.; Henry, G.M. Spatiotemporal variation of site-specific management units on natural turfgrass sports fields during dry down. Precis. Agric.; 2018; 19, pp. 395-420. [DOI: https://dx.doi.org/10.1007/s11119-017-9526-5]
80. Hejl, R.; Straw, C.; Wherley, B.; Bowling, R.; McInnes, K. Factors leading to spatiotemporal variability of soil moisture and turfgrass quality within sand-capped golf course fairways. Precis. Agric.; 2022; 23, pp. 1908-1917. [DOI: https://dx.doi.org/10.1007/s11119-022-09912-4]
81. Kawashima, S. An Algorithm for Estimating Chlorophyll Content in Leaves Using a Video Camera. Ann. Bot.; 1998; 81, pp. 49-54. [DOI: https://dx.doi.org/10.1006/anbo.1997.0544]
82. Hunt, E.R.; Doraiswamy, P.C.; McMurtrey, J.E.; Daughtry, C.S.T.; Perry, E.M.; Akhmedov, B. A visible band index for remote sensing leaf chlorophyll content at the canopy scale. Int. J. Appl. Earth Obs. Geoinf.; 2013; 21, pp. 103-112. [DOI: https://dx.doi.org/10.1016/j.jag.2012.07.020]
83. Schiavon, M.; Leinauer, B.; Sevastionova, E.; Serena, M.; Maier, B. Warm-season Turfgrass Quality, Spring Green-up, and Fall Color Retention under Drip Irrigation. Appl. Turfgrass Sci.; 2011; 8, pp. 1-9. [DOI: https://dx.doi.org/10.1094/ATS-2011-0422-01-RS]
84. Marín, J.; Yousfi, S.; Mauri, P.V.; Parra, L.; Lloret, J.; Masaguer, A. RGB Vegetation Indices, NDVI, and Biomass as Indicators to Evaluate C3 and C4 Turfgrass under Different Water Conditions. Sustainability; 2020; 12, 2160. [DOI: https://dx.doi.org/10.3390/su12062160]
85. Bell, G.E.; Martin, D.L.; Wiese, S.G.; Dobson, D.D.; Smith, M.W.; Stone, M.L.; Solie, J.B. Vehicle-Mounted Optical Sensing: An Objective Means for Evaluating Turf Quality. Crop Sci.; 2002; 42, pp. 197-201. [DOI: https://dx.doi.org/10.2135/cropsci2002.1970]
86. Bell, G.E.; Martin, D.L.; Koh, K.; Han, H.R. Comparison of Turfgrass Visual Quality Ratings with Ratings Determined Using a Handheld Optical Sensor. HortTechnology; 2009; 19, pp. 309-316. [DOI: https://dx.doi.org/10.21273/HORTTECH.19.2.309]
87. Karcher, D.E.; Richardson, M.D. Digital Image Analysis in Turfgrass Research. Turfgrass: Biology, Use, and Management; Stier, J.C.; Horgan, B.P.; Bonos, S.A. American Society of Agronomy, Crop Science Society of America, Soil Science Society of America: Madison, WI, USA, 2015; pp. 1133-1149. [DOI: https://dx.doi.org/10.2134/agronmonogr56.c29]
88. Barbosa, B.D.S.; Ferraz, G.A.S.; Gonçalves, L.M.; Marin, D.B.; Maciel, D.T.; Ferraz, P.F.P.; Rossi, G. RGB vegetation indices applied to grass monitoring: A qualitative analysis. Agron. Res.; 2019; 17, pp. 349-357. [DOI: https://dx.doi.org/10.15159/AR.19.119]
89. Whitman, B.; Iannone, B.V.; Kruse, J.K.; Unruh, J.B.; Dale, A.G. Cultivar blends: A strategy for creating more resilient warm season turfgrass lawns. Urban Ecosyst.; 2022; 25, pp. 797-810. [DOI: https://dx.doi.org/10.1007/s11252-021-01195-3]
90. Hahn, D.; Morales, A.; Velasco-Cruz, C.; Leinauer, B. Assessing Competitiveness of Fine Fescues (Festuca L. spp.) and Tall Fescue (Schedonorus arundinaceous (Schreb.) Dumort) Established with White Clover (Trifolium repens L., WC), Daisy (Bellis perennis L.) and Yarrow (Achillea millefolium L.). Agronomy; 2021; 11, 2226. [DOI: https://dx.doi.org/10.3390/agronomy11112226]
91. Schwartz, B.; Zhang, J.; Fox, J.; Peake, J. Turf Performance of Shaded ‘TifGrand’ and ‘TifSport’ Hybrid Bermudagrass as Affected by Mowing Height and Trinexapac-ethyl. HortTechnology; 2020; 30, pp. 391-397. [DOI: https://dx.doi.org/10.21273/HORTTECH04596-20]
92. Richardson, M.D.; Karcher, D.E.; Purcell, L.C. Quantifying Turfgrass Cover Using Digital Image Analysis. Crop Sci.; 2001; 41, pp. 1884-1888. [DOI: https://dx.doi.org/10.2135/cropsci2001.1884]
93. Karcher, D.E.; Richardson, M.D. Quantifying Turfgrass Color Using Digital Image Analysis. Crop Sci.; 2003; 43, pp. 943-951. [DOI: https://dx.doi.org/10.2135/cropsci2003.9430]
94. Zhang, C.; Pinnix, G.D.; Zhang, Z.; Miller, G.L.; Rufty, T.W. Evaluation of Key Methodology for Digital Image Analysis of Turfgrass Color Using Open-Source Software. Crop Sci.; 2017; 57, pp. 550-558. [DOI: https://dx.doi.org/10.2135/cropsci2016.04.0285]
95. Kim, Y.; Barnaby, J.Y.; Warnke, S.E. Development of a low-cost automated greenhouse imaging system with machine learning-based processing for evaluating genetic performance of drought tolerance in a bentgrass hybrid population. Comput. Electron. Agric.; 2024; 224, 108896. [DOI: https://dx.doi.org/10.1016/j.compag.2024.108896]
96. Hu, J.B.; Dai, M.X.; Peng, S.T. An automated (novel) algorithm for estimating green vegetation cover fraction from digital image: UIP-MGMEP. Environ. Monit. Assess.; 2018; 190, 687. [DOI: https://dx.doi.org/10.1007/s10661-018-7075-7]
97. Walter, A.; Studer, B.; Kölliker, R. Advanced phenotyping offers opportunities for improved breeding of forage and turf species. Ann. Bot.; 2012; 110, pp. 1271-1279. [DOI: https://dx.doi.org/10.1093/aob/mcs026]
98. Xu, P.; Wang, N.; Zheng, X.; Qiu, G.; Luo, B. A New Turfgrass Coverage Evaluation Method Based on Two-Stage k-means Color Classifier. Proceedings of the 2019 American Society of Agricultural and Biological Engineers; Boston, MA, USA, 7–10 July 2019; [DOI: https://dx.doi.org/10.13031/aim.201901754]
99. Xie, S.; Hu, C.; Bagavathiannan, M.; Song, D. Toward Robotic Weed Control: Detection of Nutsedge Weed in Bermudagrass Turf Using Inaccurate and Insufficient Training Data. IEEE Robot. Autom. Lett.; 2021; 6, pp. 7365-7372. [DOI: https://dx.doi.org/10.1109/LRA.2021.3098012]
100. Ortiz, J.B.; Hirsch, C.N.; Ehlke, N.J.; Watkins, E. SpykProps: An imaging pipeline to quantify architecture in unilateral grass inflorescences. Plant Methods; 2023; 19, 125. [DOI: https://dx.doi.org/10.1186/s13007-023-01104-z]
101. Bremer, D.J.; Lee, H.; Su, K.; Keeley, S.J. Relationships between Normalized Difference Vegetation Index and Visual Quality in Cool-Season Turfgrass: I. Variation among Species and Cultivars. Crop Sci.; 2011; 51, pp. 2212-2218. [DOI: https://dx.doi.org/10.2135/cropsci2010.12.0728]
102. Lee, H.; Bremer, D.J.; Su, K.; Keeley, S.J. Relationships between Normalized Difference Vegetation Index and Visual Quality in Turfgrasses: Effects of Mowing Height. Crop Sci.; 2011; 51, pp. 323-332. [DOI: https://dx.doi.org/10.2135/cropsci2010.05.0296]
103. Leinauer, B.; VanLeeuwen, D.M.; Serena, M.; Schiavon, M.; Sevostianova, E. Digital Image Analysis and Spectral Reflectance to Determine Turfgrass Quality. Agron. J.; 2014; 106, pp. 1787-1794. [DOI: https://dx.doi.org/10.2134/agronj14.0088]
104. Haghverdi, A.; Sapkota, A.; Singh, A.; Ghodsi, S.; Reiter, M. Developing Turfgrass Water Response Function and Assessing Visual Quality, Soil Moisture and NDVI Dynamics of Tall Fescue Under Varying Irrigation Scenarios in Inland Southern California. J. ASABE; 2023; 66, pp. 1497-1512. [DOI: https://dx.doi.org/10.13031/ja.15687]
105. Hejl, R.W.; Conley, M.M.; Serba, D.D.; Williams, C.F. Mowing Height Effects on ‘TifTuf’ Bermudagrass during Deficit Irrigation. Agronomy; 2024; 14, 628. [DOI: https://dx.doi.org/10.3390/agronomy14030628]
106. Boiarskii, B. Comparison of NDVI and NDRE Indices to Detect Differences in Vegetation and Chlorophyll Content. J. Mech. Contin. Math. Sci.; 2019; 4, pp. 20-29. [DOI: https://dx.doi.org/10.26782/jmcms.spl.4/2019.11.00003]
107. Pallottino, F.; Antonucci, F.; Costa, C.; Bisaglia, C.; Figorilli, S.; Menesatti, P. Optoelectronic proximal sensing vehicle-mounted technologies in precision agriculture: A review. Comput. Electron. Agric.; 2019; 162, pp. 859-873. [DOI: https://dx.doi.org/10.1016/j.compag.2019.05.034]
108. Kurbanov, R.K.; Zakharova, N.I. Application of Vegetation Indexes to Assess the Condition of Crops. Agric. Mach. Technol.; 2020; 14, pp. 4-11. [DOI: https://dx.doi.org/10.22314/2073-7599-2020-14-4-4-11]
109. Frossard, J.; Renaud, O. Permutation Tests for Regression, ANOVA, and Comparison of Signals: The permuco Package. J. Stat. Soft.; 2021; 99, pp. 1-32. [DOI: https://dx.doi.org/10.18637/jss.v099.i15]
110. R: The R Project for Statistical Computing. Available online: https://www.r-project.org/ (accessed on 9 October 2024).
111. Frossard, J.; Renaud, O. Permuco: Permutation Tests for Regression, (Repeated Measures) ANOVA/ANCOVA and Comparison of Signals. 2024; Available online: https://cran.r-project.org/web/packages/permuco/index.html (accessed on 9 October 2024).
112. Van Rossum, G.; Drake, F.L. Python 3 Reference Manual; CreateSpace: Scotts Valley, CA, USA, 2009.
113. Hong, G.; Luo, M.R.; Rhodes, P.A. A study of digital camera colorimetric characterization based on polynomial modeling. Color Res. Appl.; 2001; 26, pp. 76-84. [DOI: https://dx.doi.org/10.1002/1520-6378(200102)26:1<76::AID-COL8>3.0.CO;2-3]
114. Woolf, M.S.; Dignan, L.M.; Scott, A.T.; Landers, J.P. Digital postprocessing and image segmentation for objective analysis of colorimetric reactions. Nat. Protoc.; 2021; 16, pp. 218-238. [DOI: https://dx.doi.org/10.1038/s41596-020-00413-0]
115. Anderson, M.; Motta, R.; Chandrasekar, S.; Stokes, M. Proposal for a Standard Default Color Space for the Internet—sRGB. Proceedings of the Fourth Color Imaging Conference, Society of Imaging Science and Technology; Scottsdale, AZ, USA, 19–22 November 1996; pp. 239-245. Available online: https://library.imaging.org/admin/apis/public/api/ist/website/downloadArticle/cic/4/1/art00061 (accessed on 14 June 2024).
116. Casadesús, J.; Villegas, D. Conventional digital cameras as a tool for assessing leaf area index and biomass for cereal breeding. J. Integr. Plant Biol.; 2014; 56, pp. 7-14. [DOI: https://dx.doi.org/10.1111/jipb.12117]
117. Karcher, D.; Purcell, C.; Hignight, K. Devices, Systems and Methods for Digital Image Analysis. U.S. Patent; 20220270206A1, 25 August 2022; Available online: https://patents.google.com/patent/US20220270206A1/en (accessed on 14 June 2024).
118. Rorie, R.L.; Purcell, L.C.; Karcher, D.E.; King, C.A. The Assessment of Leaf Nitrogen in Corn from Digital Images. Crop Sci.; 2011; 51, pp. 2174-2180. [DOI: https://dx.doi.org/10.2135/cropsci2010.12.0699]
119. Schiavon, M.; Leinauer, B.; Serena, M.; Sallenave, R.; Maier, B. Establishing Tall Fescue and Kentucky Bluegrass Using Subsurface Irrigation and Saline Water. Agron. J.; 2013; 105, pp. 183-190. [DOI: https://dx.doi.org/10.2134/agronj2012.0187]
120. Giolo, M.; Pornaro, C.; Onofri, A.; Macolino, S. Seeding Time Affects Bermudagrass Establishment in the Transition Zone Environment. Agronomy; 2020; 10, 1151. [DOI: https://dx.doi.org/10.3390/agronomy10081151]
121. Schiavon, M.; Pornaro, C.; Macolino, S. Tall Fescue (Schedonorus arundinaceus (Schreb.) Dumort.) Turfgrass Cultivars Performance under Reduced N Fertilization. Agronomy; 2021; 11, 193. [DOI: https://dx.doi.org/10.3390/agronomy11020193]
122. Zhao, B.; Zhang, Y.; Duan, A.; Liu, Z.; Xiao, J.; Liu, Z.; Qin, A.; Ning, D.; Li, S.; Ata-Ul-Karim, S.T. Estimating the Growth Indices and Nitrogen Status Based on Color Digital Image Analysis During Early Growth Period of Winter Wheat. Front. Plant Sci.; 2021; 12, 619522. [DOI: https://dx.doi.org/10.3389/fpls.2021.619522]
123. Augustinus, A.S.; McLoughlin, P.H.; Alvarenga, A.F.A.; Unruh, J.B.; Schiavon, M. Evaluation of Different Aerification Methods for Ultradwarf Hybrid Bermudagrass Putting Greens. HortTechnology; 2023; 33, pp. 333-341. [DOI: https://dx.doi.org/10.21273/HORTTECH05213-23]
124. Singh, S.; Yu, S.; Xiang, M.; Fontanier, C.H.; Wu, Y.; Martin, D.L.; Kajla, A. Genetic Variability of Traffic Tolerance and Surface Playability of Bermudagrass (Cynodon spp.) under Fall Simulated Traffic Stress. HortScience; 2024; 59, pp. 73-83. [DOI: https://dx.doi.org/10.21273/HORTSCI17488-23]
125. Glab, T.; Szewczyk, W.; Gondek, K. Response of Kentucky Bluegrass Turfgrass to Plant Growth Regulators. Agronomy; 2023; 13, 799. [DOI: https://dx.doi.org/10.3390/agronomy13030799]
126. Guijarro, M.; Pajares, G.; Riomoros, I.; Herrera, P.J.; Burgos-Artizzu, X.P.; Ribeiro, A. Automatic segmentation of relevant textures in agricultural images. Comput. Electron. Agric.; 2011; 75, pp. 75-83. [DOI: https://dx.doi.org/10.1016/j.compag.2010.09.013]
127. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color Indices for Weed Identification Under Various Soil, Residue, and Lighting Conditions. Trans. ASAE; 1995; 38, pp. 259-269. [DOI: https://dx.doi.org/10.13031/2013.27838]
128. Kataoka, T.; Kaneko, T.; Okamoto, H.; Hata, S. Crop growth estimation system using machine vision. Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003); Kobe, Japan, 20–24 July 2003; IEEE: New York, NY, USA, 2003; pp. b1079-b1083. [DOI: https://dx.doi.org/10.1109/AIM.2003.1225492]
129. Marchant, J.A.; Onyango, C.M. Shadow-invariant classification for scenes illuminated by daylight. J. Opt. Soc. Am. A JOSAA; 2000; 17, pp. 1952-1961. [DOI: https://dx.doi.org/10.1364/JOSAA.17.001952]
130. Hague, T.; Tillett, N.D.; Wheeler, H. Automated Crop and Weed Monitoring in Widely Spaced Cereals. Precis. Agric.; 2006; 7, pp. 21-32. [DOI: https://dx.doi.org/10.1007/s11119-005-6787-1]
131. Robertson, A.R. The CIE 1976 Color-Difference Formulae. Color Res. Appl.; 1977; 2, pp. 7-11. [DOI: https://dx.doi.org/10.1002/j.1520-6378.1977.tb00104.x]
132. Schwiegerling, J. Field Guide to Visual and Ophthalmic Optics FG04; SPIE: Bellingham, WA, USA, 2004; Available online: https://spie.org/publications/spie-publication-resources/optipedia-free-optics-information/fg04_p12_phoscoresponse?SSO=1 (accessed on 14 June 2024).
133. Koschan, A.; Abidi, M.A. Digital Color Image Processing; 1st ed. Wiley: Hoboken, NJ, USA, 2008; [DOI: https://dx.doi.org/10.1002/9780470230367]
134. Chopin, J.; Kumar, P.; Miklavcic, S.J. Land-based crop phenotyping by image analysis: Consistent canopy characterization from inconsistent field illumination. Plant Methods; 2018; 14, 39. [DOI: https://dx.doi.org/10.1186/s13007-018-0308-5]
135. Pape, J.M.; Klukas, C. Utilizing Machine Learning Approaches to Improve the Prediction of Leaf Counts and Individual Leaf Segmentation of Rosette Plant Images; Department of Molecular Genetics, Leibniz Institute of Plant Genetics and Crop Plant Research (IPK): Gatersleben, Germany, 2015; Available online: https://openimageanalysisgroup.github.io/MCCCS/publications/Pape_Klukas_LSC_2015.pdf (accessed on 14 June 2024).
136. Kefauver, S.C.; Vicente, R.; Vergara-Díaz, O.; Fernandez-Gallego, J.A.; Kerfal, S.; Lopez, A.; Melichar, J.P.E.; Serret Molins, M.D.; Araus, J.L. Comparative UAV and Field Phenotyping to Assess Yield and Nitrogen Use Efficiency in Hybrid and Conventional Barley. Front. Plant Sci.; 2017; 8, 1733. [DOI: https://dx.doi.org/10.3389/fpls.2017.01733]
137. Du, J.; Li, B.; Lu, X.; Yang, X.; Guo, X.; Zhao, C. Quantitative phenotyping and evaluation for lettuce leaves of multiple semantic components. Plant Methods; 2022; 18, 54. [DOI: https://dx.doi.org/10.1186/s13007-022-00890-2]
138. Xie, X.; Xia, F.; Wu, Y.; Liu, S.; Yan, K.; Xu, H.; Ji, Z. A Novel Feature Selection Strategy Based on Salp Swarm Algorithm for Plant Disease Detection. Plant Phenomics; 2023; 5, 0039. [DOI: https://dx.doi.org/10.34133/plantphenomics.0039]
139. Wahono, W.; Indradewa, D.; Sunarminto, B.H.; Haryono, E.; Prajitno, D. CIE L*a*b* Color Space Based Vegetation Indices Derived from Unmanned Aerial Vehicle Captured Images for Chlorophyll and Nitrogen Content Estimation of Tea (Camellia sinensis L. Kuntze) Leaves. Ilmu Pertan. Agric. Sci.; 2019; 4, pp. 46-51. [DOI: https://dx.doi.org/10.22146/ipas.40693]
140. De Casas, R.R.; Vargas, P.; Pérez-Corona, E.; Manrique, E.; García-Verdugo, C.; Balaguer, L. Sun and shade leaves of Olea europaea respond differently to plant size, light availability and genetic variation: Canopy plasticity in Olea europea. Funct. Ecol.; 2011; 25, pp. 802-812. [DOI: https://dx.doi.org/10.1111/j.1365-2435.2011.01851.x]
141. Rasmann, S.; Chassin, E.; Bilat, J.; Glauser, G.; Reymond, P. Trade-off between constitutive and inducible resistance against herbivores is only partially explained by gene expression and glucosinolate production. J. Exp. Bot.; 2015; 66, pp. 2527-2534. [DOI: https://dx.doi.org/10.1093/jxb/erv033]
142. Berry, J.C.; Qi, M.; Sonawane, B.V.; Sheflin, A.; Cousins, A.; Prenni, J.; Schachtman, D.P.; Liu, P.; Bart, R.S.; Biology, M. et al. Increased signal-to-noise ratios within experimental field trials by regressing spatially distributed soil properties as principal components. eLife; 2022; 11, e70056. [DOI: https://dx.doi.org/10.7554/eLife.70056]
143. Fitz, E.; Choi, C.Y. Monitoring Turfgrass Quality Using Multispectral Radiometry. Trans. ASAE; 2002; 45, pp. 865-871. [DOI: https://dx.doi.org/10.13031/2013.8839]
144. Aynalem, H.M.; Righetti, T.L.; Reed, B.M. Non-destructive evaluation of in vitro-stored plants: A comparison of visual and image analysis. Vitr. Cell. Dev. Biol. Plant; 2006; 42, pp. 562-567. [DOI: https://dx.doi.org/10.1079/IVP2006816]
145. Kaler, A.S.; Abdel-Haleem, H.; Fritschi, F.B.; Gillman, J.D.; Ray, J.D.; Smith, J.R.; Purcell, L.C. Genome-Wide Association Mapping of Dark Green Color Index using a Diverse Panel of Soybean Accessions. Sci. Rep.; 2020; 10, 5166. [DOI: https://dx.doi.org/10.1038/s41598-020-62034-7]
146. Casadesús, J.; Kaya, Y.; Bort, J.; Nachit, M.M.; Araus, J.L.; Amor, S.; Ferrazzano, G.; Maalouf, F.; Maccaferri, M.; Martos, V. et al. Using vegetation indices derived from conventional digital cameras as selection criteria for wheat breeding in water-limited environments. Ann. Appl. Biol.; 2007; 150, pp. 227-236. [DOI: https://dx.doi.org/10.1111/j.1744-7348.2007.00116.x]
147. Houser, K.W.; Wei, M.; David, A.; Krames, M.R.; Shen, X.S. Review of measures for light-source color rendition and considerations for a two-measure system for characterizing color rendition. Opt. Express; 2013; 21, 10393. [DOI: https://dx.doi.org/10.1364/OE.21.010393]
148. Wu, W.L.; Fang, M.-H.; Zhou, W.; Lesniewski, T.; Mahlik, S.; Grinberg, M.; Brik, M.G.; Sheu, H.-S.; Cheng, B.-M.; Wang, J. et al. High Color Rendering Index of Rb2GeF6:Mn4+ for Light-Emitting Diodes. Chem. Mater.; 2017; 29, pp. 935-939. [DOI: https://dx.doi.org/10.1021/acs.chemmater.6b05244]
149. Schewe, J. The Digital Negative: Raw Image Processing in Lightroom, Camera Raw, and Photoshop; Peachpit Pr: Hoboken, NJ, USA, 2012.
150. Lee, S.H.; Choi, J.S. Design and implementation of color correction system for images captured by digital camera. IEEE Trans. Consum. Electron.; 2008; 54, pp. 268-276. [DOI: https://dx.doi.org/10.1109/TCE.2008.4560085]
151. Finlayson, G.D.; Mackiewicz, M.; Hurlbert, A. Color Correction Using Root-Polynomial Regression. IEEE Trans. Image Process.; 2015; 24, pp. 1460-1470. [DOI: https://dx.doi.org/10.1109/TIP.2015.2405336]
152. Senthilkumaran, V. Color Correction Using Color Checkers. Proceedings of the the First International Conference on Combinatorial and Optimization, ICCAP 2021; Chennai, India, 7–8 December 2021; Available online: https://eudl.eu/doi/10.4108/eai.7-12-2021.2314537 (accessed on 14 June 2024).
153. Okkalides, D. Assessment of commercial compression algorithms, of the lossy DCT and lossless types, applied to diagnostic digital image files. Comput. Med. Imaging Graph.; 1998; 22, pp. 25-30. [DOI: https://dx.doi.org/10.1016/S0895-6111(98)00009-3]
154. Lebourgeois, V.; Bégué, A.; Labbé, S.; Mallavan, B.; Prévot, L.; Roux, B. Can Commercial Digital Cameras Be Used as Multispectral Sensors? A Crop Monitoring Test. Sensors; 2008; 8, pp. 7300-7322. [DOI: https://dx.doi.org/10.3390/s8117300]
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
Efficient and affordable plant phenotyping methods are an essential response to global climatic pressures. This study demonstrates the continued potential of consumer-grade photography to capture plant phenotypic traits in turfgrass and to derive new calculations, an area where the effects of image corrections on individual calculations often go unreported. Turfgrass lysimeters were photographed over 8 weeks using a custom lightbox and a consumer-grade camera. The resulting imagery was analyzed for area of cover, color metrics, and sensitivity to image corrections, and the findings were compared with active spectral reflectance data and previously reported measurements of visual quality, productivity, and water use. Results confirm that Red–Green–Blue imagery effectively measures plant treatment effects. Notable correlations were observed for corrected imagery, including yellow fractional area with human visual quality ratings (r = −0.89), dark green color index with clipping productivity (r = 0.61), and an index combination term with water use (r = −0.60). Green fractional area correlated with the Normalized Difference Vegetation Index (r = 0.91) and with its red reflectance spectra (r = −0.87). A new chromatic ratio correlated with the Normalized Difference Red-Edge index (r = 0.90) and with its red-edge reflectance spectra (r = −0.74), while another new calculation correlated most strongly with near-infrared reflectance (r = 0.90). Additionally, the combined index term significantly differentiated the treatment effects of date, mowing height, deficit irrigation, and their interactions (p < 0.001). Sensitivity and statistical analyses were conducted for typical image file formats and corrections, including JPEG, TIFF, geometric lens distortion correction, and color correction. The findings highlight the need for greater standardization of image corrections and for determining the biological relevance of the new image-based calculations.
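To make the image calculations concrete, the sketch below shows one plausible way to derive two of the metrics named above, green fractional area and dark green color index (DGCI), from a single RGB photograph. This is a minimal illustration under stated assumptions, not the study's pipeline: the abstract does not specify the segmentation rule or the new chromatic ratios, so the 60–180° hue window and the file name are illustrative assumptions; the DGCI formula follows the widely used definition of Karcher and Richardson.

```python
# Minimal sketch (not the authors' exact pipeline) of two image
# calculations named in the abstract: green fractional area via an
# HSV hue threshold, and dark green color index (DGCI) following
# Karcher and Richardson: DGCI = [(H - 60)/60 + (1 - S) + (1 - B)] / 3.
import numpy as np
from matplotlib import colors, image as mpimg

def green_fraction_and_dgci(rgb):
    """rgb: float array (H, W, 3) scaled to [0, 1]."""
    hsv = colors.rgb_to_hsv(rgb)                 # H, S, V each in [0, 1]
    hue_deg = hsv[..., 0] * 360.0
    green = (hue_deg >= 60) & (hue_deg <= 180)   # assumed 'green' hue window
    frac_green = green.mean()                    # green fractional area
    # DGCI is averaged over the segmented turf pixels only.
    h = hue_deg[green]
    s = hsv[..., 1][green]
    v = hsv[..., 2][green]
    dgci = ((h - 60.0) / 60.0 + (1.0 - s) + (1.0 - v)) / 3.0
    return float(frac_green), float(dgci.mean())

# Usage on one lysimeter photo (file name is a placeholder):
rgb = mpimg.imread("lysimeter_week1.png").astype(float)
if rgb.max() > 1.0:                              # 8-bit JPEG/TIFF input
    rgb = rgb / 255.0
fg, dgci = green_fraction_and_dgci(rgb[..., :3]) # drop alpha if present
print(f"green fraction = {fg:.3f}, mean DGCI = {dgci:.3f}")
```

Because both metrics are computed from hue, saturation, and brightness, they are sensitive to the file format and color corrections discussed above, which is why the same segmentation and formula should be applied consistently across corrected and uncorrected imagery when comparing treatments.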