1. Introduction
Plant growth and development are dynamic processes, and phenotypes result from the cumulative interaction of genetic and environmental factors throughout the plant life cycle [1]. Plant breeders, agronomists, and farmers evaluate crop phenotypic traits to characterize genotype/variety–environment interactions, estimate key developmental stages, and make crop selection and management decisions accordingly. Quantifying plant phenotypes across genotypes under diverse environmental conditions therefore improves the understanding of genotype × environment (G × E) interactions, a concept routinely applied to variety selection in standard crop breeding schemes [2,3].
Crop breeding is critical to enhancing crop performance under climate change scenarios and addressing the food security needs of a growing global population, which is expected to exceed nine billion by 2050 [4,5,6]. Crop breeding programs require precise methods to assess plant phenotypes associated with key performance traits, such as yield potential, disease resistance, and drought tolerance. Accurate measurements are also needed to ensure that high-performing varieties/cultivars are selected before distribution to farmers [7,8]. Nevertheless, traditional field phenotyping in crop breeding programs and genetics/genomics research can be laborious, time-consuming, and subjective, and it often lacks access to the environmental data needed for in-depth, integrated analysis of phenotypic data [9,10].
Remote sensing platforms with image-based sensors for field-based high-throughput phenotyping (HTP) are practical techniques for increasing phenotype evaluation efficiency through automated or semi-automated data acquisition and analysis [11,12,13]. Field-based HTP systems include phenotyping towers [14,15], field scanning platforms [16,17], autonomous ground-based systems or modified agricultural vehicles [18,19,20], aerial systems such as unmanned aerial vehicles (UAVs), blimps, and fixed-wing planes [21], and, more recently, low-orbiting satellites [22]. These platforms noticeably strengthen a crop breeding program’s phenotyping capabilities; however, they can be expensive and may not allow continuous crop monitoring with efficient utilization of resources (personnel, travel). Other common challenges include the operating skills required, inadequate analysis tools, and restricted data collection frequency [23,24]. Ideally, HTP tools for field conditions should be reasonably priced, versatile, and adaptable/expandable with third-party hardware and open-source software [24]. Such systems should closely monitor plants and collect sensor and weather data at high temporal resolution to support high-precision phenotypic analysis of dynamic crop responses to the environment.
Internet of Things (IoT)-based systems are becoming more available and affordable, offering automation and real-time decision-making capabilities in various fields, including agriculture and plant phenotyping [25,26,27]. Moreover, IoT systems operate on open-source software, providing a broad range of interfacing solutions and options to adapt and extend them to diverse experimental scenarios [28]. Raspberry Pi products (Raspberry Pi Foundation, Cambridge, UK) are among the most widely used low-cost single-board computers for building such customized sensing systems.
In general, the Raspberry Pi RGB camera has been used to extract plant features such as size, plant height, leaf count, and color [34,35], as well as to detect plant phenology [36] and disease [37,38]. Applications of a multispectral camera (such as a camera without an infrared filter, commonly referred to as a NoIR camera) remain sparse. The NoIR camera provides more information on plant health status, as healthy plants generally absorb more visible light and reflect more near-infrared radiation [39]. This allows the extraction of vegetation indices, such as the normalized difference vegetation index (NDVI), broadening the scope for extracting crop traits [40]. This spectral property also brings other benefits, such as easier segmentation of the plants of interest from the background (e.g., weeds, soil, etc.). Thus, integrating the RGB camera with the NoIR camera offers a broader scope for plant science research, as explored in this study.
The overall research goal of the presented work was to develop a Raspberry Pi-based sensor system for automated in-field phenotyping that acquires high-frequency data to support crop breeding programs. Two camera systems (RGB and NoIR) were integrated to allow the extraction of vegetation indices representing the vigor and health status of wheat. The imaging system was paired with a weather station for microclimate monitoring to evaluate the dynamics of crop growth and development in a wheat breeding trial. The low-cost sensor system was built from open-source hardware products such as the Raspberry Pi, and open-source software, such as Python 3 and OpenCV, was used to automatically control system operation and extract digital plant traits (vegetation indices).
2. System Development
2.1. System Hardware
The sensor system was built around the StereoPi (Virt2real Ltd., Moscow, Russia) [41], as shown in Figure 1. The StereoPi is a carrier board for the Raspberry Pi Compute Module 3+ Lite (Raspberry Pi Foundation, Cambridge, UK), which features a 64-bit quad-core Cortex-A53 (ARMv8) processor at 1.2 GHz and 1 GB RAM and served as the microprocessor controlling the sensor system in this study. The module allows developers to leverage the Raspberry Pi hardware and software stack in customized systems. It was connected to two version 2 Raspberry Pi cameras, RGB and NoIR (no infrared filter above the CMOS sensor), via the camera serial interface ports on the StereoPi board. Both cameras are based on the 8-megapixel Sony IMX219PQ CMOS image sensor (Sony Corporation, Tokyo, Japan) with an active pixel count of 3280 × 2464. The Witty Pi 3 board (UUGear, Prague, Czech Republic), connected via the StereoPi general-purpose input/output (GPIO) pins, provides a real-time clock with an accuracy of ±2 ppm. The system was powered by a 4000 mAh Voltaic V15 USB battery (Voltaic Systems, Brooklyn, NY, USA) operating in its “Always On” mode, which keeps the output enabled continuously while drawing only ~3 mA. Together, the battery and the Witty Pi 3 allowed the camera system to power ON and OFF automatically in custom duty cycles, controlling the data collection schedule and saving energy.
An enclosure for the camera was designed using the 3D Builder software (free software for the Windows operating system; Microsoft Corporation, Redmond, WA, USA) and fabricated on a 3D printer with acrylonitrile butadiene styrene plastic filament (Ultimaker, Geldermalsen, The Netherlands). The complete set-up (Figure 1b) consisted of four parts: main body, top, front, and back, assembled with M2 bolts.
2.2. Programming Code for System Operation
The sensor system ran the Linux-based operating system Raspbian Buster, and open-source Python 3 (Python Core Team 2015) was used as the primary programming language to command the board and capture images at user-defined time points. The picamera Python library was central to the time-lapse imaging algorithm used to capture images in this study. The program was first coded to automatically adjust white balance, image resolution, frame rate, exposure mode, and shutter speed to cope with variable in-field lighting conditions. It then triggered the camera with a 30 s delay between shots, capturing a total of 15 shots per time point. Through the StereoPi board, each shot captured two images (from the RGB and NoIR cameras) simultaneously, combined in a single frame. As the real-time clock of the Witty Pi 3 was synchronized with the sensor system, a timestamp for each captured image was recorded and saved as the file name on a secure digital (SD) card. A minimal sketch of such a capture routine is shown below.
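The following Python sketch illustrates this time-lapse routine under the settings described above. It is a minimal reconstruction, not the authors’ exact script; the storage path, the stereo-mode argument, and the left/right camera order in the combined frame are assumptions.

import time
from datetime import datetime
from picamera import PiCamera

CAPTURE_DIR = "/home/pi/images"  # assumed storage location on the SD card
NUM_SHOTS = 15                   # shots per time point
DELAY_S = 30                     # delay between shots, in seconds

# StereoPi: capture both cameras side by side in a single combined frame.
camera = PiCamera(stereo_mode='side-by-side', resolution=(2560, 1248))
camera.awb_mode = 'auto'         # adapt white balance to field lighting
camera.exposure_mode = 'auto'    # let the sensor set gain/shutter speed

time.sleep(2)                    # let the sensor gains settle before shooting

for _ in range(NUM_SHOTS):
    # Timestamp recorded as the file name, as described above.
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    camera.capture(f"{CAPTURE_DIR}/{stamp}.jpg")
    time.sleep(DELAY_S)

camera.close()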
The camera system was scheduled to power ON and OFF automatically through the Witty Pi 3 power management software. Similarly, the time-lapse imaging script was launched automatically at boot using the Linux crontab scheduling system. In this study, the camera captured images at four time points each day: 7.30, 10.30, 13.30, and 16.30 h. For each cycle, the system was set to power ON 15 min before running the script, to warm up, and to power OFF 15 min after the script finished, to ensure all operations were completed, as described in Figure 2. All critical settings, such as imaging parameters, the number of captured images, and the camera operation schedule, can be adjusted by the user based on their research objectives. A plausible schedule configuration is sketched below.
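For illustration, a Witty Pi 3 schedule script of the following form could implement this duty cycle. The durations and dates below are assumptions consistent with the description above (roughly a 40 min ON window per cycle: warm-up, 15 captures, and shutdown margin), not the authors’ exact configuration.

BEGIN 2020-07-07 07:30:00
END   2020-07-31 18:00:00
ON    M40         # 07:30 cycle: warm-up, 15 captures, shutdown margin
OFF   H2 M20      # off until the 10:30 cycle
ON    M40         # 10:30 cycle
OFF   H2 M20
ON    M40         # 13:30 cycle
OFF   H2 M20
ON    M40         # 16:30 cycle
OFF   H14 M20     # off overnight until 07:30 the next day

The matching crontab entry on the Raspberry Pi would be a single @reboot line, e.g., "@reboot python3 /home/pi/timelapse.py" (script path hypothetical), so that the capture script runs as soon as the Witty Pi powers the system on.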
2.3. Weather Station and Reference Panel
The ATMOS 41 weather station (Meter Inc., Pullman, WA, USA), which follows the ISO 22522 standard (ISO 22522:2007), and CR1000 dataloggers (Campbell Scientific, Logan, UT, USA) were used in the study to collect environmental data such as air temperature (resolution: 0.1 °C, accuracy: ±0.6 °C), solar radiation (resolution: 1 W/m2, accuracy: ±5%), and wind speed (resolution: 0.01 m/s, accuracy: ±3%). The weather station was powered from an external 12 VDC battery. The reference panel units were customized using tiles of three different colors: light brown, dark grey, and light grey. An SVC HR-1024i spectroradiometer equipped with the LC-RP PRO leaf clip probe (Spectra Vista, Poughkeepsie, NY, USA), covering a 350–2500 nm wavelength range, can be used to measure the reflectance of the panel units [42]. Reflectance data should be calculated for wavelengths between 400 and 900 nm, covering the spectral region of the camera wavebands.
2.4. Field Installation
Five sensor systems with five reference panels, along with the weather station, were installed in spring wheat field trials located near Pullman, WA (GPS: 46°41′49.9″ N, 117°08′59.1″ W) during the 2020 growing season. The field was planted in a randomized complete block design (RCBD) with 12 varieties and three replicates, and the plot size was 1.5 × 6.0 m, as shown in Figure 3a.
The sensor systems were installed facing west, with the cameras set at 1.7 m above ground level (AGL), 1.9 m away from the targeted plot boundary (Figure 4), and oriented 50–60° downward from the vertical plane to focus on the wheat plots from the heading to maturity growth stages. Each camera was adjusted to capture two targeted plots, to the left and right of the reference panels. The distance between the reference panel stand and the camera stand was ~4.2 m (near the center of the two targeted plots). Additionally, the weather station was installed at 1.8 m AGL at the center of the field. The cameras, reference panels, and weather station were mounted on aluminum pipes.
2.5. Digital Trait Extraction Programming Code
Based on the captured images, digital traits were extracted from each image using OpenCV, an open-source computer vision library. In this study, the extracted trait was the NDVI, calculated from the NoIR image bands as follows:
NDVI = (Red − Blue)/(Red + Blue) (1)
where NDVI = normalized difference vegetation index;
Blue = blue band of the image from the NoIR camera;
Red = red band of the image from the NoIR camera (capturing near-infrared).
Regarding the image processing steps, the code first separates the combined frame into the RGB and NoIR images; only the NoIR image was used to calculate the NDVI. With two camera systems, there is also the possibility of extracting several vegetation indices from both the RGB and NoIR camera bands, as well as plant height (via stereo vision). Before the NDVI calculation, the average pixel values of the red (near-infrared in the NoIR camera) and blue bands were standardized against the reflectance panel (radiometric image correction was applied if necessary). The NDVI image was then created following Equation (1), after which plot segmentation (two targeted plots) was performed; the regions of interest (ROI) in this study were fixed. Lastly, summary statistics of the NDVI data within the ROI of each image were extracted, as described in Figure 5. A sketch of this NDVI step is given below.
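The following OpenCV/NumPy sketch illustrates the band separation, panel normalization, and Equation (1). It is a hedged reconstruction: the left/right image order, panel coordinates, and normalization scheme are assumptions for illustration, not taken from the authors’ code.

import cv2
import numpy as np

# Load the combined side-by-side frame (2560 x 1248) and split the halves;
# the RGB-left/NoIR-right order is an assumption for illustration.
frame = cv2.imread("20200718_073000.jpg")
rgb_img, noir_img = frame[:, :1280], frame[:, 1280:]

# OpenCV stores channels as BGR: index 0 = blue, index 2 = red, which on
# the NoIR camera also carries the near-infrared signal.
blue = noir_img[:, :, 0].astype(np.float32)
nir = noir_img[:, :, 2].astype(np.float32)

# Simple radiometric normalization against the reference panel: scale each
# band by the panel's mean pixel value (panel coordinates are hypothetical).
panel = (slice(600, 650), slice(615, 665))
blue /= blue[panel].mean()
nir /= nir[panel].mean()

# Equation (1): NDVI from the near-infrared (red) and blue bands; the small
# epsilon guards against division by zero in dark pixels.
ndvi = (nir - blue) / (nir + blue + 1e-6)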
3. System Evaluation
3.1. Camera Operations
The affordable hardware and open-source software of the sensor system successfully generated multi-band images (RGB and NoIR) at high temporal resolution (four times a day over roughly one month of data collection). Sample raw data from the two cameras are shown in Figure 6. The resolution of a received image was 2560 × 1248 pixels; thus, each individual RGB or NoIR image was 1280 × 1248 pixels. Figure 6 also illustrates the consistency of the sensor’s operation and the camera’s performance in field research under different environmental conditions.
The sensor systems were installed in the field from 7 July to 31 July 2020; the first week of installation was used to test the systems and evaluate the factors affecting image quality, and the issues identified were resolved to establish a standard operating procedure for capturing images. The main challenge was powering the cameras in the field; after tuning the automated power ON–OFF settings and the operation of the time-lapse imaging script, the camera could operate for a maximum of five days without re-charging. Image storage is also a vital consideration: in the 2020 season, the data were stored on a 32 GB SD card, so the operator had to access the data manually and re-charge the system every five days. Therefore, solar-cell power generation and a cloud-based or local database system will be developed and integrated into the sensor system for the 2021 season.
3.2. Image Analysis
The image processing algorithm analyzed the data offline using the Python 3 programming language, mainly through the OpenCV library. The first step was to separate the RGB and NoIR images, using the same image resolution set in the time-lapse acquisition algorithm to obtain the individual images. The NoIR images were then imported, split into bands, and used to create the NDVI images following Equation (1). The plot segmentation process was semi-automated: the four corner points of the targeted plots were selected manually on the first image only, and the resulting point coordinates were applied to the remaining images from the same sensor. The statistical data described in Figure 5 were extracted from the targeted plot area using the NumPy library, as sketched below.
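A minimal sketch of this ROI summary step is shown here, assuming the ndvi array from the earlier snippet; the corner coordinates are hypothetical stand-ins for the manually selected points.

import cv2
import numpy as np

# Four manually picked plot corners (hypothetical values), reused for all
# images from the same sensor.
corners = np.array([[150, 400], [560, 390], [590, 900], [120, 920]],
                   dtype=np.int32)

# Rasterize the plot polygon into a binary mask matching the NDVI image.
mask = np.zeros(ndvi.shape, dtype=np.uint8)
cv2.fillPoly(mask, [corners], 255)

# Summary statistics over the NDVI pixels inside the ROI.
plot_vals = ndvi[mask == 255]
summary = {
    "mean": float(np.mean(plot_vals)),
    "median": float(np.median(plot_vals)),
    "std": float(np.std(plot_vals)),
    "min": float(np.min(plot_vals)),
    "max": float(np.max(plot_vals)),
}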
Figure 7 shows sample mean NDVI data from the image processing script, along with the average field trial temperature, at three time points per day over a period of about one month. Overall, the NDVI values declined gradually over the data collection period: the sensor systems were installed at the mid to late heading stages of wheat and data collection ceased at the ripening stage, so the NDVI curve tracks the wheat growth cycle through maturity and senescence.
Extracting digital traits from the images was challenging because environmental factors, such as wind direction and speed, shifted the sensor system’s position. This complicated plot segmentation and ROI definition; the process will therefore be further automated using reference regions/objects in the future. Variable light intensity and the quality of the camera sensor also affect the digital traits. Another consideration is that the image viewpoint in this study was not nadir, so the rectangular wheat plots appear distorted into trapezoids within the images. In the future, geometric transformations (e.g., skew correction methods) will be applied before image processing, also enabling better comparison with acquired UAV data; a sketch of such a correction follows.
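As an illustration of such a correction, the following OpenCV sketch maps a trapezoidal plot back to a rectangle with a perspective transform; the corner coordinates and output size are assumptions, not values from this study.

import cv2
import numpy as np

# Four corners of the distorted (trapezoidal) plot in the image and the
# rectangle they should map to; all coordinates here are hypothetical.
src = np.float32([[150, 400], [560, 390], [590, 900], [120, 920]])
dst = np.float32([[0, 0], [450, 0], [450, 520], [0, 520]])

# Homography for the keystone correction, then resample the NDVI image so
# the plot becomes rectangular before index extraction.
H = cv2.getPerspectiveTransform(src, dst)
rectified = cv2.warpPerspective(ndvi, H, (450, 520))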
The sensor system provides multiple images, and the image processing algorithms can be adapted to extract several traits. For instance, different vegetation indices representing canopy vigor and crop health (stress/disease) can be estimated, as can plant growth rate (plant height) and different stages of crop growth and development (emergence, heading, senescence). Integrating the environmental data with crop information (growth, development, disease, etc.) through the growing season can provide valuable data for genotype–environment studies and for understanding the genetic variability of growth dynamics, assisting in robust, accurate, and data-driven germplasm selection.
4. Discussion
Sensor applications for phenotyping (i.e., phenomics) continue to advance, primarily motivated by the need to boost genetic improvements in crops for resource optimization, yield gains, and overall quality. Sensor development provides opportunities for researchers to assess precise phenotypes at high temporal resolution, which can be associated with key performance and quality traits and assist in informed variety selection/development [43,44,45]. Moreover, high temporal resolution data allow scientists to characterize dynamic crop responses to the environment (genotype × environment interactions) when assessing genotype performance.
In this study, open-source hardware and software platforms were used to develop an automated sensor system capable of capturing phenotypic data. The study specifically explored the Raspberry Pi board as the main processing unit, integrated with a stereo camera board, the StereoPi, to combine two Raspberry Pi cameras (RGB and NoIR). Several Arduino and Raspberry Pi options exist beyond those used here; for example, the Raspberry Pi Compute Module 4 IO board can also combine multiple cameras. Similarly, the ATMOS 41 was selected as the main weather sensor based on its standardization and accuracy, enabling precise analysis of G × E interactions, and soil sensors can also be integrated with the data loggers to capture more information. Integration of sensors with units such as Microsoft’s FarmBeats system is another option [46]. The sensor system in this study can operate for up to five days without re-charging the battery; integration with a solar panel would extend this capacity and potentially make the data collection process sustainable. In its current form, the operating duration could also be extended by limiting the images per time point and the number of time points, thereby reducing battery usage.
The NDVI was extracted because it is associated with crop biomass and yield [47] and is a key trait in phenotyping applications within wheat breeding programs [48,49,50]. The NDVI can therefore serve as a reference index for the dynamic monitoring of wheat growth. Researchers have used multiple sensing platforms over the growing season to capture NDVI data, including satellites [51], UAVs [52], and proximal sensors [53]. In general, the NDVI value is low during the stem elongation/jointing stage and reaches its maximum around the booting/heading stage. The NDVI then starts to decrease slightly at the flowering/anthesis stage and continues to decline through the grain-filling and ripening stages. Nevertheless, NDVI estimates are influenced by many factors, such as measurement time, sensors, environmental conditions, and wheat varieties, among others [54]. In this study, a similar NDVI profile (Figure 7) was observed, with values decreasing from the mid to late heading stages (13 July 2020), through flowering about one week later (20 July 2020), and continuing to decline up to the grain-filling and maturity stages (27 July 2020). Furthermore, with two camera systems, more digital crop traits can be extracted for multivariate data analysis and for understanding genotype and environment interactions.
In terms of data management, an onboard data storage system was designed. The system carried two separate SD cards: one holding the Raspbian image and operating software controlling the sensor system, and another, in a universal serial bus (USB) card reader on the StereoPi board, storing the captured images. With the 32 GB USB card used in this study, the estimated data volume was about 300 MB/day (~5 MB/image × 15 images/time point × 4 time points/day); the card could therefore last approximately 100 days, covering all critical growth stages of spring wheat (see the estimate below). However, storing the data on site is risky due to potential losses (e.g., sensor errors), so a backup plan is needed; this could be resolved by incorporating internet capabilities. Another aspect of data management is data processing: rapid protocols are needed for automated data processing and analysis. This could be performed overnight through edge computing, with the image processing algorithm implemented onboard to quantify the NDVI value over the ROI; the values could then be transmitted to end-users, a local database, or a cloud system for further processing or sharing (web-based interface). Image quality is also an important determinant of data quality and can be affected by sensor specifications and environmental factors such as wind and rainfall.
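For reference, the storage arithmetic quoted above can be reproduced in a few lines of Python; the per-image size is the approximate figure stated in the text, not a measured constant.

# Back-of-the-envelope storage estimate using the figures quoted above.
mb_per_image = 5            # approximate size of one combined frame
images_per_timepoint = 15
timepoints_per_day = 4

daily_mb = mb_per_image * images_per_timepoint * timepoints_per_day
card_days = 32_000 / daily_mb   # 32 GB card, in days of capacity

print(daily_mb, "MB/day;", round(card_days), "days")   # 300 MB/day; ~107 days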
To incorporate internet capabilities, a Wi-Fi modem/router with a subscriber identity module (SIM) card slot will be installed in the field, allowing users to manage the sensor through virtual network computing (VNC) or secure shell (SSH) and to manipulate and organize the data remotely. Cloud computing can also be explored with such an implementation. Given the functionality of the sensor system, integration with the internet to enable IoT capabilities would allow not only dynamic sampling of crop growth and microclimate data but also real-time monitoring of crop growth.
5. Summary
The major capabilities of the sensor system are described below:
(1). Low-cost dual-camera sensor systems, assembled from widely available hardware and operating on open-source software, enable continuous crop monitoring, particularly the in-field crop evaluation that is essential for field phenotyping;
(2). The camera operation script and automated trait analysis script integrated into the sensor system are open-source and expandable, building on community-driven numeric and scientific libraries that are freely available and easily accessible.
Overall, the sensor system performed satisfactorily in producing high-frequency data in field-based phenotyping environments. The system gives researchers an affordable option for high-throughput and accurate phenotyping without on-site personnel, conserving resources (time for travel and data acquisition in remote locations). Likewise, by building large data sets, the system can improve the reproducibility of experiments and the reliability of in-field phenotyping data. It also uniquely provides dynamic sampling of crop growth and development alongside microclimate data. Our ongoing efforts will explore internet connectivity to equip the system with IoT capabilities and allow real-time monitoring and cloud computing.
Author Contributions
Conceptualization, W.S. and S.S.; methodology, W.S. and S.S.; software, W.S. and S.S.; validation, W.S., S.S., A.H.C. and M.O.P.; formal analysis, W.S.; investigation, W.S., V.J., S.S., A.H.C. and M.O.P.; resources, S.S., A.H.C. and M.O.P.; data curation, W.S.; writing—original draft preparation, W.S.; writing—review and editing, W.S., S.S., A.H.C. and M.O.P.; visualization, W.S.; supervision, S.S., A.H.C. and M.O.P.; project administration, S.S., A.H.C. and M.O.P.; funding acquisition, S.S., A.H.C. and M.O.P. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the Emerging Research Issues Internal Competitive Grant (20-04) from the College of Agricultural, Human, and Natural Resource Sciences Office of Research at Washington State University, and United States Department of Agriculture (USDA)—National Institute for Food and Agriculture (NIFA) Hatch Project (accession number 1014919).
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
The data presented in this study are available on request from the corresponding author.
Conflicts of Interest
The authors declare no conflict of interest.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Figures
Figure 1. Sensor system for field high-throughput phenotyping applications. (a) Individual components, and (b) the sensor system.
Figure 2. System operation protocol at each time point. The user can define the operational settings, time points, and the number of images.
Figure 3. Spring wheat field trials. (a) Installation map, and (b) sensor system on the field.
Figure 4. Sensor system and associated components. (a) Complete sensor set-up, (b) the weather station installed at 1.8 m AGL, (c) the sensor system installed at 1.7 m AGL, (d) the reference panel installed at 1.0 m AGL, and (e) the sensor system angle (50–60° downward) installed at a distance of 4.2 m between the sensor system and the reference panel and 1.9 m between the sensor system and targeted plot boundary.
Figure 6. Images from a sample sensor system on 18 July 2020, showing the RGB and NoIR camera images combined in one frame, captured at (a) 7.30, (b) 10.30, (c) 13.30, and (d) 16.30 h.
Figure 7. Box plot of average NDVI data during the data acquisition period at different time points with temperature data from two representative plots.
© 2021 by the authors.
Abstract
Sensor applications for plant phenotyping can advance and strengthen crop breeding programs. One powerful sensing option is an automated sensor system that can be customized and applied to plant science research. Such a system can provide high spatial and temporal resolution data to delineate crop interaction with weather changes in a diverse environment, and it can be integrated with the internet to enable Internet of Things (IoT)-based sensor system development for real-time crop monitoring and management. In this study, a Raspberry Pi-based sensor (imaging) system was fabricated and integrated with a microclimate sensor to evaluate crop growth in a spring wheat breeding trial for automated phenotyping applications. Such an in-field sensor system will increase the reproducibility of measurements and improve selection efficiency by investigating dynamic crop responses and identifying key growth stages (e.g., heading), assisting in the development of high-performing crop varieties. In the low-cost system developed herein, a Raspberry Pi computer and multiple cameras (RGB and multispectral) were the main components. The system was programmed to automatically capture and manage crop image data at user-defined time points throughout the season. The acquired images were suitable for extracting quantifiable plant traits, and they were automatically processed through a Python script (an open-source programming language) to extract vegetation indices representing crop growth and overall health. Ongoing efforts are directed toward integrating the sensor system for real-time data monitoring via the internet, which will allow plant breeders to monitor multiple trials for timely crop management and decision making.
Author Affiliations
1 Department of Biological Systems Engineering, Washington State University, Pullman, WA 99164, USA;
2 Department of Crop and Soil Sciences, Washington State University, Pullman, WA 99164, USA;