Abstract: Accurate data acquisition and analysis to obtain crop canopy information are critical steps to understand plant growth dynamics and to assess the potential impacts of biotic or abiotic stresses on plant development. A versatile and easy-to-use monitoring system will allow researchers and growers to improve follow-up management strategies within farms once potential problems have been detected. This study reviewed existing remote sensing platforms and relevant information applied to crops, and specifically grapevines, in order to equip a simple Unmanned Aerial Vehicle (UAV) with a visible high-definition RGB camera. The objective of the proposed Unmanned Aerial System (UAS) was to implement a Digital Surface Model (DSM) to obtain accurate information about affected or missing grapevines attributable to potential biotic or abiotic stress effects. The analysis process started with a three-dimensional (3D) reconstruction from the RGB images collected from grapevines using the UAS and the Structure from Motion (SfM) technique to obtain the DSM applied on a per-plant basis. The DSM was then expressed as greyscale images according to the halftone technique, and information on affected and missing grapevines was finally extracted using computer vision algorithms based on canopy cover measurement and classification. To validate the proposed automated method, each grapevine row was visually inspected within the study area, and the inspection was compared to the digital assessment obtained with the proposed UAS in order to validate calculations of affected and missing grapevines for the whole studied vineyard. Results showed that the percentages of affected and missing grapevines were 9.5% and 7.3%, respectively, of the studied area. Therefore, for this specific study, the abiotic stress that affected the experimental vineyard (frost) impacted a total of 16.8% of plants.
This study provided a new method for automatically surveying affected or missing grapevines in the field and an evaluation tool for plant growth conditions, which can be implemented for other uses such as canopy management, irrigation scheduling and other precision agricultural applications.
Keywords: remote sensing, canopy cover, viticultural management, frost damage, digital surface model
DOI: 10.3965/j.ijabe.20160906.2908
1 Introduction
Worldwide, 80% of the grape production is processed for winemaking through fermentation of grapes, which has been identified as the main focus of the grape industry in the future. The cultivation of grapevines and winemaking has become an important industry to promote economic development of burgeoning wine producing countries, such as China. However, this industry is limited by the monitoring tools implemented, which determine the level and efficiency of management strategies to prevent and minimize the effects of biotic and abiotic stresses and natural disasters on crop yield and quality. Specifically, for vineyards, these effects can result in affected or even missing grapevine plants due to partial or complete plant loss. Hence, these effects have a direct relationship to the level of production and quality of grapes in the short and long term[1,2]. Furthermore, these effects can lead to significant economic losses depending on the severity of the specific detrimental effects. Therefore, it is necessary to efficiently acquire accurate and timely information from grapevines for management purposes, which can be time consuming and costly using only visual inspections or any other common management techniques that are currently available. Specifically, the quantification of canopy loss from grapevines affected and/or missing due to moderate to severe effects of biotic or abiotic stresses is important for the assessment of the potential impacts on vigor, grape development, grape quality and final yield. A rapid, accurate and easy-to-implement assessment tool that is able to identify affected plants based on remote sensing techniques will allow growers to improve management strategies related to pest and disease management, frost control and plant replacement if necessary, among others.
1.1 Remote sensing technologies and platforms applied to vineyards
Remote sensing technologies using different platforms, such as satellite, airborne and, more recently, Unmanned Aerial Systems (UAS), have been applied in the viticultural industry worldwide. In relation to remote sensing platforms, it is important to note the advantages and disadvantages between satellite, airborne and UAS in terms of spatial and temporal resolution of data acquisition. The main advantage of satellite-based remote sensing, besides its broad spatial coverage, is that data can be accessed for free from some platforms, such as Landsat 7-8 or the Moderate-Resolution Imaging Spectroradiometer (MODIS), for visible and multispectral information. The main disadvantage is the pixel resolution, which for Landsat is 30 m × 30 m per pixel, making assessment on a plant-by-plant basis difficult, since plants are separated by just a couple of meters. However, in the case of IKONOS and WorldView-2 (data not free), the panchromatic pixel resolution is 1 m × 1 m and the multispectral resolution is 2 m × 2 m, respectively. This information can be useful for some studies related to canopy structure and leaf area estimation[3]. It has also been shown that more accurate and spatially representative Leaf Area Index (LAI) estimates can be achieved from MODIS data combined with higher spatial resolution information and interpolation techniques[4]. However, a large satellite pixel footprint will mix information from canopies and soil in a heterogeneously distributed crop plantation such as vineyards. A temporal resolution problem is also found in the re-visitation time to the same monitoring area, which is around 16 d in the case of Landsat 7-8. Furthermore, these intervals will render poor to useless information on cloudy days.
Most of the indices that can be calculated from satellite remote sensing data require atmospheric corrections to be more accurate and valid[5]. This is not the case for UAS due to their low survey altitude (50-100 m above ground level). The most common indices used for grapevines have been Plant Density Maps[6] and Normalized Difference Vegetation Index (NDVI) maps[7].
Many studies have applied remote sensing techniques using different platforms. Li[8] analyzed the variability of canopy growth of grapevines using the NDVI obtained from spectral remote sensing techniques. This study provided the basis for the potential applications of remote sensing for growth monitoring and yield estimation of field crops. Recently, authors have proposed monitoring crop evapotranspiration at high spatial resolution using airborne and UAS platforms coupled with thermal infrared and multispectral remote sensing techniques[9-11]. Carrasco-Benavides et al.[12,13] obtained crop coefficients and actual evapotranspiration in vineyards by using multispectral satellite images and the Mapping Evapo-Transpiration at high Resolution with Internalized Calibration (METRIC) model. In the case of canopy structure and vigor monitoring, Johnson et al.[6] analyzed grapevine leaf area with multispectral images obtained from satellite remote sensing techniques. Mathews and Jensen[14] performed visual and quantitative analyses of grape LAI with high-density point clouds acquired with a UAV. Fuentes et al.[3] associated grapevine leaf area with high-resolution (2 m × 2 m per pixel) satellite multispectral remote sensing from WorldView-2, compared to the cover photography method and ground truth data. Finally, Ballesteros et al.[15] characterized grapevine canopies using UAV-based remote sensing and photogrammetry techniques.
The main advantage of high spatial and temporal resolution remote sensing technologies is that they can rapidly survey a wider area of vineyards compared to what is feasible using ground-based methods or visual inspections. Furthermore, having higher spatial resolution information from vineyards, such as that acquired by UAS platforms, could allow a plant-by-plant inspection of abiotic and biotic stresses that can affect canopy development and growth[16,17], and opens up the development of automated systems for the identification and classification of affected plants.
1.2 Unmanned Aerial Systems (UAS): multispectral versus visible imagery
UAS remote sensing has been one of the technologies with the highest rate of development for precision agriculture applications within the last five years[18-20]. With higher temporal and spatial resolution capabilities, this technique has been increasingly applied in viticulture. For example, Primicerio et al.[19] acquired information from grapevines in central Italy using a UAS with promising results for precision viticulture. Baluja et al.[7] acquired multispectral remote sensing images using a UAS to assess grapevine water status. Hall et al.[13,16] analyzed grapevine canopies using high-resolution multispectral images obtained from a UAS. Most recently, multispectral and thermal imagery has been used to calculate the energy balance of crops, including vineyards, at the pixel level to obtain evapotranspiration at high spatial resolution using UAS[10,11].
Current remote sensing methods for canopy growth assessment are mainly based on multispectral remote sensing imagery. However, spectral remote sensing methods are affected by spatial and temporal resolutions, spectral bands and other properties of the objects of interest. The spectral bands can result in some problems in data analysis associated with plant material with different spectra (different growth rates), separation of plant material and shade, different plant material with similar spectra (such as the case of weeds in the inter-row) and mixed information per pixel (associated with spatial resolution)[3]. These problems increase the difficulty of crop identification and classification[21,22]. Moreover, multiple spectral bands require expensive instrumentation with specific requirements in operator skills and know-how for data acquisition and processing, and high computer processing power. However, the latter issues may change in the near future due to rapid advances in specialized technical training of personnel, technology, software development and computing processing capabilities.
High resolution photogrammetry capabilities for UAS are currently very easy and affordable to acquire and they can offer a variety of applications to assess canopy structure and spatial differences in growth patterns for crops.
1.3 Application of the Digital Surface Model (DSM) to photogrammetry
A DSM is a model of identifiable ground objects including buildings, forest trees and crop plants, among others. It can provide elevation information of various ground objects to show surface undulation conditions and crop growth conditions. The DSM technique has been widely applied to monitoring forests, urban green infrastructure and vegetation in different environments. In the case of vineyards, Turner et al.[23] and Burgos et al.[24] collected images with UAVs to generate DSMs for vineyards from RGB images applied to precision viticulture. Lucieer et al.[22] performed 3D reconstruction with remote sensing images acquired using a fixed-wing UAV and obtained high-resolution DSM information to monitor soil erosion using Structure from Motion (SfM) algorithms. In the same way, Zhang et al.[25] used these techniques to construct 3D plant maps. Furthermore, Turner et al.[23] obtained micro-landscape information from an Antarctic moss seedbed using a DSM. Bendig et al.[26] generated crop DSMs from RGB images captured with a UAV to estimate barley biomass.
For ground-based photogrammetry, several studies have been conducted to detect canopy architecture and vigor parameters of horticultural tree canopies using digital imagery and image analysis algorithms, which have been based on the automated analysis proposed by Fuentes et al.[27] applied to Eucalyptus trees. This cover photography technique has been applied successfully to grapevines[3], apple trees[28] and cherry trees[29]. Currently, there is a free computer application (App), released in 2015, called VitiCanopy, which incorporates the algorithms proposed by Fuentes et al.[30] and can be applied to other tree species by selecting an appropriate light extinction coefficient (k). All the previous studies are based on upward-looking cover photography and automated analysis algorithms, with the exception of Fuentes et al.[3], who applied the cover photography technique to images obtained from the top of the canopy at 0° nadir with very accurate results compared to allometric measurements. Proximal cover photography has recently been applied to UAS digital photography by automatically detecting plants and cropping images to a surface equal to the plantation area (determined by the distance between plants and between rows).
Based on the previous research described, this study proposed the use of DSM through three-dimensional reconstruction with SfM techniques, and photogrammetry algorithms applied to RGB imagery obtained using UAS at the plant-by-plant scale, to assess potential biotic or abiotic stresses affecting grapevine canopies.
2 Materials and methods
2.1 Site description and plant material
The experimental field is located at the vineyards belonging to Chateau Zhihui Yuanshi in the East Helan Mountain Area of Ningxia (38.58°N, 106.01°E, 1194 m.a.s.l.) (Figure 1). The climate of the region is classified as temperate arid, with a mean annual precipitation of about 180 mm. The frost-free season corresponds to 180 days per year. The accumulated degree days (ADD, base 10°C) from April to September correspond to 3360°C. The soil is mainly sandy in texture with high water infiltration velocity. After many years of development, this area has become a renowned region for wine grapes in China.
The experimental vineyard is planted with 6-year-old grapevines (Vitis vinifera L.), cultivar Cabernet Sauvignon. The row direction is S-N, with spacing between plants and between rows corresponding to 1.5 m × 3.8 m, respectively (1754 plants/hm2). The grapevines are drip irrigated with 0.2 L/h drippers spaced at 0.3 m. The irrigation strategy used is the Regulated Deficit Irrigation (RDI) technique, which imposes mild to moderate water stress levels from the veraison phenological stage until harvest[31-33]. The total water application per season varies between 9.0 ML/hm2 and 13.5 ML/hm2 depending on the weather of specific seasons.
The original soil in the region is highly infertile, which, coupled with a high probability of frost damage, can cause moderate to severe losses of canopies. This study was performed on September 27th, 2015 in the area under RDI, corresponding to 1.5 hm2. In this experimental area, management strategies, such as irrigation, canopy management and fertilization, are more closely monitored compared to the rest of the vineyard. As a result, grape growing in this area is optimal compared to the whole region. Despite this intensive management schedule, affected and missing plants can be found in the experimental area, which was the focus of the experiment.
2.2 UAS platform and image acquisition specifications
In this study, a DJI Phantom 2 Vision+ quadcopter (DJI, Shenzhen, China) was used as the survey platform (Figure 2). This aircraft is a small-sized four-axis quadrotor aerial vehicle with a flight control system, an RGB camera, a three-axis stabilizing gimbal and WiFi communication capabilities. The quadcopter specifications are: 1242 g weight, 5200 mAh battery capacity, maximum speed of 15 m/s and maximum flight time of around 15 min. The camera specifications are 4384 × 3288 pixel resolution, with an effective resolution of 14 megapixels. The camera can obtain real-time imaging and record high-definition videos. The high-precision three-axis gimbal can control stability with an accuracy of ±0.03° to guarantee the accuracy and precision of image acquisition. In addition, users can connect a smart mobile device wirelessly to the UAV via WiFi to obtain real-time transmission of images and videos.
To reduce the complexity of the UAS operation, the Pix4Dcapture software (Pix4D, Lausanne, Switzerland) was used to control the flight and capture the RGB images. Pix4Dcapture is a free companion app to the Pix4D mapper Pro software, which allows the UAS to be used as a mapping and measuring tool by defining autonomous mapping flights through pre-defined waypoints for data acquisition.
2.3 Analysis of elevation resolution
A test was conducted in the vicinity of the Teaching Building of the Electrical and Mechanical College at Northwest Agriculture and Forestry University in China to verify the resolution of images at the selected flight elevation (80 m). For this purpose, 21 cardboard boxes of 50 cm × 60 cm (width and length, respectively) were aligned with an incremental height difference of 5 cm from right to left, starting at 56 cm height (Figure 3). Data were acquired from an altitude of 80 m, the same flight elevation used for data acquisition in the experimental field.
2.4 SfM and halftone technique to assess canopy structure
The Structure from Motion (SfM) technique facilitates the establishment of DSMs. SfM is a digital reconstruction technique in which the overlapping areas are extracted from a series of images captured from different angles of view for 3D reconstruction according to the image feature matching algorithm. With the camera parameters and 3D data, the SfM technique can be used to perform motion estimation according to the geometric relationships between 2D images of multiple view angles. Then SfM optimizes the motion calculation and determines 3D points with bundle adjustment to finally obtain a dense 3D point cloud through point cloud extension[25,34].
The SfM technique can be described in four steps as follows: i) Based on the pinhole model, the conversion relationships between the geo-reference, camera and image coordinate systems are obtained in order to assess the relationship between the camera parameters, 2D images and 3D points; ii) The image feature matching is performed using the Scale-Invariant Feature Transform (SIFT) algorithm proposed by Lowe[35] in 2004 for processing original images by dividing them into sub-images of identical size. For the sub-images, a scale space is established and extreme points are detected to generate feature descriptors and to save the feature point file from each sub-image. Then, sub-images are merged and the positions of feature points are converted into positions within the original images[34,36]; iii) After optimization by bundle adjustment, an accurate 3D camera posture and a sparse point cloud are obtained. Triggs[37] comprehensively explained the application of this method to 3D image reconstruction. According to the objective optimization method, the 3D structure and camera matrix are optimized so that the total error is minimized; iv) According to the Multi View Stereo (MVS) algorithm proposed by Furukawa and Ponce[36], the sparse point cloud is extended to obtain a dense 3D point cloud. According to the inverse distance weighted interpolation method, the dense 3D point cloud is interpolated to obtain a 3D grid digital model to generate the DSM.
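The pinhole conversion in step i) can be illustrated with a minimal sketch: 3D world points are transformed to camera coordinates and projected to pixel coordinates through the intrinsic matrix. The intrinsics and camera pose below are hypothetical values chosen for illustration, not the calibration of the camera used in this study:

```python
import numpy as np

def project_points(K, R, t, pts3d):
    """Project 3D world points to 2D pixel coordinates with the
    pinhole model x ~ K [R | t] X (step i of the SfM pipeline)."""
    pts_cam = pts3d @ R.T + t                # world -> camera coordinates
    pts_img = pts_cam @ K.T                  # camera -> homogeneous image coords
    return pts_img[:, :2] / pts_img[:, 2:3]  # perspective division

# Hypothetical intrinsics for a 14 MP camera (illustrative values only)
K = np.array([[3000.0, 0.0, 2192.0],
              [0.0, 3000.0, 1644.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                   # camera aligned with world axes, looking down
t = np.array([0.0, 0.0, 80.0])  # camera 80 m above the scene

pts = np.array([[0.0, 0.0, 0.0],    # ground point on the optical axis
                [1.5, 0.0, 0.0]])   # a point 1.5 m off-axis
uv = project_points(K, R, t, pts)   # pixel coordinates of both points
```

In a full SfM pipeline this projection model is what bundle adjustment inverts: camera poses and 3D points are refined so that reprojected points match the detected SIFT features.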
In this study, the Pix4D mapper software was used to generate a DSM. Pix4D mapper is a 3D reconstruction software based on the principle of SfM. In Pix4D mapper, after feature extraction and matching of UAS aerial images, a 3D point cloud was generated and the DSMs were obtained in Tag Image File Format (TIFF).
2.5 Identification of missing grapevines
The 3D DSM alone allows only qualitative analysis of missing grapevines; the percentage of missing grapevines in the studied area cannot be assessed quantitatively using this method by itself. Therefore, the halftone technique was implemented to express the DSM as a greyscale image. In the halftone technique, according to the linear interpolation method, the elevation data in the DSM were converted into grey values in the greyscale domain (0-255). Based on the greyscale differences, 3D digital models were expressed in the 2D plane. Then, through the analysis of grey values, information on missing grapevines can be obtained by applying the computer vision algorithms described in sections 2.5.1 and 2.5.2.
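The halftone conversion described above amounts to a linear rescaling of elevations into the 0-255 greyscale domain. A minimal sketch (the toy elevation values are illustrative, not measured data):

```python
import numpy as np

def dsm_to_greyscale(dsm):
    """Linearly map DSM elevations to the 0-255 greyscale domain
    (the halftone step): lowest elevation -> 0, highest -> 255."""
    dsm = np.asarray(dsm, dtype=float)
    lo, hi = dsm.min(), dsm.max()
    if hi == lo:                       # flat surface: avoid division by zero
        return np.zeros_like(dsm, dtype=np.uint8)
    grey = (dsm - lo) / (hi - lo) * 255.0
    return grey.round().astype(np.uint8)

# Elevations in metres for a toy 2x2 DSM tile
tile = [[1194.0, 1194.5],
        [1195.0, 1196.0]]
grey = dsm_to_greyscale(tile)   # lowest point -> 0, highest -> 255
```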
This step is necessary since the total greyscale is distributed unevenly in the original images, because greyscale images are affected by topographic characteristics, such as slope, and by other plant material in the inter-row. Therefore, the grapevine rows cannot be effectively separated from the ground area through simple grey threshold segmentation, especially in situations where there are cover crops or high-density weeds in the inter-row.
2.5.1 Filtering canopy rows
The height difference between the rows and the ground makes it possible to generate a local grey difference at the boundary of the canopies, so the boundary can be extracted by calculating roughness values. Roughness refers to the difference in elevation between the peak and nadir within the neighborhood of a central point. The specific analysis steps are: i) centering on a pixel point; ii) taking a 3 × 3 pixel region around that central point; iii) calculating the maximum and minimum values of all points within the region; iv) calculating the roughness at this point, obtained by subtracting the minimum from the maximum value. The roughness of the whole image can be obtained by moving the region through all the points in an image. The roughness can be expressed as follows:
Roughness = Max(ei) − Min(ei)   (1)
Since there is a one-to-one correspondence between the grey value and the elevation value for each point after conversion into greyscale images, ei represents the elevation (and corresponding grey value) at any point in a 3 × 3 pixel region.
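As a sketch of Equation (1), the roughness image can be computed by sliding a 3 × 3 window over the greyscale image (a naive, unoptimized implementation for illustration; edge pixels use the part of the window that falls inside the image):

```python
import numpy as np

def roughness(grey):
    """Eq. (1): per-pixel roughness = max - min over a 3x3 neighbourhood."""
    g = np.asarray(grey, dtype=int)
    h, w = g.shape
    out = np.zeros_like(g)
    for r in range(h):
        for c in range(w):
            # 3x3 window, clipped at the image borders
            win = g[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            out[r, c] = win.max() - win.min()
    return out

# Toy greyscale tile: a flat floor with one elevated "canopy" pixel
tile = np.array([[0, 0, 0, 0],
                 [0, 9, 0, 0],
                 [0, 0, 0, 0]])
rough = roughness(tile)   # high values ring the elevated pixel
```

Flat regions (ground, inter-row) yield roughness near zero, while canopy boundaries, where elevation jumps, yield high values, which is what makes the subsequent boundary extraction possible.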
The image obtained after the roughness calculation contains all the boundaries from different objects within the image, which requires filtering to separate plant objects from non-plant objects. Filtering can be based on the boundary and grey-level differences between non-plant and plant-related objects: since the elevation differences of non-plant objects are generally much smaller, their boundaries coincide with smaller grey-level differences. This difference was used to de-noise the images and separate grapevine canopies from the inter-row.
Once de-noising is performed, the boundaries related to grapevine canopies must be filled to obtain a binary image with value 1 for canopy and value 0 for the inter-row.
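The boundary-filling step can be sketched with a simple flood fill from the image border: pixels reachable from the border without crossing a boundary are inter-row (0), and everything else, including the boundary and its interior, is canopy (1). This is an assumed implementation for illustration, not necessarily the one used in the study:

```python
import numpy as np
from collections import deque

def fill_boundaries(edges):
    """Fill the interior of closed canopy boundaries to produce a binary
    mask (1 = canopy, 0 = inter-row). Background is found by flood-filling
    zero pixels from the image border; everything else is canopy."""
    e = np.asarray(edges, dtype=bool)
    h, w = e.shape
    background = np.zeros((h, w), dtype=bool)
    queue = deque()
    # Seed the flood fill from all zero-valued border pixels
    for r in range(h):
        for c in range(w):
            if (r in (0, h - 1) or c in (0, w - 1)) and not e[r, c]:
                background[r, c] = True
                queue.append((r, c))
    # 4-connected flood fill over non-boundary pixels
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not e[rr, cc] and not background[rr, cc]:
                background[rr, cc] = True
                queue.append((rr, cc))
    return (~background).astype(np.uint8)

# A closed square boundary; its interior should become canopy (1)
edges = np.zeros((5, 5), dtype=int)
edges[1, 1:4] = edges[3, 1:4] = 1
edges[1:4, 1] = edges[1:4, 3] = 1
mask = fill_boundaries(edges)   # 8 boundary pixels + 1 interior pixel = 1
```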
2.5.2 Detection and classification of affected and missing grapevine canopies
Once the binary image depicting grapevine canopies and rows is obtained, it is rotated to achieve vertical row alignment. This rotation facilitates treating the image as a matrix that can be partitioned according to the pixels corresponding to the distance between plants and between rows defined by the plantation. For this purpose, a customized code written in Matlab (Mathworks Inc., Natick, MA, USA) was used to batch crop and analyze the regions corresponding to every single plant. From each cropped sub-image, a canopy projective cover (ff) analysis was performed by counting the total number of pixels in the sub-image (tp) and the number of pixels corresponding to gaps (tg). The ff was then calculated using the following equation:
ff = (tp − tg) / tp   (2)
The ff values per plant were interpolated using a spline function to map canopy cover using a color code for values between 0 (no canopy) to 1 (full canopy cover within the plantation area). This map allows a quick visualization of the spatial variability of canopy growth and vigor distribution throughout the studied vineyard.
The ff values corresponding to 0 can be classified as missing grapevines, and they can be automatically extracted so that a mean ff value is calculated from the remaining data. This mean ff value was then compared with every ff value from the vineyard. An affected-grapevine identification threshold was set to classify grapevine canopies whose vigor had decreased by between 20% and 90% relative to the averaged ff.
Considering all the previous criteria, an automated batch classification algorithm was created to color-code and position affected grapevines (blue filling) and missing grapevines (red filling). A detailed list with the row number and plant number for each classification criterion was then produced by the code.
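The per-plant cropping, Equation (2) and the classification criteria above can be sketched as follows. The tile sizes, toy binary image and threshold fractions (`low`, `high`) are illustrative assumptions, not the exact cut-offs used in the study:

```python
import numpy as np

def classify_plants(binary, plant_px, row_px, low=0.2, high=0.9):
    """Split a row-aligned binary canopy image into per-plant tiles,
    compute ff = (tp - tg) / tp for each (Eq. 2, equal to the tile mean
    since canopy pixels are 1), then flag plants as missing (ff = 0) or
    affected (ff between `low` and `high` fractions of the mean ff of the
    non-missing plants)."""
    h, w = binary.shape
    ff = np.array([[binary[r:r + plant_px, c:c + row_px].mean()
                    for c in range(0, w - row_px + 1, row_px)]
                   for r in range(0, h - plant_px + 1, plant_px)])
    missing = ff == 0
    mean_ff = ff[~missing].mean()
    affected = ~missing & (ff >= low * mean_ff) & (ff <= high * mean_ff)
    return ff, missing, affected

# Toy vineyard: 2 rows x 3 plants, each plant tile 4x4 pixels
tiles = np.ones((8, 12), dtype=np.uint8)
tiles[0:4, 4:8] = 0                          # a missing plant (no canopy)
tiles[4:8, 0:4] = np.eye(4, dtype=np.uint8)  # a sparse, affected plant
ff, missing, affected = classify_plants(tiles, 4, 4)
```

The `ff` grid can then be interpolated for the canopy cover map, and the boolean masks give the per-plant positions to color-fill and list.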
3 Results
3.1 Analysis of elevation resolution
The image obtained by the UAS from the cardboard boxes (Figure 4a) at the different heights depicted in Figure 3 was used to generate a DSM for this particular test (Figure 4b). The differences in height can be visualized by the length of the shadow generated by each box (Figure 4a). A 17 × 17 pixel region of interest (ROI) was cropped per box to convert the DSM into greyscale using the methodology described above.
A strong, positive linear relationship was found between the reference box heights and the averaged greyscale values extracted from the ROIs of each box (Figure 5). These changes from light grey to darker grey correspond to the intensity values found in the DSM from right to left (Figure 4b). This process allowed scaling differences in grey level to height.
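The scaling of grey level to height amounts to a simple linear fit over the calibration boxes. A sketch with made-up calibration values (not the data behind Figure 5) could look like this:

```python
import numpy as np

# Hypothetical calibration: mean ROI grey value per reference box height.
# The numbers below are invented to illustrate the fitting step, not the
# values measured in the study.
heights_cm = np.array([56.0, 61.0, 66.0, 71.0, 76.0, 81.0])
mean_grey = np.array([120.0, 131.0, 141.5, 152.0, 163.0, 173.5])

# Least-squares line: height = slope * grey + intercept
slope, intercept = np.polyfit(mean_grey, heights_cm, 1)

def predict_height(grey):
    """Convert a DSM grey level back to an estimated height (cm)."""
    return slope * grey + intercept
```

Once calibrated, any grey value extracted from the DSM can be converted back into an elevation estimate, which is what makes the quantitative per-plant analysis possible.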
3.2 Analysis of missing grapevines
For the selected study site, the DSM was generated using 60 RGB images obtained with the UAS flying at 80 m height. Figure 6a shows the DSM converted into a greyscale image, and the corresponding image after roughness analysis can be seen in Figure 6b. The final segmentation analysis showed that a threshold of 1 allowed accurate discrimination of the boundaries corresponding to plant objects (Figure 6c). Thereafter, the segmented image was filled to obtain the binary image (Figure 6d).
3.3 Analysis and automatic classification of affected and missing grapevines
After rotating the resulting binary image (Figure 7a) to align the grapevine rows vertically (Figure 7b), the automated code cropped the image into sub-images corresponding to individual plants using the equivalent dimensions in pixels of the planting area of the vineyard. Figure 7c shows a map of interpolated ff values and their variability throughout the vineyard. This figure also shows, in dark blue, missing vine sections with values of ff = 0 and, in bright orange and yellow, vigorous vines with ff close to 1. From this image, missing vines are concentrated in the upper middle section of the vineyard. The automated classification per grapevine into affected (blue filling, corresponding to 166 plants or 9.5% of total plants from the 1.5 hm2) and missing grapevines (red filling, corresponding to 128 plants or 7.3% of total plants from the 1.5 hm2) can be seen in Figure 7d. Most of the affected grapevines, in lighter blue, surround the areas of missing grapevines, with isolated affected grapevines in the upper and lower parts of the studied vineyard. A complete list of locations of affected and missing grapevines was produced by the code, showing the row number and plant number from the top of the vineyard for each classification, which was highly accurate compared to visual inspection of the vineyard (data not shown).
4 Discussion
China is considered one of the new world wine countries, with around 80% of its grape production destined for winemaking. To become competitive nationally and internationally, grape growers require the implementation of the latest technological advances to be efficient in implementing management strategies in large vineyards. However, these technologies are currently cost-prohibitive due to the price of instrumentation and the high level of technical skills and know-how required to install and maintain the equipment and to acquire and process data from it.
The use of UAS has become very promising for precision agriculture and viticulture related applications with the affordable availability of lightweight aircraft using high-definition visible (RGB) cameras[38,39]. More technical and detailed uses of UAS can be implemented to assess different physiological aspects of plants, such as: i) vigor and canopy architecture monitoring (using multispectral cameras and NDVI); ii) plant water status (using infrared thermal cameras)[40-44], or both; and iii) sub-meter evapotranspiration estimations, which increase the cost of the instrumentation required. Sub-meter evapotranspiration estimation systems from remote sensing data will also require specialized personnel for aircraft operation, data acquisition and analysis. Furthermore, additional camera requirements result in a higher payload for the UAS, which is a restriction for flying UAS in many viticultural countries, such as Australia, where the Civil Aviation Safety Authority (CASA) has very strict regulations for aircraft weighing more than 23 kg[45]. A license is required for piloting, and the aircraft must always be kept within the pilot's line of sight, which requires a second person to operate the computer to verify that data have been acquired and to monitor general UAS functioning parameters. Currently, these restrictions do not affect China, which increases the possibilities of using UAS for precision agriculture.
For the above-mentioned reasons, this research concentrated on an affordable UAS and developed a pipeline of data acquisition and analysis to obtain an automated classification of a vineyard at the per-plant scale. This study specifically proposed a simple DSM-based survey method coupled with computer vision algorithms to identify affected and missing grapevines. Unlike traditional field survey methods, the proposed method was able to quickly and accurately acquire imagery and post-process all the information from RGB images, thus avoiding the previously identified problems of multispectral remote sensing images. After the greyscale images of the DSM were successfully processed and information on missing grapevines was visualized, an interpolated map was obtained from the automated calculation of the fraction of canopy projective cover per plant (Figure 7c). This map can be used to visualize the edge effect, which is predominantly visible at the top and side edges of the studied vineyard. In this sense, many research studies try to avoid the edge effect by arbitrarily measuring grapevines after a determined number of plants from the edges. The system proposed could be useful to accurately detect the real edge effect per site to ensure representative measurements for scientific purposes or the implementation of management strategies. Furthermore, this map offers a snapshot of the variability of canopy cover within the studied vineyard. This information can be of great value for management purposes to make the productivity and quality of grapes more uniform. It is well known that grapevines need a balance between the vegetative parts (canopy vigor) and reproductive organs (berries) to achieve optimal water use efficiency and grape quality[46,47].
From a ground-based sensor network perspective, one of the most common questions is how many sensors should be installed per hectare, and where. With high levels of soil and plant growth variability, a site-specific assessment is required through either visual or soil surveys with limited measuring points and high-variability assumptions. The proposed mapping system could offer information on different zones within a specific vineyard, separated mainly by vigor. Vigor is a physiological parameter that integrates the effects of the soil-plant-atmosphere continuum. Therefore, after a simple optimization procedure, relevant growth zones can be identified to install a statistically significant number of sensors per representative zone to maximize investment.
The automated classification of affected and missing grapevines depicted in Figure 7d, obtained after the pipeline analysis described previously, can be used for a number of different applications, such as: i) assessment of the onset of biotic and abiotic stress effects, facilitating targeted visual inspection of plants in a large plantation area, if required; ii) identification of broken or blocked driplines, which affect canopy growth; and iii) an inverse analysis for edge detection from the greyscale images obtained from the DSM, combined with the relationship found in Figure 5, for the identification of weed infestation in the inter-row.
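As a usage sketch, the per-plant classification can be aggregated into vineyard-level statistics like those reported in this study (9.5% affected, 7.3% missing). The label names and counting logic below are illustrative assumptions, not the exact implementation.

```python
def stress_summary(labels):
    """Percentages of 'affected' and 'missing' plants among all plants
    surveyed, given one classification label per plant."""
    n = len(labels)
    pct = lambda tag: 100.0 * sum(1 for lab in labels if lab == tag) / n
    return pct("affected"), pct("missing")

# Ten plants: eight healthy, one affected, one missing.
affected_pct, missing_pct = stress_summary(
    ["healthy"] * 8 + ["affected"] + ["missing"]
)  # -> (10.0, 10.0)
```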
5 Conclusions
This study reviewed existing remote sensing platforms that can be applied to grapevine management and tested a simple, affordable and accurate method to extract plant information related to the detrimental effects of abiotic stress (frost) on canopy growth. The UAS was based on a simple DSM-based survey method to assess grapevine canopies affected or missing due to potential biotic or abiotic stresses. The affordable requirements to implement the proposed UAS, related to the aircraft and the high definition RGB camera used, facilitate its practical implementation for research and viticultural management applications. Furthermore, owing to the light weight of the UAS (less than 23 kg), this system can be readily used in countries with strict flight restrictions on precision agriculture applications, such as Australia.
Acknowledgements
This study was supported by the National Natural Science Foundation of China (No. 41401391), the Fundamental Research Funds for the Central Universities of China (No. 2014YB071), the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, and the Exclusive Talent Funds of Northwest A&F University of China (No.2013BSJJ017).
Citation: Su B F, Xue J R, Xie C Y, Fang Y L, Song Y Y, Fuentes S. Digital surface model applied to unmanned aerial vehicle based photogrammetry to assess potential biotic or abiotic effects on grapevine canopies. Int J Agric & Biol Eng, 2016; 9(6): 119-130.
References
[1] Cramer G R, Ergül A, Grimplet J, Tillett R L, Tattersall E A R, Bohlman M C, et al. Water and salinity stress in grapevines: early and late changes in transcript and metabolite profiles. Functional & Integrative Genomics, 2007; 7(2): 111-134.
[2] Cramer G R, Urano K, Delrot S, Pezzotti M, Shinozaki K. Effects of abiotic stress on plants: a systems biology perspective. BMC Plant Biology, 2011; 11(1): 1-14.
[3] Fuentes S, Poblete-Echeverría C, Ortega-Farias S, Tyerman S, De Bei R. Automated estimation of leaf area index from grapevine canopies using cover photography, video and computational analysis methods. Australian Journal of Grape and Wine Research, 2014; 20(3): 465-473.
[4] Liu Z, Huang R, Hu Y, Fan S, Feng P. Generating high spatiotemporal resolution LAI based on MODIS/GF-1 data and combined Kriging-Cressman interpolation. International Journal of Agricultural and Biological Engineering, 2016; 9(5): 120-131.
[5] Hadjimitsis D G, Papadavid G, Agapiou A, Themistocleous K, Hadjimitsis M G, Retalis A, et al. Atmospheric correction for satellite remotely sensed data intended for agricultural applications: impact on vegetation indices. Natural Hazards and Earth System Sciences, 2010; 10(1): 89-95.
[6] Johnson L F, Roczen D E, Youkhana S K, Nemani R R, Bosch D F. Mapping vineyard leaf area with multispectral satellite imagery. Computers & Electronics in Agriculture, 2003; 38(1): 33-44.
[7] Baluja J, Diago M P, Balda P, Zorer R, Meggio F, Morales F, et al. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrigation Science, 2012; 30(6): 511-522.
[8] Li D. Open access publishing of scientific scholarly journals in China. Master dissertation. Beijing: China Agriculture University, 2005; 42p. (in Chinese with English abstract).
[9] Xia T, Kustas W P, Anderson M C, Alfieri J G, Gao F, McKee L, et al. Mapping evapotranspiration with high-resolution aircraft imagery over vineyards using one-and two-source modeling schemes. Hydrology and Earth System Sciences, 2016; 20(4): 1523-1545.
[10] Hoffmann H, Nieto H, Jensen R, Guzinski R, Zarco-Tejada P, Friborg T. Estimating evapotranspiration with thermal UAV data and two source energy balance models. Hydrology and Earth System Sciences Discussions, 2015; 12: 7469-7502.
[11] Ortega-Farías S, Ortega-Salazar S, Poblete T, Kilic A, Allen R, Poblete-Echeverría C, et al. Estimation of energy balance components over a drip-irrigated olive orchard using thermal and multispectral cameras placed on a helicopter-based unmanned aerial vehicle (UAV). Remote Sensing, 2016; 8(8): 638.
[12] Carrasco-Benavides M, Ortega-Farías S, Lagos L O, Kleissl J, Morales L, Poblete-Echeverría C, et al. Crop coefficients and actual evapotranspiration of a drip-irrigated Merlot vineyard using multispectral satellite images. Irrigation Science, 2012; 30(6): 485-497.
[13] Hall A, Louis J, Lamb D. Characterising and mapping vineyard canopy using high-spatial-resolution aerial multispectral images. Computers & Geosciences, 2003; 29(7): 813-822.
[14] Mathews A J, Jensen J L R. Visualizing and quantifying vineyard canopy LAI using an unmanned aerial vehicle (UAV) collected high density structure from motion point cloud. Remote Sensing, 2013; 5(5): 2164-2183.
[15] Ballesteros R, Ortega J F, Hernández D, Moreno M Á. Characterization of Vitis vinifera L. canopy using unmanned aerial vehicle-based remote sensing and photogrammetry techniques. American Journal of Enology & Viticulture, 2015; 66(2): 120-129.
[16] Hall A, Lamb D W, Holzapfel B, Louis J. Optical remote sensing applications in viticulture-a review. Australian Journal of Grape & Wine Research, 2002; 8(1): 36-47.
[17] Powell K S. A holistic approach to future management of grapevine phylloxera. In: Bostanian N J, Vincent C, Isaacs R (Eds.), Arthropod Management in Vineyards: Pests, Approaches, and Future Directions. Dordrecht: Springer Netherlands, 2012; pp. 219-251.
[18] Honkavaara E, Saari H, Kaivosoja J, Pölönen I, Hakala T, Litkey P, et al. Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sensing, 2013; 5(10): 5006-5039.
[19] Primicerio J, Gennaro S F D, Fiorillo E, Genesio L, Lugato E, Matese A, et al. A flexible unmanned aerial vehicle for precision agriculture. Precision Agriculture, 2012; 13(4): 517-523.
[20] Gago J, Douthe C, Coopman R, Gallego P, Ribas-Carbo M, Flexas J, et al. UAVs challenge to assess water stress for sustainable agriculture. Agricultural Water Management, 2015; 153(1): 9-19.
[21] Li Z, Chen Z, Wang L, Liu J, Zhong Q. Area extraction of maize loading based on remote sensing by small unmanned aerial vehicle. Transaction of the Chinese Society of Agricultural Engineering, 2014; 30(19): 207-213. (in Chinese with English abstract).
[22] Lucieer A, Jong S M D, Turner D. Mapping landslide displacements using Structure from Motion (SfM) and image correlation of multi-temporal UAV photography. Progress in Physical Geography, 2014; 38(1): 97-116.
[23] Turner D, Lucieer A, Watson C S. Development of an Unmanned Aerial Vehicle (UAV) for hyper-resolution vineyard mapping based on visible, multispectral and thermal imagery. In International Symposium on Remote Sensing of Environment, 2011.
[24] Burgos S, Mota M, Noll D, Cannelle B. Use of very high-resolution airborne images to analyse 3D canopy architecture of a vineyard. The International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, 2015; 40(3): 399-403.
[25] Zhang Y, Zhuang Z, Xiao Y, He Y. Rape plant NDVI 3D distribution based on structure from motion. Transaction of the Chinese Society of Agricultural Engineering, 2015; 31(17): 207-214. (in Chinese with English abstract).
[26] Bendig J, Bolten A, Bennertz S, Broscheit J, Eichfuss S, Bareth G. Estimating biomass of barley using Crop Surface Models (CSMs) derived from UAV-based RGB imaging. Remote Sensing, 2014; 6(3): 10395-10412.
[27] Fuentes S, Palmer A R, Taylor D, Zeppel M, Whitley R, Eamus D. An automated procedure for estimating the Leaf Area Index (LAI) of woodland ecosystems using digital imagery, MATLAB programming and its application to an examination of the relationship between remotely sensed and field measurements of LAI. Functional Plant Biology, 2008; 35(10): 1070-1079.
[28] Poblete-Echeverría C, Fuentes S, Ortega-Farias S, Gonzalez-Talice J, Yuri J A. Digital cover photography for estimating Leaf Area Index (LAI) in apple trees using a variable light extinction coefficient. Sensors, 2015; 15(2): 2860-2872.
[29] Mora M, Avila F, Carrasco-Benavides M, Maldonado G, Olguín-Cáceres J, Fuentes S. Automated computation of leaf area index from fruit trees using improved image processing algorithms applied to canopy cover digital photographs. Computers & Electronics in Agriculture, 2016; 123: 195-202.
[30] Bei R D, Fuentes S, Gilliham M, Tyerman S, Edwards E, Bianchini N, et al. VitiCanopy: A free computer App to estimate canopy vigor and porosity for grapevine. Sensors, 2016; 16(4): 585.
[31] McCarthy M G, Loveys B R, Dry P R, Stoll M. Regulated deficit irrigation and partial rootzone drying as irrigation management techniques for grapevines. Deficit irrigation practices, FAO Water Reports, 2002; 22: 79-87.
[32] Chaves M M, Santos T P, Souza C R d, Ortuño M F, Rodrigues M L, Lopes C M, et al. Deficit irrigation in grapevine improves water-use efficiency while controlling vigour and production quality. Annals of Applied Biology, 2007; 150(2): 237-252.
[33] Chaves M M, Zarrouk O, Francisco R, Costa J M, Santos T, Regalado A P, et al. Grapevine under deficit irrigation: hints from physiological and molecular data. Annals of Botany, 2010; 105(5): 661-676.
[34] Xu Z, Wu L, Liu J, Shen Y, Li F, Wang R. Modification of SFM algorithm referring to image topology and its application in 3-Dimension reconstruction of disaster area. Geomatics and Information Science of Wuhan University, 2015; 40(5): 599-606. (in Chinese with English abstract).
[35] Lowe D G. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 2004; 60(60): 91-110.
[36] Furukawa Y, Ponce J. Accurate, dense, and robust multiview stereopsis. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2010; 32(8): 1362-76.
[37] Triggs B, Mclauchlan P F, Hartley R I, Fitzgibbon A W. Bundle adjustment: a modern synthesis. In: International Workshop on Vision Algorithms: Theory and Practice, 1999.
[38] Jiao L C, Tan S. Development and prospect of image multiscale geometric analysis. Acta Electronica Sinica, 2003; 31(Z1): 1975-1981. (in Chinese with English abstract).
[39] Huang Y B, Thomson S J, Brand H J, Reddy K N. Development and evaluation of low-altitude remote sensing systems for crop production management. Int J Agric & Biol Eng, 2016; 9(4): 1-11.
[40] Zia S, Spohrer K, Merkt N, Du W Y, He X K, Müller J. Non-invasive water status detection in grapevine (Vitis vinifera L.) by thermography. Int J Agric & Biol Eng, 2009; 2(4): 46-54.
[41] Fuentes S, Bei R D, Pech J, Tyerman S. Computational water stress indices obtained from thermal image analysis of grapevine canopies. Irrigation Science, 2012; 30(6): 523-536.
[42] Zia S, Du W, Spreer W, Spohrer K, He X, Muller J. Assessing crop water stress of winter wheat by thermography under different irrigation regimes in North China Plain. Int J Agric & Biol Eng, 2012; 5(3): 24-34.
[43] Poblete-Echeverría C, Sepulveda-Reyes D, Ortega-Farias S, Zuñiga M, Fuentes S. Plant water stress detection based on aerial and terrestrial infrared thermography: a study case from vineyard and olive orchard. Acta Horticulturae, 2016(1112): 141-146.
[44] Poblete-Echeverría C, Sepulveda-Reyes D, Ortega-Farias S, Zuñiga M, Fuentes S. Plant water stress detection based on aerial and terrestrial infrared thermography: a study case from vineyard and olive orchard. in XXIX International Horticultural Congress on Horticulture: Sustaining Lives, Livelihoods and Landscapes (IHC2014): 1112. 2014.
[45] CASA. Civil Aviation Safety Authority, Australia. https://www.casa.gov.au/aircraft/landing-page/flying-dronesaustralia. Accessed on [2016-10-16].
[46] Poni S, Bernizzoni F, Civardi S. The effect of early leaf removal on whole-canopy gas exchange and vine performance of Vitis vinifera L. 'Sangiovese'. VITIS-Journal of Grapevine Research, 2015; 47(1): 1.
[47] Medrano H, Tomás M, Martorell S, Escalona J-M, Pou A, Fuentes S, et al. Improving water use efficiency of vineyards in semi-arid regions. A review. Agronomy for Sustainable Development, 2015; 35(2): 499-517.
Su Baofeng1, Xue Jinru1, Xie Chunyu1, Fang Yulin2, Song Yuyang2*, Sigfredo Fuentes3
(1. College of Mechanical and Electronic Engineering, Northwest A&F University, Yangling 712100, China;
2. College of Enology, Northwest A&F University, Yangling 712100, China;
3. Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville 3010, Australia)
Received date: 2016-10-10 Accepted date: 2016-11-18
Biographies: Su Baofeng, PhD, Lecturer, research interests: precision agriculture, remote sensing of agriculture, Email: [email protected]; Xue Jinru, Master, research interests: precision agriculture, remote sensing of agriculture, Email: [email protected]; Xie Chunyu, Undergraduate, research interests: machine learning, image processing, Email: [email protected]; Fang Yulin, PhD, Professor, research interests: viticulture, Email: [email protected]; Sigfredo Fuentes, PhD, Senior Lecturer, research interests: viticulture, remote sensing and precision agriculture, Email: [email protected].
*Corresponding author: Song Yuyang, PhD, Lecturer, research interests: fermentation, viticulture. College of Enology, Northwest A&F University, 712100, Shaanxi, China. Tel: +8629-87091527, Email: [email protected].
Copyright International Journal of Agricultural and Biological Engineering (IJABE) Nov 2016