Proceedings Volume 10664

Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping III

J. Alex Thomasson, Mac McKee, Robert J. Moorhead

Volume Details

Date Published: 1 August 2018
Contents: 8 Sessions, 27 Papers, 22 Presentations
Conference: SPIE Commercial + Scientific Sensing and Imaging 2018
Volume Number: 10664

Table of Contents

  • Front Matter: Volume 10664
  • Collecting Reliable Image Data with UAVs
  • Proximal and Remote Sensing for Phenotyping
  • Thermal and Hyperspectral Imaging from UAVs
  • Detecting Yield, Disease, and Water Stress from UAVs
  • Analytics for UAV-based Crop Management
  • Innovative UAV Applications
  • Poster Session
Front Matter: Volume 10664
Front Matter: Volume 10664
This PDF file contains the front matter associated with SPIE Proceedings Volume 10664, including the Title Page, Copyright information, Table of Contents, Introduction (if any), and Conference Committee listing.
Collecting Reliable Image Data with UAVs
Implications of sensor inconsistencies and remote sensing error in the use of small unmanned aerial systems for generation of information products for agricultural management
Mac McKee, Ayman Nassar, Alfonso Torres-Rua, et al.
Small unmanned aerial systems (sUAS) for remote sensing represent a relatively new and growing technology to support decisions in agricultural operations. The size and power limitations of these systems constrain the weight, size, and capability of the sensors that can be carried, as well as the geographical coverage that is possible. These factors, together with a lack of standards for sensor technology, its deployment, and data analysis, lead to uncertainties in data quality that can be difficult to detect or characterize. These, in turn, limit comparability between data from different sources and, more importantly, constrain the analyses that can be accomplished with data acquired with sUAS. This paper offers a simple statistical examination of the implications of an array of sensor data uncertainty issues for information products. The analysis relies on high-resolution data collected in 2016 over a commercial vineyard near Lodi, California, for the USDA Agricultural Research Service Grape Remote Sensing Atmospheric Profile and Evapotranspiration eXperiment (GRAPEX) Program. A Monte Carlo analysis shows how uncertainty in sensor spectral response and/or orthorectification accuracy can affect the estimation of information products of potential interest to growers, illustrated in the form of common vegetation indices.
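As a rough illustration of the kind of Monte Carlo propagation described above, the sketch below perturbs assumed red and NIR reflectances with band-level radiometric noise and recomputes NDVI for each draw. The reflectance values and the noise level are hypothetical, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed band reflectances for a single vineyard pixel (hypothetical values)
red, nir = 0.08, 0.45
sigma = 0.01        # assumed 1-sigma per-band radiometric uncertainty
n_draws = 10_000

# Monte Carlo: perturb each band independently, recompute the index each draw
red_s = red + rng.normal(0.0, sigma, n_draws)
nir_s = nir + rng.normal(0.0, sigma, n_draws)
ndvi = (nir_s - red_s) / (nir_s + red_s)

# The spread of the index reflects sensor uncertainty, not real field variation
print(f"NDVI {ndvi.mean():.3f} +/- {ndvi.std():.3f}")
```

Even a 1% per-band uncertainty spreads the index by a few hundredths, which is why index maps from uncalibrated sensors are hard to compare across flights.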
Ground-truthing of UAV-based remote sensing data of citrus plants
Subodh Bhandari, Amar Raheja, Mohammad R. Chaichi, et al.
This paper presents the ground-truthing of remote sensing data of citrus plants collected from unmanned aerial vehicles (UAVs). The main advantages of UAV-based remote sensing are reduced cost and immediate availability of high-resolution data, which help detect crop stresses throughout the growing season. Near-infrared (NIR) images obtained using remote sensing techniques help determine crop performance and stresses over a large area in a short amount of time for precision agriculture, which aims to optimize the amounts of water, fertilizer, and pesticide applied using site-specific management of crops. However, to be useful for real-world applications, the accuracy of remote sensing data must be validated using proven ground-based methods. UAVs equipped with multispectral sensors were flown over the citrus orchard at Cal Poly Pomona’s Spadra Farm. The multispectral/hyperspectral images were used to determine vegetation indices that provide information on plant health. A handheld spectroradiometer, a water potential meter, and a chlorophyll meter were used to collect ground-truth data. Correlations between the vegetation indices calculated from the airborne data and the proximal sensor data are shown.
Quality assessment of radiometric calibration of UAV image mosaics
Cody Bagnall, J. Alex Thomasson, Chao Sima, et al.
The use of UAV (unmanned aerial vehicle) based imaging in agriculture adds the ability to incorporate vast amounts of data into analyses designed to improve efficiency in the use of agricultural inputs. One reason this ability has not yet been realized is that producing radiometrically calibrated UAV images to ensure data reliability is difficult at large scale. This paper presents an investigation of field-based image-mosaic calibration procedures using a commercial off-the-shelf fixed-wing small UAV and a five-band multispectral sensor. To determine the quality of the radiometric calibration procedure for UAV image mosaics, images were also collected with an identical camera on a manned aircraft, and ground-based radiometric calibration tarps were used to produce high-quality calibrated field images. Satellite images were also collected on the same day as the aircraft images, in a two-hour flight window centered on solar noon. The manned-aircraft and satellite images were large enough for a single image to cover the entire field. The multispectral camera offers two exposure modes: auto exposure, in which the camera automatically selects exposure and gain settings for each image in a flight, and manual exposure, in which the user selects settings preflight that are used for all images in that flight. In this work we compare the radiometrically calibrated UAV images, collected with both auto-exposure and manual-exposure methods, to the radiometrically calibrated single-frame image from the manned aircraft, as well as to a satellite image.
Correction of in-flight luminosity variations in multispectral UAS images, using a luminosity sensor and camera pair for improved biomass estimation in precision agriculture
Jean-Marc Gilliot, Joël Michelin, Romain Faroux, et al.
Precision farming, or precision agriculture (PA), is a concept in which agricultural practices are modulated according to intra-field crop variability. Multispectral sensors have long been used in remote sensing, onboard aircraft and satellites, for mapping biomass. With the increased miniaturization of sensors, unmanned aerial systems (UAS) have become more widely used for multispectral imaging. UAS offer several advantages for PA, such as relative insensitivity to weather conditions, especially cloud cover. Most UAS images are acquired in cloudless conditions or under complete cloud cover to reduce the impact of changing luminosity. This work quantifies the ability to correct luminosity variations in images from UAS flights under varying weather conditions. Measurements were performed with the Parrot Sequoia multispectral camera paired with its Sunshine sensor. Control ground measurements were repeated over two hours on a series of five targets of increasing gray levels. These measurements were correlated with corresponding reference spectra from a Spectral Evolution SR-3500 field spectroradiometer. In a second experiment, the camera recorded images every thirty seconds in time-lapse mode, for over an hour, above a reference reflectance target, in order to analyze the evolution of the measured reflectance over time as a function of variations in illumination. Finally, two different types of UAS carried out several series of flights: a fixed-wing senseFly eBee and a rotary-wing Innovadrone hexacopter. This paper presents data analysis with and without the Sunshine sensor correction to quantify the improvement in the quality of reflectance measurements and biomass estimates.
An initial exploration of vicarious and in-scene calibration techniques for small unmanned aircraft systems
Baabak G. Mamaghani, Geoffrey V. Sasaki, Ryan J. Connal, et al.
The use of small unmanned aircraft systems (sUAS) for applications in the field of precision agriculture has demonstrated the need to produce temporally consistent imagery to allow for quantitative comparisons. In order for these aerial images to be used to identify actual changes on the ground, conversion of raw digital count to reflectance, or to an atmospherically normalized space, needs to be carried out. This paper describes an experiment that compares the use of reflectance calibration panels, for use with the empirical line method (ELM), against a newly proposed ratio of the target radiance and the downwelling radiance, to predict the reflectance of known targets in the scene. We propose that the use of an on-board downwelling light sensor (DLS) may provide the sUAS remote sensing practitioner with an approach that does not require the expensive and time-consuming task of placing known reflectance standards in the scene. Three calibration methods were tested in this study: 2-Point ELM, 1-Point ELM, and At-altitude Radiance Ratio (AARR). Our study indicates that the traditional 2-Point ELM produces the lowest mean error in band-effective reflectance factor, 0.0165; the 1-Point ELM and AARR produce mean errors of 0.0343 and 0.0287, respectively. Modeling of the proposed AARR approach indicates that the technique has the potential to perform better than the 2-Point ELM method, with a 0.0026 mean error in band-effective reflectance factor, suggesting that this newly proposed technique may prove a viable alternative with suitable on-board sensors.
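Two of the calibration approaches compared above can be sketched in a few lines. The panel reflectances, digital counts, and the Lambertian π factor in the AARR ratio below are illustrative assumptions of this sketch, not values from the paper.

```python
import numpy as np

# Hypothetical two-panel calibration: known reflectance vs. measured digital count
panel_reflectance = np.array([0.05, 0.50])   # dark and bright panels (assumed)
panel_dc = np.array([310.0, 2600.0])         # corresponding image counts (assumed)

# 2-Point ELM: a straight-line fit from digital count to reflectance
gain, offset = np.polyfit(panel_dc, panel_reflectance, 1)

def elm_reflectance(dc):
    """Empirical line method: reflectance predicted from digital count."""
    return gain * dc + offset

def aarr_reflectance(target_radiance, downwelling_irradiance):
    """At-altitude Radiance Ratio: reflectance factor from the ratio of target
    radiance to DLS-measured downwelling irradiance (the pi factor assumes a
    Lambertian surface and is an assumption of this sketch)."""
    return np.pi * target_radiance / downwelling_irradiance

print(round(elm_reflectance(1455.0), 3))  # → 0.275
```

The ELM needs panels physically placed in the scene; the AARR substitutes an on-board DLS reading, which is exactly the trade-off the paper evaluates.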
Behavior of vegetation/soil indices in shaded and sunlit pixels and evaluation of different shadow compensation methods using UAV high-resolution imagery over vineyards
In high-resolution imagery, shadows may cause problems in object segmentation and recognition due to their low reflectance. For instance, the spectral reflectances of shadows and water are similar, particularly in the visible bands. In precision agriculture, vegetation condition in terms of plant water use, plant water stress, and chlorophyll content can be estimated using vegetation indices. The Normalized Difference Vegetation Index (NDVI), Leaf Area Index (LAI), and Enhanced Vegetation Index (EVI) are widely used for characterizing the condition of vegetation. In addition, many soil indices have been developed for describing soil characteristics, such as the Soil-Adjusted Vegetation Index (SAVI). However, shadows can influence the performance of these vegetation and soil indices, and enhancing spatial resolution heightens the impact of shadows in the imagery. In this study, the behavior of vegetation and soil indices is evaluated using four sets of high-resolution imagery captured by the Utah State University AggieAir unmanned aerial vehicle (UAV) system. These indices were obtained from flights conducted in 2014, 2015, and 2016 over a commercial vineyard located in California for the USDA Agricultural Research Service Grape Remote Sensing Atmospheric Profile and Evapotranspiration eXperiment (GRAPEX) Program. Different shadow restoration methods are used to alleviate the impact of shadows in information products that might be developed from the high-resolution imagery. The histogram patterns of the vegetation and soil indices before and after shadow compensation are compared using analysis of variance (ANOVA). The results of this study indicate how shadows affect the vegetation/soil indices and whether shadow compensation methods are able to remove the statistical difference between sunlit and shadowed vegetation/soil indices.
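The sunlit-versus-shadow comparison described above can be sketched as a one-way ANOVA over two pixel classes. The NDVI means and spreads below are synthetic, and the "compensation" is a simple mean shift, purely to illustrate how the F statistic collapses once the class means agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic NDVI samples for sunlit vs. shadowed canopy pixels (assumed stats)
sunlit = rng.normal(0.72, 0.04, 500)
shadow = rng.normal(0.55, 0.06, 500)

def anova_f(a, b):
    """One-way ANOVA F statistic for two groups (between/within mean squares)."""
    grand = np.concatenate([a, b]).mean()
    ss_between = len(a) * (a.mean() - grand) ** 2 + len(b) * (b.mean() - grand) ** 2
    ss_within = ((a - a.mean()) ** 2).sum() + ((b - b.mean()) ** 2).sum()
    return (ss_between / 1.0) / (ss_within / (len(a) + len(b) - 2))

f_before = anova_f(sunlit, shadow)      # large F: classes differ strongly

# A toy shadow compensation: shift the shadowed distribution to the sunlit mean
compensated = shadow + (sunlit.mean() - shadow.mean())
f_after = anova_f(sunlit, compensated)  # near zero: the means now agree
```

A successful compensation method should drive the F statistic (and the visual histogram separation) toward zero, which is the criterion the abstract describes.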
Studying CO2 from plant respiration in controlled and natural environment: How can plant breeding industry benefit from it? (Conference Presentation)
Magda Mandic, Mehmet Senbayram, Christoph Bauer, et al.
Global demand for feed crops is predicted to nearly double by 2050 due to the growing world population (Foley et al., 2011). Considering the increase in global temperature and water scarcity, future crops will need to be more water- and nutrient-use efficient to sustain food security. Photosynthesis, or net canopy CO2 exchange, is one of the driving forces of crop yield formation. Since most commercially available equipment has been designed for single-leaf measurements, photosynthesis at the leaf level has been studied more intensively than canopy photosynthesis. Leaf photosynthesis measurements are often poorly correlated with crop yield, whereas whole-plant (canopy) photosynthesis measurements correlate well with crop yield (Kim et al., 2006). Whole-canopy measurements bypass the problem of finding a representative leaf and give information about whole-plant physiology and other plant physiological processes. In addition to canopy photosynthesis measurements, non-destructive approaches such as stable isotope measurements via online lasers are excellent tools to study the efficiency of transpiration and photosynthesis in crop plants (Senbayram et al., 2015). Here we show different applications of the Thermo Scientific™ Delta Ray™ Isotope Ratio Infrared Spectrometer (IRIS) to investigate processes related to photosynthesis and respiration in various ecosystems, on scales ranging from the whole plant to the whole ecosystem. To monitor photosynthesis and plant respiration, the Delta Ray was also deployed in an automated chamber program. Because several plant chambers were measured in sequence, electrical trigger signals were used to synchronize the Delta Ray with the automated chamber program. In the field project, different locations were monitored in a timed sequence, with the system switching to a different location every 30 minutes.
The Delta Ray analyzer can be easily integrated into gas exchange experiments to measure the δ13C and δ18O in CO2 of one or several plant chambers sequentially. This results in a high-resolution dataset of plant gas exchange and its isotopic signature, which allows identification of short-term and long-term changes in plant metabolism.
Proximal and Remote Sensing for Phenotyping
Detection of canola flowering using proximal and aerial remote sensing techniques
Chongyuan Zhang, Wilson Craine, James B. Davis, et al.
In plant breeding, the time and duration of flowering are important phenotypes that determine seed yield potential in plants. Currently, flowering traits are visually assessed, which can be time-consuming, less accurate, and subjective. To address this challenge, in this study, proximal sensing and remote sensing with an unmanned aerial vehicle (UAV) were applied to monitor canola flowers in a breeding trial with 35 varieties. Visible digital (RGB) images were processed to extract flowering features. The results indicated that flowering features extracted from both proximal and aerial images were significantly and positively correlated (P < 0.0001) with each other and with visual ratings. In general, aerial imaging overestimated canola flowering rates, which could result from the lower resolution at the measurement altitude (30 m), and yielded lower correlation coefficients (r = 0.53 – 0.62) with visual ratings. Proximal sensing resulted in better estimation of canola flowering, with r ranging from 0.65 to 0.91. This study indicated that remote sensing can be used for high-throughput phenotyping of canola flowers with confidence. High-throughput phenotyping techniques will potentially improve the throughput and objectivity of detecting flowers in canola and other crops, and contribute to the development of new cultivars in breeding programs and to yield estimation in precision agriculture.
Vinobot and vinoculer: from real to simulated platforms
Ali Shafiekhani, Felix B. Fritschi, Guilherme N. DeSouza
In this work, a new element of our research on autonomous plant phenotyping is presented: a simulated environment for development and testing. As explained in our previous work, our architecture consists of two robotic platforms: an autonomous ground vehicle (Vinobot) and a mobile observation tower (Vinoculer). The ground vehicle collects data from individual plants, while the observation tower oversees an entire field, identifying specific plants for further inspection by the ground vehicle. Indeed, while real robotic platforms for field phenotyping can only be deployed during the planting season, simulated platforms can help us improve the various algorithms throughout the year. To do that, the simulation must be designed to mimic not only the robots, but also the field, with all its uncertainties, noise, and other unexpected circumstances that could lead to errors in those same algorithms under real conditions. This paper details the current state of the implementation of this simulation. It describes how the target navigation algorithms are being tested, and it provides first insights into the functionality of the simulation and its usefulness for testing those same robotic platforms.
Phenotyping of sorghum panicles using unmanned aerial system (UAS) data
A. Chang, J. Jung, J. Yeom, et al.
Unmanned aerial systems (UAS) have recently become one of the most important tools for precision agriculture and high-throughput phenotyping (HTP). Attributes of sorghum panicles, in particular, are critical information for assessing overall crop condition, irrigation, and yield estimation. In this study, a method is proposed to extract phenotypes of sorghum panicles using UAS data. UAS data were acquired with 85% overlap at an altitude of 10 m above ground to generate very high resolution data. An orthomosaic, a digital surface model (DSM), and a 3D point cloud were generated by applying the structure-from-motion (SfM) algorithm to the UAS imagery. Sorghum panicles were identified from the orthomosaic and DSM using color ratio and circle fitting. A cylinder-fitting method and a disk-stacking method are proposed to estimate panicle volume. Yield prediction models were generated between field-measured yield data and UAS-measured attributes of sorghum panicles.
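The two volume estimators named above can be illustrated on a synthetic point cloud. The cylinder dimensions, the 95th-percentile radius, and the slice count below are assumptions of this sketch, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic panicle point cloud: a filled cylinder, radius 4 cm, height 30 cm
n = 2000
z = rng.uniform(0.0, 0.30, n)
r = 0.04 * np.sqrt(rng.uniform(0.0, 1.0, n))   # uniform over the disk
theta = rng.uniform(0.0, 2.0 * np.pi, n)
x, y = r * np.cos(theta), r * np.sin(theta)

# Radial distance of each point from the (mean-centered) vertical axis
radial = np.hypot(x - x.mean(), y - y.mean())
height = z.max() - z.min()

# Cylinder fit: one robust radius (95th percentile) for the whole panicle
R = np.percentile(radial, 95)
cyl_volume = np.pi * R**2 * height

# Disk stacking: slice along the axis, sum per-slice disk volumes
edges = np.linspace(z.min(), z.max(), 16)
dz = edges[1] - edges[0]
disk_volume = sum(
    np.pi * np.percentile(radial[(z >= lo) & (z <= hi)], 95) ** 2 * dz
    for lo, hi in zip(edges[:-1], edges[1:])
)
```

For a roughly cylindrical panicle the two estimates agree; disk stacking becomes more useful when the radius tapers along the panicle, since each slice gets its own radius.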
Calibrated plant height estimates with structure from motion from fixed-wing UAV images
Field-based high-throughput phenotyping is a bottleneck to future breeding advances. The use of remote sensing with unmanned aerial vehicles (UAVs) can change the way agricultural research operates by increasing the spatiotemporal resolution of data collection to monitor the status of plant growth. A fixed-wing UAV (Tuffwing) was operated to collect images of a sorghum breeding research field with 70% overlap at an altitude of 120 m. The study site was located at Texas A&M AgriLife Research’s Brazos Bottom research farm near College Station, Texas, USA. Relatively high-resolution (>2.7 cm/pixel) images were collected from May to July 2017 over 880 sorghum plots (including six treatments with four replications). The collected images were mosaicked and processed with structure from motion (SfM), which involves construction of a digital surface model (DSM) by interpolation of 3D point clouds. Maximum plant height for each genotype (plot) was estimated from the DSM, and height calibration was implemented with aerially measured values of ground-control points of known height. Correlations and RMSE values between actual height and estimated height were observed over sorghum across all genotypes and flight dates. Results indicate that the proposed height calibration method has potential for future application to improve accuracy in plant height estimation from UAVs.
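A height calibration of the kind described above can be sketched as a linear fit between DSM-derived and surveyed ground-control-point heights. All the heights below are hypothetical, and the linear form of the correction is an assumption of this sketch.

```python
import numpy as np

# Hypothetical GCPs: SfM/DSM-derived height vs. surveyed height, in meters
dsm_height = np.array([0.42, 1.10, 1.95, 2.60])   # read from the DSM (assumed)
true_height = np.array([0.50, 1.20, 2.00, 2.70])  # known GCP heights (assumed)

# Linear height calibration fitted by least squares: true ~ a * dsm + b
a, b = np.polyfit(dsm_height, true_height, 1)

def calibrate(h):
    """Apply the fitted linear correction to a DSM-derived plant height."""
    return a * h + b

# RMSE of the calibrated GCP heights against the survey values
rmse = np.sqrt(np.mean((calibrate(dsm_height) - true_height) ** 2))
```

Once fitted, the same `calibrate` function would be applied to the per-plot maximum heights extracted from the DSM.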
Thermal and Hyperspectral Imaging from UAVs
Inter-comparison of thermal measurements using ground-based sensors, UAV thermal cameras, and eddy covariance radiometers
Alfonso Torres-Rua, Hector Nieto, Christopher Parry, et al.
With the increasing availability of thermal proximity sensors, UAV-borne cameras, and eddy covariance radiometers there may be an assumption that information produced by these sensors is interchangeable or compatible. This assumption is often held for estimation of agricultural parameters such as canopy and soil temperature, energy balance components, and evapotranspiration. Nevertheless, environmental conditions, calibration, and ground settings may affect the relationship between measurements from each of these thermal sensors. This work presents a comparison between proximity infrared radiometer (IRT) sensors, microbolometer thermal cameras used in UAVs, and thermal radiometers used in eddy covariance towers in an agricultural setting. The information was collected in the 2015 and 2016 irrigation seasons at a commercial vineyard located in California for the USDA Agricultural Research Service Grape Remote Sensing Atmospheric Profile and Evapotranspiration Experiment (GRAPEX) Program. Information was captured at different times during diurnal cycles, and IRT and radiometer footprint areas were calculated for comparison with UAV thermal raster information. Issues such as sensor accuracy, the location of IRT sensors, diurnal temperature changes, and surface characterizations are presented.
A detailed study on accuracy of uncooled thermal cameras by exploring the data collection workflow
Tiebiao Zhao, Haoyu Niu, Andreas Anderson, et al.
Thermal cameras have recently become widely used on small unmanned aerial systems (sUAS). They translate thermal energy into visible images and temperatures for the object under analysis. Thermal imaging has great potential in agricultural applications: it can be used for estimating soil water status, scheduling irrigation, estimating almond yields, estimating water stress, and evaluating crop maturity. Although their ability to measure temperature is good, there are still concerns about uncooled thermal cameras. Unstable outdoor environmental factors can cause serious measurement drift during flight missions, and post-processing such as mosaicking might introduce further measurement errors. To address these concerns, three experiments were conducted to establish best practices for thermal image collection. The thermal cameras used in this study are ICI 9640 P-Series models, which are commonly used in many study areas; an Apogee MI-220 was used as the ground truth. The first experiment examined how long the thermal camera needs to warm up to be at (or close to) thermal equilibrium in order to produce accurate data. In the second, different view angles were set up to determine whether view angle has any effect on the thermal camera. The third experiment examined whether stitching the thermal images in Agisoft PhotoScan has any effect on the temperature data.
Image quality and accuracy of different thermal sensor at varying operation parameters (Conference Presentation)
Thermal image quality is critical for accurately quantifying spatial and temporal growth and stress patterns of field crops. Image quality can be affected by many factors, including environment, flying altitude, and camera focal length. Often, thermal sensor selection is based on price or on a sensor already owned. Metrics are available for selecting flight altitude for a given thermal sensor and desired ground resolution; however, no study has been conducted to quantify the relative difference in image quality and the efficiency of generating a thermal orthomosaic. Therefore, this study was conducted to compare the accuracy of canopy temperature quantification and to assess thermal orthomosaic quality when using thermal sensors of different focal lengths and image acquisition at varying flying altitudes. Three thermal infrared cameras were selected, with focal lengths of 9 mm, 13 mm, and 19 mm. All three cameras were flown at altitudes of 10 m, 40 m, and 70 m to collect aerial imagery of a 7,000 m² soybean field. The cameras were mounted on a rotary-wing quadcopter. All flights were conducted at a flying speed of 3 m/s with a 1-second shutter trigger interval. A ground reference system provided ground-truth data for thermometric transformations. Imagery data were compared to assess differences in the number of images collected, the percentage overlap obtained with the 1-second shutter trigger interval, orthomosaic quality, and the accuracy of canopy temperatures. Preliminary results show that the 13 mm focal length at 40 m altitude results in a finer-resolution orthomosaic. Canopy temperatures were quantified accurately regardless of altitude and focal length.
A low-cost method for collecting hyperspectral measurements from a small unmanned aircraft system
Small unmanned aircraft systems (UAS) are a relatively new tool for collecting remote sensing data at dense spatial and temporal resolutions. This study aimed to develop a spectral measurement platform for deployment on a UAS for quantifying and delineating moisture zones within an agricultural landscape. A series of portable spectrometers covering ultraviolet (UV), visible (VIS), and near-infrared (NIR) wavelengths was instrumented using a Raspberry Pi embedded computer programmed to interface with the UAS autopilot for autonomous data acquisition. A second set of identical spectrometers was fitted with calibrated irradiance lenses to capture ambient light during data acquisition. Data were collected during the 2017 Great American Eclipse while observing a reflectance target to determine the ability to compensate for ambient light conditions. A calibration routine was developed that scaled raw reflectance data by sensor integration time and ambient light energy. The resulting calibrated reflectance exhibited a consistent spectral profile and average intensity across a wide range of ambient light conditions. Results indicated the potential for mitigating the effect of ambient light when passively measuring reflectance with a portable spectral measurement system. Future work will use multiple reflectance targets to test the ability to classify targets based on spectral signatures under a wide range of ambient light conditions.
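A calibration of the kind described above can be sketched as a ratio of count rates: both the target spectrum and the ambient spectrum are divided by their integration times, and the target rate is then ratioed against the ambient rate. The function name, counts, and integration times below are hypothetical, not the study's actual routine.

```python
import numpy as np

def calibrated_reflectance(raw_counts, t_int, ambient_counts, t_int_amb):
    """Hypothetical calibration: scale target and ambient spectra to count
    rates by integration time, then ratio target against ambient light."""
    target_rate = np.asarray(raw_counts, dtype=float) / t_int
    ambient_rate = np.asarray(ambient_counts, dtype=float) / t_int_amb
    return target_rate / ambient_rate

# The same target observed under bright and dim light (all numbers assumed):
# the ratio should recover the same reflectance profile in both cases
bright = calibrated_reflectance([8000, 9000], 0.01, [20000, 22500], 0.01)
dim = calibrated_reflectance([800, 900], 0.10, [2000, 2250], 0.10)
print(np.allclose(bright, dim))  # → True
```

The invariance of the ratio across light levels is the property the eclipse experiment was designed to verify.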
Hyperspectral detection of methane stressed vegetation
Margot Accettura, Tim Bauch, Nina Raqueño, et al.
This study examines the hyperspectral reflectance characteristics of vegetation stressed by low-level subterranean methane leakage from buried pipelines. The purpose is to ascertain whether high-spatial-resolution spectral imagery can be used to geolocate small methane leaks in imagery collected from small unmanned aerial systems (sUAS). This could lead to rapid detection of methane leaks by finding spectrally unique regions of stressed vegetation, which might benefit a variety of industries including utility inspectors, grounds maintenance crews, and construction personnel. This document describes an experiment to manually stress vegetation by introducing methane at a low flow rate beneath a layer of turf, allowing it to percolate to the surface and affect the vitality of the overlying turf. For comparison, a turf plot was stressed by root rot caused by overwatering, and a third sample of turf was used as a control area (healthy grass). The three areas of vegetation were observed daily over the course of one month with a ground spectrometer to determine the onset and timeline of damage to the vegetation. High-spatial-resolution spectral imagery was also collected each day to observe the wavelength characteristics of the damage. First-derivative analysis was used alongside physiology-based indices and logistic regression to detect differences between healthy and stressed vegetation. The hyperspectral data showed that as vegetation is stressed, the red-edge slope decreases along with values through the near infrared (NIR), while reflectance in the short-wave infrared (SWIR) region increases. The normalized difference index (NDI) contrast between stressed and healthy vegetation is maximized using a ratio of reflectance values at 750 and 1910 nm.
Conclusions will be presented as to whether vegetation stressed by methane can be easily detected using sUAS, and which spectral bands are most effective for spotting this particular stressor.
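The 750/1910 nm index named above is a standard normalized difference. The reflectance values below are assumed for illustration, chosen only to reflect the direction of the spectral changes the abstract reports (NIR drops, SWIR rises under stress).

```python
def ndi(r750, r1910):
    """Normalized difference index from reflectance at 750 nm and 1910 nm."""
    return (r750 - r1910) / (r750 + r1910)

# Assumed reflectance values for healthy vs. methane-stressed turf (illustrative)
healthy = ndi(0.55, 0.10)    # strong NIR shoulder, low SWIR reflectance
stressed = ndi(0.40, 0.18)   # NIR drops and SWIR rises under stress
print(healthy > stressed)    # → True
```

Because stress moves the two bands in opposite directions, the index separates the classes more strongly than either band alone.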
Detecting Yield, Disease, and Water Stress from UAVs
Multispectral remote sensing for yield estimation using high-resolution imagery from an unmanned aerial vehicle
Satellites and autonomous unmanned aerial vehicles (UAVs) are two major platforms for acquiring remotely sensed information about the earth’s surface. Due to the limitations of satellite-based imagery, such as coarse spatial resolution and fixed schedules, applications of UAVs as low-cost remote sensing systems are rapidly expanding in many research areas, particularly precision agriculture. UAVs can provide imagery with high spatial resolution (finer than 1 meter) and acquire information in visible, near-infrared, and even thermal bands. In agriculture, vegetation characteristics such as health, water stress, and the amount of biomass can be estimated using UAV imagery. In this study, three sets of high-resolution aerial imagery were used for yield estimation based on vegetation indices. These images were captured by the Utah State University AggieAir™ UAV system flown in June 2017, August 2017, and October 2017 over a field experiment pasture site located in northern Utah. The pasture study area is primarily tall fescue. The field experiment includes twenty 50 × 20 m plots, with four replications of five irrigation levels. Approximately 60 yield samples were harvested after each flight, and sample locations were recorded with high-accuracy real-time kinematic (RTK) GPS. In addition, the leaf area index (LAI) for each sample plot was measured using an optical sensor (LAI-2200C) before harvesting. The relationship between the yield of each sample and vegetation indices (VIs) was explored; the VIs include the normalized difference vegetation index (NDVI), calculated using AggieAir imagery, and LAI measured with the ground-based sensor. The results of this study reveal the correlation between vegetation indices and the amount of biomass.
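The index-versus-yield relationship explored above reduces to a correlation over harvested samples. The sketch below uses synthetic NDVI and biomass values with an assumed linear relationship plus noise; none of the numbers come from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic harvest data: per-sample NDVI and dry biomass, with an assumed
# linear relationship plus noise (values are illustrative, not from the study)
ndvi = rng.uniform(0.3, 0.9, 60)
biomass = 2.0 + 6.0 * ndvi + rng.normal(0.0, 0.4, 60)

# Pearson correlation between the index and the harvested yield samples
r = np.corrcoef(ndvi, biomass)[0, 1]
print(f"r = {r:.2f}")
```

In practice each NDVI value would be sampled from the orthomosaic at the RTK-surveyed harvest location before computing the correlation.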
Disease detection and mitigation in a cotton crop with UAV remote sensing
J. Alex Thomasson, Tianyi Wang, Xiwei Wang, et al.
In many fields in the southwestern U.S. and Mexico, a soil-borne fungus (Phymatotrichopsis omnivorum) causes a disease called cotton root rot (CRR) that can devastate a cotton crop by infecting the roots and destroying large numbers of cotton plants. In the last few years a fungicide treatment including the chemical, flutriafol, has proven effective at protecting cotton plants from CRR infection. However, the fungicide is expensive, and growers desire to minimize input costs and environmental risks, so it is desirable to treat only the portions of the field susceptible to CRR infection. Remote sensing with high-resolution satellites and manned aircraft has enabled delineation of the full extent of the disease late in the growing season. Recently, classified images have been used effectively to create prescription maps for variable-rate application of fungicide when planting a cotton crop in subsequent years. In 2015 a UAV was used to create a high-resolution image mosaic of a CRR-infected field at Thrall, Texas. The mosaic was classified into healthy and CRR-infected small zones, and a prescription map was created from the mosaic for variable-rate fungicide application during planting in 2017. The method proved as effective as uniform application across the field would have been. Furthermore, image-analysis techniques were developed that enable classification of image mosaics at approximately the single-plant level. Thus in the future it is conceivable that precision application of flutriafol during planting to prevent cotton root rot could be done at the level of a single seed.
An unmanned aerial system for the detection of crops with undergraduate project-based learning
S. A. Wilkerson, A. D. Gadsden, S. A. Gadsden
To keep pace with population growth, farmers are leveraging a host of new technologies to improve crop production, including genetically modified organisms (GMOs), along with increased chemical pesticide and fertilizer usage. These new techniques, however, have sometimes led to runoff problems for water systems and local watersheds. By using drone-based technologies, the overuse of fertilizers, chemical sprays, and pesticides can be minimized while preserving farm output and quality. This paper discusses lessons learned from and progress made in a year-long capstone research and development project performed by engineering and computer science students at York College of Pennsylvania. The project involves the study and use of multispectral camera technologies along with drones to survey farms growing corn in various climates. The technologies used to assess farms and modern farming practices are by their nature multidisciplinary. Students involved with this project have thus needed to draw on their engineering and scientific backgrounds while learning new and varied topics to tackle this real-world problem. This paper also examines some of the teaching challenges encountered when using project-based learning (PBL) techniques with engineering students to tackle a multidisciplinary problem similar to the types they will likely face in their professional careers. For example, the students have needed to apply best principles to design and build a drone system to assess crop health. Moreover, they have needed to understand the legal responsibilities of operating drones, farmer issues, and a host of technologies unfamiliar to them prior to this project. Student metrics and outcomes are also assessed to improve the process for future years.
Experimental approach to detect water stress in ornamental plants using sUAS-imagery
Ana I. de Castro, Joe Mari Maja, Jim Owen, et al.
Efficiency in irrigation management is crucial to optimize water use in agriculture. A good irrigation strategy requires accurate and reliable measurements of crop water status that provide dynamic data and timely spatial information. However, this is not feasible with time-consuming manual measurements, which are also prone to cumulative errors due to subjective estimation. Ornamental horticulture crops offer challenges for applying small unmanned aircraft system (sUAS) technology due to the relatively small area of production and the diversity of plant species. sUAS can operate on demand at low flight height, and their ability to carry a wide range of sensors allows capturing the variation of plant traits over time, making them a timely alternative to ground-based data collection in nursery systems. This research evaluated the potential of sUAS-based images to estimate crop water status under three different irrigation regimes. sUAS imagery of experimental plots was acquired in August 2017 using several multispectral sensors. Container-grown ornamental plants used in the study were Cornus, Hydrangea, Spiraea, Buddleia and Physocarpus. An algorithm based on the object-based image analysis (OBIA) paradigm was applied to retrieve spectral information from each individual plant. Preliminary one-way analysis of variance (ANOVA) distinguished water-stressed from non-stressed plants using data from each study sensor, although spectral separation was higher when information from the sensors was combined. Our results revealed the potential of sUAS to monitor water status in container-grown ornamental plants, although further analysis is needed to explore vegetation indices and data analysis algorithms.
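The one-way ANOVA mentioned above tests whether group means (e.g., stressed vs. non-stressed plants) differ more than within-group scatter would explain. A minimal sketch of the F statistic on toy NDVI-like readings (the numbers are illustrative, not data from this study):

```python
import numpy as np

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    groups = [np.asarray(g, float) for g in groups]
    all_vals = np.concatenate(groups)
    grand = all_vals.mean()
    k, n = len(groups), all_vals.size
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Toy vegetation-index readings for stressed vs. well-watered plants
stressed = [0.42, 0.45, 0.40, 0.43]
watered  = [0.61, 0.58, 0.63, 0.60]
print(one_way_anova_f(stressed, watered))
```

A large F (compared against the F distribution's critical value) indicates the two irrigation treatments are spectrally separable.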
Analytics for UAV-based Crop Management
Machine learning techniques for the assessment of citrus plant health using UAV-based digital images
Dat Do, Frank Pham, Amar Raheja, et al.
This paper presents the use of machine learning techniques for the development of a methodology for the analysis of digital images of citrus plants collected from unmanned aerial vehicles (UAVs). Proven ground-based sensors, including a chlorophyll meter, water potential meter, and spectroradiometer, are used to evaluate the condition of the plants, thus providing the ground truth. The collected images and ground truth data are then used as training data for the machine learning models, which are validated using a separate set of data. We evaluate several machine learning techniques, ranging from simple linear regression to convolutional neural networks. The overall goal is to develop a solution for monitoring plant health that can readily and cost-effectively be used by farmers to determine nitrogen and water stress in plants. Such a system will aid in the conservation of physical resources while reducing human labor and the environmental impact of chemicals.
Unmanned aerial system based cotton genotype selection using machine learning (Conference Presentation)
Jinha Jung, Akash Ashapure, Murilo Maeda, et al.
The objective of this research is to develop a novel machine learning framework for automatic cotton genotype selection using multi-source and spatio-temporal remote sensing data collected from an Unmanned Aerial System (UAS). The proposed machine learning model is based on an Artificial Neural Network (ANN); it takes UAS-based multi-temporal features such as canopy cover, canopy height, canopy volume, Normalized Difference Vegetation Index (NDVI), and Excessive Greenness Index, along with non-temporal features such as cotton boll count, boll size, and boll volume, as input and predicts the corresponding yield. Testing the performance of the model against actual yield resulted in an R-squared value of approximately 0.9. The proposed cotton genotype selection model is expected to revolutionize cotton breeding research by providing valuable tools to cotton breeders so that they can not only increase their experiment size for faster genotype selection but also make efficient and informed decisions on best-performing genotypes.
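The abstract above regresses yield on canopy and boll features and reports fit quality as R-squared. As a simplified stand-in for the paper's ANN, the sketch below fits an ordinary least-squares model on synthetic features and scores it the same way (all data and coefficients here are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for features like canopy cover, canopy height, NDVI
X = rng.uniform(0.0, 1.0, size=(30, 3))
true_w = np.array([2.0, 1.5, 3.0])          # invented "ground truth" weights
y = X @ true_w + rng.normal(0, 0.1, size=30)  # synthetic "yield" with noise

# Fit yield = X @ w + b by ordinary least squares (bias via a column of ones)
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

# R-squared: fraction of yield variance explained by the model
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(r2)
```

The paper's ANN replaces the linear map with a learned nonlinear one, but the evaluation against actual yield via R-squared is the same.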
Evaluation of multispectral unmanned aerial systems for irrigation management
José L. Chávez, Huihui Zhang, Maria Cristina Capurro, et al.
Growing competition for water is incentivizing the implementation of deficit irrigation. Thus, there is a need to accurately map actual crop evapotranspiration (ETa) to more efficiently manage and document irrigation. An alternative is the use of remote sensing (RS) platforms. Unmanned Aerial Systems (UAS) can fly frequently and acquire very high spatial resolution images. Multispectral UASs (fixed-wing and multi-rotor) flew over irrigated corn fields in northern Colorado to evaluate the capabilities of the RS systems for irrigation management. Soil water content sensors were used in the evaluation. Using multispectral UAS platforms in irrigation management is advantageous because crop water use and stress can be assessed frequently and at very high spatial resolution. This study shows that inferring crop water use and soil water status with acceptable errors is possible with visible-near-infrared and thermal cameras. Furthermore, the required imagery processing and calibration are detailed.
Innovative UAV Applications
UAV videos to extend research to producers
W. Brien Henry, Robert Moorhead, John J. Williams, et al.
It is said, “A picture is worth a thousand words.” So what is the communication value of a video showing the intent of a research project? We do not know whether the value of a video can be estimated by a word count, but we can say with authority that it represents a powerful instructional tool. Using video footage and aerial imagery from an unmanned aerial vehicle, coupled with narration, to document field research sponsored by the Mississippi Corn Production Board, we created an engaging tool that showed growers the results of ongoing field research.
A comparison of manned and unmanned aerial Lidar systems in the context of sustainable forest management
Sustainable forest management practices support the growing effort to make efficient use of natural resources without a reduction in future yield potential. These efforts require accurate and timely measurement of the world’s forests to monitor volume, biomass, and stored carbon level changes. Historically, these measurements have been made through manual measurement of individual trees in representative plots spaced throughout the forest region. Through the process of imputation, the missing values are interpolated, often with a regression model based on the collected reference data. Remote sensing technologies, specifically lidar (light detection and ranging), possess the capability to rapidly capture structural data of entire forests; however, airborne lidar mounted on manned aircraft can be cost prohibitive. The increasing capabilities and decreasing cost of small unmanned aerial systems (sUAS), coupled with the decreasing size and mass of lidar sensors, have opened the possibility for these platforms to provide a cost-effective method with comparable performance. This study completes a cost comparison of the two platforms using a regression model of above-ground live carbon as a method of comparing performance in the context of sustainable forestry. The sUAS performed comparably based on our two data sets. The sUAS achieved an R² of 0.74, and the manned aircraft lidar system achieved an R² of 0.61, with both models producing RSE(%) within one percent of each other. The sUAS has the capability to be competitive with the manned aircraft at a cost of $8.12/acre for the study area, compared to the manned aircraft’s cost of $8.09/acre. The added benefits of sUAS include rapid deployment and low mobilization costs, while disadvantages include operational considerations, such as the need for line-of-sight operations. However, we concluded that sUAS is a viable alternative to airborne manned sensing platforms for fine-scale, local forest assessments.
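The two fit-quality metrics used to compare the platforms above, R² and RSE(%), can be computed directly from observed and predicted carbon values. A minimal sketch (the values below are toy numbers, not the study's data; the residual degrees of freedom assume a two-parameter model):

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def rse_percent(y, y_hat, n_params=2):
    """Residual standard error, expressed as a percentage of the mean response."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    dof = len(y) - n_params  # residual degrees of freedom
    rse = np.sqrt(np.sum((y - y_hat) ** 2) / dof)
    return 100.0 * rse / y.mean()

# Toy above-ground live carbon observations and model predictions
y     = [10.0, 12.0, 15.0, 9.0, 14.0]
y_hat = [10.5, 11.5, 14.0, 9.5, 14.5]
print(r_squared(y, y_hat), rse_percent(y, y_hat))
```

Comparing both metrics across the sUAS and manned-aircraft models is what supports the study's conclusion that the platforms perform comparably.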
Spatial analysis of multispectral and thermal imagery from multiple platforms
Gregory Rouze, Haly Neely, Cristine Morgan, et al.
Airborne and satellite remote sensing can potentially be used to model crop characteristics. However, satellite imagery usually exhibits low spatial and temporal resolution, and manned aircraft imagery, despite improved resolution, is not cost-effective. Recent developments in UAV remote sensing have allowed for imagery at improved spatial resolutions relative to satellites and at a fraction of the cost relative to manned aircraft. Furthermore, UAVs offer potential advantages over proximal soil sensors (e.g., EM-38) in terms of in-season decision making. However, it is unclear at this point whether these benefits translate to higher quality information. This question has relevance within fields that exhibit contrasting environments, such as soil spatial variability. Therefore, the objectives of this paper were twofold: 1) to quantify improvements in UAV-based plant (cotton) modelling relative to proximal sensing (e.g., EM-38), manned aircraft, and satellites (Landsat 8); and 2) to determine how such modeling can be affected by soil spatial variability. Results indicate that UAV imagery shows higher nugget/sill ratios and larger ranges than manned aircraft and satellite imagery. These results have implications for predicting agronomic variables (e.g., yield, plant height), as well as for soil/plant sampling.
Evaluating the capabilities of Sentinel-2 and Tetracam RGB+3 for multi-temporal detection of thrips on capsicum
Jayantrao Mohite, Arvind Gauns, Navin Twarakavi, et al.
Various pests and diseases can deteriorate the quality and yield of capsicum. In order to control these losses, their timely detection is important. Thrips is one of the major pests of capsicum and is difficult to detect in its initial phase because the symptoms are not visible to the naked eye. Thrips not only causes direct plant damage but also vectors serious plant diseases. In this paper, we address the problem of detecting low infestations of thrips on capsicum leaves using multi-temporal hyperspectral remote sensing data simulated to multispectral sensors such as Sentinel-2 and Tetracam RGB+3. Reflectance data from capsicum leaves with healthy and low infestations of thrips were collected using a handheld spectroradiometer. The hyperspectral remote sensing data were collected from 213 bands, with wavelengths ranging from 350 nm to 1052 nm and bandwidths varying from 3.22 nm to 3.346 nm, during the period 17 March to 13 April 2017. Variations observed in the spectral reflectance over time make detection based on multi-temporal data difficult. We evaluated the performance of a tuned random forest classifier for various sets of features: the full feature set of 213 bands, features selected by the Least Absolute Shrinkage and Selection Operator (LASSO) from the 213 bands, features simulated to broad bands similar to Sentinel-2, and features simulated to multispectral bands similar to the Tetracam RGB+3 (a camera that can be mounted on drones). Results suggest that an overall classification accuracy of 92.81% was achieved on the validation dataset using the full feature set, whereas accuracy dips slightly to 90.3%, 85.13%, and 87.45% when using the features selected by LASSO, the bands simulated to Sentinel-2, and the bands simulated to Tetracam RGB+3, respectively. The results imply that the Tetracam RGB+3 and the Sentinel-2 satellite can be effectively used for detection of low infestations of thrips on capsicum.
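Simulating broadband sensors from hyperspectral data, as done above, amounts to aggregating the narrow bands that fall within each broadband's wavelength range. A minimal boxcar-average sketch (the band ranges and spectrum below are illustrative; a faithful simulation would weight by each sensor's actual spectral response function):

```python
import numpy as np

def simulate_broadband(wavelengths, reflectance, band_ranges):
    """Average narrow hyperspectral bands falling inside each broadband range.

    A simple boxcar approximation of sensor band simulation.
    """
    wavelengths = np.asarray(wavelengths, float)
    reflectance = np.asarray(reflectance, float)
    out = []
    for lo, hi in band_ranges:
        mask = (wavelengths >= lo) & (wavelengths <= hi)
        out.append(reflectance[mask].mean())
    return np.array(out)

# Toy spectrum sampled every 10 nm over roughly the study's 350-1052 nm range
wl = np.arange(350, 1051, 10)
refl = np.linspace(0.05, 0.5, wl.size)  # synthetic, monotonically rising reflectance
# Illustrative visible/NIR broadband ranges (nm), not the actual sensor definitions
bands = [(450, 520), (520, 600), (630, 690), (760, 900)]
print(simulate_broadband(wl, refl, bands))
```

The resulting broadband features can then be fed to a classifier such as the random forest used in the paper, which is how the accuracy drop from 213 narrow bands to a few simulated broad bands was measured.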
Poster Session
Using hyperspectral sensors for crop vegetation status monitoring in precision agriculture
Marius Cristian Luculescu, Luciana Cristea, Sorin Constantin Zamfira, et al.
The world is continuously changing. Day by day we are faced with more and more changes in climate, technology, the economy, and society. All of these leave their mark on agroecosystems. Major economic and environmental benefits can be obtained by providing water and nutritional supplements only to those plants that need them, only when they need them, and in proper quantities. To do this, real-time management of agricultural crops is necessary. The paper presents a solution for crop vegetation status monitoring in precision agriculture based on hyperspectral sensors, namely spectrometers, placed on a UAV (Unmanned Aerial Vehicle).
MoniSCAN: software for multispectral monitoring of the crops vegetation status
Marius Cristian Luculescu, Luciana Cristea, Sorin Constantin Zamfira, et al.
Efficient crop management, especially in the context of a continuously changing climate, requires real-time monitoring of soil resources and vegetation dynamics. Part of this process is precision agriculture, which involves investigating crops so that inputs such as water and fertilizer are allocated only to the plants that need them, at the proper moment and in proper quantities. Different solutions for monitoring crop vegetation status are available on the market. Most of them acquire spectral data, process the data, and represent them on maps, offering support for farmer decisions. MoniSCAN is such a software package, developed in a research project.