Remote-sensing advances with LiDAR and hyperspectral data: proximal sensing improvements in energy sorghum
Siebers, Matthew; Fu, Peng; McGrath, Justin; Ort, Donald; Bernacchi, Carl
Abstract
Tools for monitoring plant growth have been advancing at scales ranging from the leaf to the satellite. Here we focus on 1) improvements in the use of proximal sensors, light detection and ranging (LiDAR) and hyperspectral cameras, for monitoring agricultural environments and 2) improvements in data capture and data processing. Data were collected on the candidate biofuel crop energy sorghum, grown at the University of Illinois in 2018 and 2019. Eight LiDAR (Hokuyo LX-10) measurements were completed on every sorghum variety in 2018 and 16 in 2019; one full-field measurement covered ~950 varieties grown over two hectares. Using a ground-based LiDAR, we developed techniques that rapidly measured several important elements of growth: plant height, stand count, and leaf area index. The result was a 3D digital time series of plant growth from which many additional growth characteristics can potentially be extracted. Hyperspectral cameras (300-1700 nm, Middleton Hyperspectral) can be used at the canopy level to measure a large number of phenotypes, and the shape of the hyperspectral signature can be analyzed to determine which wavelengths are most important for measuring a given trait. All 950 varieties of sorghum were scanned twice during the 2019 season. Data processing routines were developed to separate shaded leaves, sunlit leaves, and soil. Partial least squares regression was then used to relate the hyperspectral scans to photosynthetic parameters, leaf thickness, chlorophyll content, and carbon and nitrogen content. Future work includes generating additional phenotypes from the LiDAR and hyperspectral scans, using improved sensors such as multi-band LiDAR, and automating data collection. A recently installed cable-based phenotyping platform at the University of Illinois will allow a full suite of high-end sensors to routinely navigate over four hectares of experimental plots. Data capture, analysis, sensor development, and automation require collaboration among scientists, engineers, and computer scientists. Those collaborations are starting to yield exciting results.
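To illustrate the kind of LiDAR-derived growth metrics described above, the following is a minimal sketch, not the authors' pipeline, of estimating canopy height for a single plot. It assumes the point cloud has already been exported as an (N, 3) NumPy array of x, y, z coordinates in metres with the ground levelled near z = 0; the function name and quantile thresholds are illustrative choices.

```python
# Minimal sketch (not the authors' pipeline) of deriving canopy height from a
# ground-based LiDAR point cloud. Assumes an (N, 3) array of x, y, z values in
# metres with the ground plane already levelled at roughly z = 0.
import numpy as np

def plot_canopy_height(points: np.ndarray,
                       ground_quantile: float = 0.01,
                       top_quantile: float = 0.99) -> float:
    """Estimate canopy height as the spread between a near-ground and a
    near-top height quantile; quantiles are used instead of min/max to
    reduce sensitivity to stray returns and noise."""
    z = points[:, 2]
    ground = np.quantile(z, ground_quantile)
    top = np.quantile(z, top_quantile)
    return float(top - ground)

# Example with synthetic points standing in for one sorghum plot.
rng = np.random.default_rng(0)
cloud = rng.uniform([0, 0, 0], [1.5, 3.0, 2.2], size=(5000, 3))
print(f"Estimated canopy height: {plot_canopy_height(cloud):.2f} m")
```

Stand count and leaf area index would require additional steps (e.g. clustering returns into individual plants, or relating return density to gap fraction), which are outside the scope of this sketch.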
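The partial least squares regression step can likewise be sketched with standard tooling. The example below is an assumption-laden illustration, not the authors' code: it uses scikit-learn's PLSRegression on synthetic per-plot mean reflectance spectra standing in for the real hyperspectral data, with a made-up trait vector in place of measured leaf nitrogen or chlorophyll content.

```python
# Minimal sketch of relating per-plot mean reflectance spectra to a measured
# trait with partial least squares regression. The data here are synthetic
# placeholders; band counts, component number, and variable names are
# illustrative assumptions rather than the study's settings.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_plots, n_bands = 120, 250                    # plots x spectral bands
X = rng.normal(size=(n_plots, n_bands))        # mean reflectance per plot
# Synthetic trait driven by two bands plus noise, standing in for a lab trait.
y = 0.8 * X[:, 40] + 0.5 * X[:, 120] + rng.normal(scale=0.2, size=n_plots)

pls = PLSRegression(n_components=10)

# Cross-validated R^2 gives a first look at how predictive the spectra are.
scores = cross_val_score(pls, X, y, cv=5, scoring="r2")
print(f"Mean cross-validated R^2: {scores.mean():.2f}")

# After fitting, the loading weights indicate which wavelengths contribute
# most to the prediction, mirroring the "important wavelengths" analysis
# mentioned in the abstract.
pls.fit(X, y)
important_bands = np.argsort(np.abs(pls.x_weights_[:, 0]))[::-1][:5]
print("Top band indices (component 1):", important_bands)
```

In practice the number of PLS components and the spectral preprocessing (e.g. masking shaded-leaf and soil pixels first, as described above) strongly affect these results and would be tuned against the field measurements.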