The NEON observatory offers a valuable opportunity to scale understanding of western U.S. forest dynamics from individual trees to entire ecosystems. Applying deep-learning techniques to identify individual plant species from the NEON AOP's high-resolution hyperspectral, light detection and ranging (LiDAR), and red-green-blue (RGB) imagery is a critical step in this scaling. The LiDAR data and RGB imagery provide information about vegetation structure, while the hyperspectral data capture species-specific biochemical signatures that help delineate individual trees spectrally. In addition, unmanned aerial system (UAS) sampling of forests can potentially validate observations from the AOP platform, expand NEON's footprint, and provide a mechanism for capturing disturbance dynamics.
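To make the fusion of these data streams concrete, the sketch below stacks a hyperspectral reflectance cube, a LiDAR-derived canopy height model, and RGB bands for a co-registered tile into a single per-pixel feature array that a convolutional classifier could consume. The tile size, band count, and random placeholder arrays are assumptions for illustration only and do not reflect NEON's actual file formats or data-access tools.

```python
import numpy as np

# Placeholder arrays standing in for co-registered AOP products over one tile;
# the 256 x 256 tile size and ~426-band count are illustrative assumptions.
tile_h, tile_w = 256, 256
hyperspectral = np.random.rand(tile_h, tile_w, 426).astype(np.float32)  # surface reflectance cube
chm = np.random.rand(tile_h, tile_w, 1).astype(np.float32) * 40.0       # LiDAR canopy height model (m)
rgb = np.random.rand(tile_h, tile_w, 3).astype(np.float32)              # high-resolution camera imagery

# Concatenate structural (LiDAR, RGB) and biochemical (hyperspectral) layers
# into one per-pixel feature tensor for downstream classification.
features = np.concatenate([hyperspectral, chm, rgb], axis=-1)
print(features.shape)  # (256, 256, 430)
```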
Results/Conclusions
Preliminary efforts demonstrate that fully convolutional neural networks (FCNNs) can classify the variable spectral signatures of individual trees at NEON's San Joaquin Experimental Range site with 66% accuracy. These efforts will help inform the development of a new package, Tensorphloem, built on Google's open-source machine learning library TensorFlow. A key limitation of this work is that, unlike fields such as medical imaging, advertising, autonomous driving, and natural language processing that routinely capitalize on FCNNs, the ecological community has no well-labeled, publicly available hyperspectral datasets with species identifications to serve as benchmarks for comparing the performance of machine learning classification algorithms. This effort is a collaboration between Earth Lab, a new data synthesis center for Earth systems research, and the Integrated Remote and In Situ Sensing (IRISS) initiative at CU Boulder. This cross-disciplinary effort across ecology, engineering, and data science will provide statistical advances and explore emerging technologies to better capture vegetation changes from single trees to entire forest systems in the western U.S.
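As a rough sketch of what such an FCNN looks like in TensorFlow's Keras API, the example below builds a small all-convolutional network that maps a hyperspectral tile to a per-pixel species probability map. The band count, number of species classes, and layer widths are illustrative assumptions, not the configuration used at San Joaquin Experimental Range, and the model shown is not the Tensorphloem implementation.

```python
import tensorflow as tf

# Illustrative constants: NEON AOP hyperspectral cubes contain roughly 426 bands;
# the number of species classes here is a placeholder.
NUM_BANDS = 426
NUM_SPECIES = 10

def build_fcnn(num_bands=NUM_BANDS, num_classes=NUM_SPECIES):
    """A minimal fully convolutional classifier: because every layer is a
    convolution, the model accepts tiles of any spatial size and returns a
    per-pixel species probability map of the same height and width."""
    inputs = tf.keras.Input(shape=(None, None, num_bands))
    x = tf.keras.layers.Conv2D(64, 1, activation="relu")(inputs)              # per-pixel spectral mixing
    x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(x)   # local spatial context
    x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    outputs = tf.keras.layers.Conv2D(num_classes, 1, activation="softmax")(x) # species probabilities
    return tf.keras.Model(inputs, outputs)

model = build_fcnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(tiles, per_pixel_labels, ...)  # labels would come from field-surveyed crowns
```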