A team from the New Mexico State University College of Engineering has been tasked with developing machine learning algorithms in support of national defense. Professor David Voelz has been awarded a two-year, nearly $300,000 grant from the Office of Naval Research.
He is leading the project, “Machine Learning-Based Turbulence Analysis and Mitigation for Hyperspectral Imaging,” with Associate Professor Laura Boucheron and Assistant Professor Steven Sandoval from NMSU’s Klipsch School of Electrical and Computer Engineering.
“My colleagues and I at NMSU had been working for a few years on some related projects for atmospheric modeling and light propagation prediction with machine learning,” Voelz said. “At NMSU, we had expertise in image processing, machine learning and atmospheric optics, so we were a good fit for researching this question.”
The project’s objective is to create machine learning algorithms that support the analysis of atmospheric turbulence effects on hyperspectral imaging and advance tools for mitigating those effects in the images. Voelz said the team is interested in a wide spectral sensing range, from 300 nanometers to 10 microns in wavelength, and in imaging over horizontal or slant paths of a few hundred meters to several kilometers that are relatively near the Earth’s surface.
Defense tasks that can be helped by hyperspectral imaging include target detection, recognition and identification, shape extraction, classification and material characterization. For remote sensing tasks, hyperspectral imaging is often applied in nadir, or down-looking, ground survey applications from an aircraft or satellite.
“Our interest is imaging along horizontal or slant paths near the ground for applications that require both high spectral and spatial resolution,” Voelz said. “In this situation, atmospheric turbulence is a significant degrading factor for spatial and spectral resolution as it induces image blur and spectral mixing between objects.
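The spectral mixing Voelz describes can be seen in a toy simulation. The sketch below is purely illustrative and is not the team’s model: it builds a one-dimensional scene of two materials with made-up spectra, then mimics turbulence with a simple averaging blur applied to every spectral band. At the boundary between the materials, the blurred pixel’s spectrum becomes a mixture of both, even though no mixed material exists in the true scene.

```python
import numpy as np

# Toy 1-D scene: left half is material A, right half is material B.
# All spectra and sizes here are made up for illustration.
n_bands, n_pix = 4, 32
spec_a = np.linspace(0.2, 0.8, n_bands)   # illustrative spectrum of material A
spec_b = np.linspace(0.9, 0.1, n_bands)   # illustrative spectrum of material B

scene = np.empty((n_bands, n_pix))
scene[:, : n_pix // 2] = spec_a[:, None]  # left half: pure A
scene[:, n_pix // 2 :] = spec_b[:, None]  # right half: pure B

# Turbulence modeled (very crudely) as a 5-pixel averaging blur per band.
kernel = np.ones(5) / 5
blurred = np.stack(
    [np.convolve(scene[b], kernel, mode="same") for b in range(n_bands)]
)

# At the material boundary, the blurred spectrum is a blend of A and B:
boundary = blurred[:, n_pix // 2]
```

Far from the boundary the blurred spectra remain pure, which is why the degradation is hardest to undo exactly where objects meet.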
“Increasingly, machine learning algorithms have become essential signal analysis tools for hyperspectral imaging cube datasets due to their prediction capabilities, ability to estimate the statistics of classes of interest and speed of operation,” Voelz said. “However, only a few machine learning algorithms have been proposed in the literature for addressing both blur and spectral mixing in hyperspectral imaging data. Our intent is to apply machine learning algorithms to exploit the diversity provided by both the spectral and spatial data to aid in image deblurring and unmixing.”
Voelz said he believes the biggest challenge for the project will be determining whether a machine learning approach can decipher the subtle information buried in the hyperspectral image data to improve the images.
“The learning algorithm probably needs some help, or constraints, so it can arrive quickly at a useful result,” he said. “Finding these constraints is a challenge as this is a new application.
“Although hyperspectral imaging is a very powerful sensing tool for many activities, it produces huge amounts of data, so the development of artificial intelligence or machine learning algorithms is important for getting to the critical information quickly,” Voelz said.