Figure 1. Hyperspectral Cube

The concept of remote sensing, defined as gathering information about an object or area from a distance, has been a major endeavor from the very beginning of space exploration. With observation from outer space came a myriad of opportunities that not only answered questions about other worlds, but also allowed us to explore our own, including a better understanding of the Earth’s surface: land, water, and atmosphere.

This all began in 1959, when Explorer 6 sent back one of the first images of Earth from space. Space remote sensing has since advanced rapidly, with the first dedicated weather satellite in 1960 and the introduction of the Landsat series in 1972. Today, electro-optical (E-O) imagery is readily available in many formats from government and commercial providers, whose products offer value-added information about the Earth’s activity and condition. Typical areas of application include weather forecasting, water quality, land use and land-use change, vegetative health in agriculture and forestry, pollutant detection and monitoring, military assessments, and disaster assessment and monitoring.

As space-based digital camera imagery is evolving, more and more advanced imaging technologies are being deployed in space to see more of the “unseen.” Hyperspectral Imaging (HSI) is one of those emerging remote sensing technologies.

What is Hyperspectral?

The sun emits a continuous spectrum of electromagnetic (EM) energy, of which the human eye senses only a very short range around its peak emission in the visible wavelengths (~400-650 nanometers). Solar EM emissions span a significant spectral wavelength range, from high-energy X-rays through ultraviolet, visible, and infrared light to radio waves. Modern framing-array digital cameras can generally sense energy in limited portions of this spectrum, typically in broad spectral bands wider than 100 nanometers, sometimes with ground spatial resolution finer than a meter.

Newer to the scene of remote sensing are hyperspectral sensors, which gather data in a different way. They collect narrow-band (less than 10 nanometers) spectral information from 400 nm to 2500 nm in the case of reflective-band imaging, meaning the energy detected by the sensor originated with the sun. Just as a prism separates sunlight into its component wavelengths, a specially constructed spectrometer can separate incoming light into hundreds of narrow bands. Each wavelength band then falls onto a specialized focal plane array, whose detector material records the incoming spectral light, which carries spectroscopic information about the object or material the sunlight reflected off of or passed through.
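The resulting data structure is often called a hyperspectral "cube" (see Figure 1): two spatial axes plus one spectral axis. A minimal sketch of that layout, using illustrative dimensions (the scene size and exact band count are assumptions, not any specific sensor's specifications):

```python
import numpy as np

# Hypothetical scene: 100 x 100 spatial pixels, with narrow (~10 nm) bands
# spanning the reflective range described above (400-2500 nm).
rows, cols = 100, 100
band_centers = np.arange(400, 2500, 10)   # nm; one center per narrow band
n_bands = band_centers.size               # 210 bands at this spacing

# The cube: two spatial axes plus one spectral axis.
cube = np.zeros((rows, cols, n_bands), dtype=np.float32)

# Each spatial pixel holds a full reflectance spectrum:
pixel_spectrum = cube[0, 0, :]            # shape (n_bands,)
print(cube.shape, pixel_spectrum.shape)
```

A traditional color camera would store only three values (red, green, blue) per pixel; here each pixel carries hundreds, which is what makes the spectral fingerprinting described below possible.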

What distinguishes hyperspectral from other imaging systems is its ability to record hundreds of narrow wavelength bands that capture the spectral “fingerprint” of the materials within each spatial pixel. These fingerprints are made up of spectral absorption features that are exploited to characterize and identify those materials or objects. This ability to simultaneously measure a large number of wavelengths enables more information (e.g. confidence of material ID, greater number of materials, etc.) to be extracted from an image.

For example, consider a standard image of three green surfaces: a green tarp, a grassy field, and a green painted panel. In a digital photo all three may appear similar, just different shades of green. In hyperspectral imagery, these materials look very different. Solar energy at different wavelengths interacting with these surfaces is absorbed at different locations in the spectrum (according to their molecular make-up) and over various bandwidths, so their reflectance spectra differ. These small differences across the spectrum allow a hyperspectral sensor to distinguish targets that a traditional digital camera would declare the same.
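One common way to compare such fingerprints is the spectral angle: the angle between two spectra treated as vectors, which is small for similar materials and large for different ones. A sketch with toy five-band spectra (the reflectance values are illustrative inventions, not real measurements; real sensors would use hundreds of bands):

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two reflectance spectra; 0 means identical shape."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# Toy spectra: both "look green" in the visible bands (index 1),
# but vegetation reflects strongly in the near-infrared while paint does not.
grass       = np.array([0.05, 0.25, 0.08, 0.60, 0.30])
green_paint = np.array([0.06, 0.24, 0.09, 0.12, 0.10])

print(spectral_angle(grass, grass))        # ~0: same material
print(spectral_angle(grass, green_paint))  # clearly > 0: distinguishable
```

A three-band RGB camera sees mostly the visible values, where these spectra nearly agree; the extra bands are what separate them.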

Figure 2. Methane emission detection captured by OSK over a landfill in California, USA.

Hyperspectral imagery, like standard digital imagery, is a passive sensing system, meaning the sensor provides no powered light source of its own. It relies on ambient light to illuminate the scene, whether from the sun, the moon, or another natural or man-made source. Light from the ambient source(s) reflects off a target, such as the surface of the Earth, and is then captured by the sensors on the satellite.

The first hyperspectral satellite, Hyperion, was sent to orbit by NASA in 2000 and had a ground sampling distance (GSD) of 30 m. Since then, the technology has rapidly improved. Orbital Sidekick (OSK) has invested significantly in this leading-edge technology and pushed the development envelope, improving both spatial resolution and spectral noise performance to the limits of what can be deployed in a small satellite. It has also built new analytical capabilities for imagery exploitation and new products.

OSK’s planned GHOSt constellation of payloads is now in production and is anticipated to have a GSD of 8.3 meters, a greater than 70% improvement in spatial resolution over Hyperion. In terms of cost, the NASA Hyperion program spent $70-$100 million to develop and launch its sensor; OSK has been able to reduce that cost by two orders of magnitude. These advances have made hyperspectral competitive with other space-based remote sensing technologies.
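The resolution figure follows directly from the two GSD values quoted above; a quick check of the arithmetic:

```python
hyperion_gsd = 30.0   # meters, NASA Hyperion (2000)
ghost_gsd    = 8.3    # meters, anticipated for the GHOSt constellation

# Fractional improvement in ground sampling distance:
improvement = 1 - ghost_gsd / hyperion_gsd
print(f"{improvement:.1%}")   # about 72%, i.e. "greater than 70%"
```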

Figure 3. Spectral comparison of various objects and paints.

One of the major challenges of collecting hyperspectral data is storing and transmitting the enormous file sizes inherent in such high-dimensional data. Because HSI records hundreds of bands for every pixel, the data is very detailed but also very large. This data must be stored onboard the satellite and then transmitted to the ground, which can be expensive. Therefore, in addition to the challenges of gathering useful data, one must also deal with the volume of data.
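A back-of-the-envelope estimate shows how quickly the volume grows. All numbers here are illustrative assumptions, not OSK specifications:

```python
# Raw data volume for a single hyperspectral scene.
rows, cols = 2000, 2000   # spatial pixels
bands      = 200          # narrow spectral bands
bits       = 14           # bits per sample (a common detector depth)

raw_bits = rows * cols * bands * bits
raw_gb   = raw_bits / 8 / 1e9
print(f"{raw_gb:.1f} GB per scene before compression")
```

A comparable three-band RGB image would be nearly two orders of magnitude smaller, which is why onboard processing and selective downlink, as described next, matter so much for hyperspectral missions.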

There are a number of ways to decrease data size. OSK’s approach is to develop a space-based computing platform that, coupled with OSK’s custom algorithms, can analyze the data in space, extract meaningful information, and transmit only the useful parts, thereby substantially reducing the volume of data sent to the ground. Certain customers need information quickly, such as when an oil leak occurs or a forest fire changes direction. By analyzing the data onboard, results can be delivered sooner, giving decision makers faster access to vital information.

Figure 4. OSK’s GHOSt Constellation.

In addition to gathering data, OSK has the ability to exploit it and extract valuable information to generate products that can be customized for each customer. By understanding each customer’s needs in terms of what information is important, how that data or information needs to be delivered, and what format is required, OSK’s products provide substantial value to our customers.

Remote sensing companies tend to be either a data provider or an analytics provider, the latter ingesting only data made available by the former. Because hyperspectral data typically requires specialized expertise for proper analysis, we’ve constructed OSK to be both a data acquisition company and an analytics company. This allows us to offer custom and unique solutions to our customers.

The need for persistent monitoring is ever-growing. OSK’s ability to reduce overall satellite cost and continuously advance state-of-the-art technology enables us to deploy more highly capable sensors, allowing for persistent monitoring of assets on Earth. This, coupled with a software platform that allows customers to task the satellites to monitor their assets, provides another competitive advantage over those with limited capacity. As the remote sensing field continues to change and grow, Orbital Sidekick continues to evolve, meeting the growing needs of customers with newer, better, and faster capabilities.

This article was written by Abraham Mathew, VP of Engineering; Dr. Lee Sanders, Senior Imaging Systems Engineer; and Kevin Brodie, Lead ML/Data Scientist; Orbital Sidekick (OSK) (San Francisco, CA). For more information, visit here.