By integrating an optical Micro-Electro-Mechanical Systems, or MEMS, chip into an iPhone camera, researchers at the VTT Technical Research Centre of Finland have developed a new, cost-effective kind of hyperspectral technology. The spectral device will provide mobile device users and consumers with new ways to monitor their environments, including quick food analysis, health checks, and other Internet-connected sensing. Research team leader Anna Rissanen works actively with companies to enable commercialization and new business development based on the team's various sensors.

The optical MEMS chip (shown) is integrated into the camera. (Credit: VTT)
Anna Rissanen

Sensor Technology: What is hyperspectral imaging, and why is it valuable?

Anna Rissanen: Hyperspectral imaging allows access to both spatial and spectral data in each pixel of an image. Spectral data offers a versatile way of sensing various objects, analyzing material properties, and measuring the environment.
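To picture what Rissanen describes, a hyperspectral image can be treated as a data cube: two spatial axes plus one spectral axis, so every pixel carries a full spectrum rather than just three color values. The short sketch below is a generic NumPy illustration of that structure, not VTT code; the cube dimensions and wavelength range are arbitrary.

```python
import numpy as np

# Hypothetical hyperspectral cube: 480 x 640 pixels, 40 spectral bands.
# Each pixel (y, x) holds a full spectrum of 40 reflectance values.
cube = np.random.rand(480, 640, 40)
wavelengths_nm = np.linspace(450, 550, 40)  # band-centre wavelengths (illustrative)

# Spatial view: a single-band grayscale image at one wavelength.
band_image = cube[:, :, 20]            # shape (480, 640)

# Spectral view: the full spectrum recorded at one pixel.
pixel_spectrum = cube[240, 320, :]     # shape (40,)

print(f"Band at {wavelengths_nm[20]:.1f} nm, image shape {band_image.shape}")
print(f"Spectrum at pixel (240, 320): {pixel_spectrum[:5]} ...")
```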

ST: How can a consumer benefit from easy, low-cost hyperspectral imaging?

Rissanen: Hyperspectral imaging integrated into mobile devices or Internet-of-Things (IoT) environments — intelligent homes, appliances, and smart vehicles — could enable novel sensing applications, including real-time analysis of an environment's air quality, health-related sensing of skin or blood circulation, or freshness monitoring of food products. Generating novel applications in the future, however, requires not just smart, low-cost sensor hardware, which we aim to demonstrate, but also intelligent imaging algorithms that allow users to extract the relevant information from the spectral data.

The hyperspectral technology can be used to authenticate bank notes. (Credit: VTT)

ST: How did you convert an iPhone into an optical sensor? How is spectral data obtained?

Rissanen: The system is based on a MEMS Fabry-Perot interferometer (FPI) tunable optical filter chip, fixed in front of the 8-megapixel iSight camera of the iPhone 5s. Communication between the MEMS FPI module and the iPhone uses Bluetooth. The realized demonstrator is external, since it is not possible for researchers like us to change the hardware inside the iPhone. Because the MEMS FPI chip is so small, however, it could be integrated directly within the phone at the packaging level.
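A rough sketch of how such a filter-plus-camera demonstrator could build up a spectral cube is shown below. It assumes a hypothetical control interface (`FabryPerotFilter`, `Camera`); VTT's actual module and Bluetooth protocol are not described in this interview, so the names and calls are illustrative only.

```python
import numpy as np

class FabryPerotFilter:
    """Hypothetical stand-in for the Bluetooth-controlled MEMS FPI module."""
    def set_passband(self, wavelength_nm: float) -> None:
        # In a real system this would send a tuning command (e.g. over
        # Bluetooth) that adjusts the FPI gap to pass this wavelength.
        pass

class Camera:
    """Hypothetical stand-in for the phone camera behind the filter."""
    def capture_frame(self) -> np.ndarray:
        # Returns one grayscale frame; the filter in front of the lens
        # determines which wavelength band reaches the sensor.
        return np.zeros((480, 640))

def acquire_cube(fpi: FabryPerotFilter, cam: Camera, wavelengths_nm):
    """Step the tunable filter through the requested bands and stack the
    resulting frames into a (height, width, bands) hyperspectral cube."""
    frames = []
    for wl in wavelengths_nm:
        fpi.set_passband(wl)
        frames.append(cam.capture_frame())
    return np.stack(frames, axis=-1)

cube = acquire_cube(FabryPerotFilter(), Camera(), np.linspace(450, 550, 20))
print(cube.shape)  # (480, 640, 20)
```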

ST: What kinds of observations can be made with such a device?

Rissanen: Through the iPhone camera, we can only access the spectra of visible wavelengths. For the demonstrator, we chose to use wavelengths between 450 nm and 550 nm, which limits the applications mostly to authentication. For actual product prototypes, however, it is possible to extend the tuning range to cover the visible spectrum by using two MEMS FPI chips. By substituting the visible MEMS chips with another type of MEMS FPI that accesses visible and near-infrared (VNIR) wavelengths, there are much more diverse possibilities for observing spectral fingerprints. Hyperspectral imaging of bank notes for authentication purposes, for example, can be done with raw images. There are not many immediately possible applications beyond that, however, as most applications require the development of algorithms that extract the relevant information.
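One standard way to compare a measured spectrum against a reference fingerprint, for instance the expected spectrum of a genuine bank-note feature, is the spectral angle between the two. The sketch below illustrates this general technique; it is not necessarily the method used in VTT's demonstrator, and the reference values and threshold are invented for illustration.

```python
import numpy as np

def spectral_angle(measured: np.ndarray, reference: np.ndarray) -> float:
    """Angle (radians) between two spectra; a small angle means similar shape.
    Using the angle makes the comparison insensitive to overall brightness."""
    cos_sim = np.dot(measured, reference) / (
        np.linalg.norm(measured) * np.linalg.norm(reference))
    return float(np.arccos(np.clip(cos_sim, -1.0, 1.0)))

# Illustrative reference fingerprint and two test spectra (20 bands).
reference = np.array([0.10, 0.12, 0.15, 0.20, 0.30, 0.45, 0.60, 0.70, 0.75, 0.78,
                      0.80, 0.79, 0.75, 0.68, 0.60, 0.50, 0.40, 0.30, 0.20, 0.15])
genuine = reference * 0.9 + np.random.normal(0, 0.01, 20)  # same shape, dimmer
suspect = reference[::-1]                                   # different shape

threshold = 0.1  # radians; a real threshold would be calibrated on measured data
for name, spectrum in [("genuine", genuine), ("suspect", suspect)]:
    angle = spectral_angle(spectrum, reference)
    print(f"{name}: angle = {angle:.3f} rad -> "
          f"{'match' if angle < threshold else 'no match'}")
```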

ST: How will this technology enable Internet-connected/“Internet-of-Things” applications?

Rissanen: IoT applications, such as smart homes and appliances, could use hyperspectral imaging sensors to monitor food spoilage, to assess air quality, or to provide data concerning environmental conditions, for example from drones. IoT applications that will benefit from the environmental data obtained from sensor networks can only arise after the technology is commercialized.

ST: How can vehicles and drones use the system?

Rissanen: Smart vehicles could benefit from hyperspectral imaging to make them more reliable. Drones can use hyperspectral imagers for precision agriculture, for example to measure fertilization levels, or for forestry.
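As a generic example of the kind of quantity a drone-mounted VNIR imager can derive, the sketch below computes the widely used Normalized Difference Vegetation Index (NDVI) from red and near-infrared reflectance bands. This is a standard index for assessing crop vigor, not an analysis specific to VTT's imagers, and the band data here are synthetic.

```python
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Healthy vegetation reflects strongly in the near-infrared and absorbs
    red light, so higher NDVI indicates more vigorous growth."""
    return (nir - red) / (nir + red + 1e-9)

# Illustrative reflectance images for one field (e.g. bands near 670 nm
# and 800 nm extracted from a hyperspectral cube).
red_band = np.random.uniform(0.05, 0.2, size=(100, 100))
nir_band = np.random.uniform(0.3, 0.6, size=(100, 100))

index_map = ndvi(red_band, nir_band)
print(f"Mean NDVI over field: {index_map.mean():.2f}")
```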

ST: In the future, with this kind of device, what kinds of monitoring (via phone) will be possible?

Rissanen: This depends on what type of hyperspectral imagers end up as commercial products and whether the application-targeted algorithm development becomes open-access, allowing various app developers to invent new capabilities with the data. We already know that hyperspectral imaging can be used for various health-related measurements, such as screening moles for skin cancer detection; a separate hand-held instrument for this is already in the commercialization phase. Whether the mobile phone could do the same depends on the final specifications of the sensor and how the technology is commercialized.

A hyperspectral imager module for nanosatellite/CubeSat applications. (Credit: VTT)

ST: What kinds of hyperspectral space applications are possible?

Rissanen: Besides the hyperspectral imager sensor modules for mobile applications, we have also been developing small, lightweight, CubeSat-compatible hyperspectral camera modules, which enable fast, high-resolution imaging and scale easily to multiple remote sensing applications from space. We have created a miniature hyperspectral imager to be launched on the Aalto-1 CubeSat in 2017, as well as for PICASSO Vision [a European Space Agency nanosatellite mission for atmospheric and space science observations].

Programmable wavelength selection of the imaging instruments reduces the downlink data, while customization of the operating wavelengths enables a large variety of applications with the same camera hardware: environmental measurements of forests and agriculture, monitoring of waters, atmospheric gas sensing, and the potential for generating new spectral-data-based applications through networked, affordable space instruments. The hardware technology is also scalable for commercial manufacturing in volume.
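A back-of-the-envelope illustration of why programmable wavelength selection matters for the downlink is given below. All numbers are assumptions chosen for the example, not Aalto-1 or PICASSO specifications.

```python
# Rough illustration of how on-board band selection shrinks the downlink,
# using assumed numbers rather than actual mission specifications.
width, height = 1024, 1024        # assumed spatial resolution
bits_per_pixel = 12               # assumed sensor bit depth

full_cube_bands = 100             # bands the tunable imager could record
selected_bands = 6                # bands actually programmed for one task

def cube_size_mb(bands: int) -> float:
    """Raw data volume of one scene, in megabytes."""
    return width * height * bands * bits_per_pixel / 8 / 1e6

print(f"Full cube:      {cube_size_mb(full_cube_bands):.0f} MB per scene")
print(f"Selected bands: {cube_size_mb(selected_bands):.1f} MB per scene")
```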

ST: What was your role in the development of the optical sensor?

Rissanen: I started as a designer and process integrator to create the MEMS FPI chips, which are used in this demo but also in various other spectral sensing applications. Currently, as a research team leader, I work actively with companies to enable commercialization and new business development based on the various sensors created by the team.

ST: What's next with the development of the hyperspectral technology?

Rissanen: As a research institute, VTT does not sell products, but we aim to commercialize the developed sensor technologies by interacting with industrial partners. We hope that, like other hyperspectral imager prototypes we have previously developed, this too will be successfully commercialized. With this demo, we don't aim to demonstrate individual applications, but to show that the sensor hardware has reached a certain level of maturity, which, in combination with commercialization efforts and data algorithm development, offers potential for the future.
