A new proof-of-concept design retires one of the most familiar parts of a traditional camera: the lens. By replacing the glass lens with a tiny array of light receivers, a California Institute of Technology team believes its thinner, lighter design could support a new wave of ubiquitous imaging.

A conventional camera’s curved lens bends, or refracts, incoming light so that it can be focused onto film or a sensor. That rounded shape, however, has prevented manufacturers from creating truly flat imagers, even on the latest (and thinnest) iPhones.

Instead of a lens, the Caltech researchers, led by Professor Ali Hajimiri, used an ultra-thin optical phased array (OPA) to manipulate incoming light and capture an image.

What is an Optical Phased Array?

The OPA chip placed on a penny for scale. (Credit: Caltech/Hajimiri Lab)

Phased arrays, commonly employed in wireless communication and radar applications, are collections of individual transmitters – each sending out the same signal. By staggering the timing of transmissions made at various points across the device, the array’s tightly focused signal beam can be steered in desired directions.
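That timing trick is simple enough to sketch in a few lines. Below is a minimal example of computing the staggered delays for a uniform linear array; the element count, spacing, and steering angle are illustrative assumptions, not the Caltech chip’s actual parameters:

```python
import numpy as np

SPEED_OF_LIGHT = 3.0e8  # m/s
N_ELEMENTS = 8          # hypothetical element count
SPACING = 0.01          # hypothetical element spacing, in meters

def steering_delays(theta_deg):
    """Per-element firing delays that tilt the combined beam by theta.

    Element n fires n * d * sin(theta) / c later than element 0, so the
    individual wavefronts line up along the steered direction.
    """
    theta = np.radians(theta_deg)
    return np.arange(N_ELEMENTS) * SPACING * np.sin(theta) / SPEED_OF_LIGHT

delays = steering_delays(30.0)  # seconds; element 0 has zero delay
```

Because the delays are just numbers fed to each element, re-aiming the beam means recomputing them, with no moving parts involved.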

Caltech’s optical phased array receiver works on a similar principle in reverse. Light signals collected by the array’s individual receivers combine to form a focused, controllable “gaze” where the waves reinforce one another. By adding a tightly controlled time delay to the light being received, an operator can selectively change the camera’s direction and focus.
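The receive side can be illustrated the same way: sum the element signals after applying compensating phases, and the output is largest when the chosen “look” angle matches the light’s arrival angle. This toy narrowband beamformer uses a radio-frequency carrier and made-up array parameters purely for illustration, not anything from the actual device:

```python
import numpy as np

SPEED_OF_LIGHT = 3.0e8  # m/s
N_ELEMENTS = 8          # hypothetical element count
SPACING = 0.01          # hypothetical element spacing, in meters
FREQ = 10e9             # 10 GHz carrier for illustration (light is ~hundreds of THz)

def array_response(arrival_deg, look_deg):
    """Normalized gain for a wave arriving at arrival_deg when the
    receiver is electronically steered toward look_deg."""
    pos = np.arange(N_ELEMENTS) * SPACING
    # Phase of the incoming plane wave at each element position
    incoming = 2 * np.pi * FREQ * pos * np.sin(np.radians(arrival_deg)) / SPEED_OF_LIGHT
    # Compensating phase the receiver applies for its chosen look direction
    applied = 2 * np.pi * FREQ * pos * np.sin(np.radians(look_deg)) / SPEED_OF_LIGHT
    # Coherent sum: the phases cancel (gain -> 1.0) only when the angles match
    return abs(np.sum(np.exp(1j * (incoming - applied)))) / N_ELEMENTS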

“The beauty of this thing is that if I change these delays, which I can do electronically and very rapidly, I can change where I’m looking, without moving any mechanical object,” said Hajimiri, Bren Professor of Electrical Engineering and Medical Engineering in the Division of Engineering and Applied Science at Caltech.

Shifting Views

With today’s cameras, switching between a wide and a narrow field of view requires swapping lenses. Even an iPhone 7 Plus, for example, features two cameras to enable both wide-angle and zoom options.

By adjusting the delays to detect different points in space, users of the Caltech device can quickly scan an entire scene or select a small fraction of the field of view, shifting instantaneously between “fish-eye” and telephoto modes.

Caltech’s single layer of integrated silicon photonics also reduces the thickness and weight of the device – a valuable and important improvement for future imaging applications.

“A good chunk of the weight of the camera and imaging system is in the glass that you have, the housing for the glass, and the mechanically stable system that you need to hold it in place,” said Hajimiri. “With this proof of concept, you eliminate all of that.”

Because the optical phased array is made from silicon, the thin devices are potentially lower cost when made in large volumes.

A Look to the Future

The ability to control a camera’s optical properties without any mechanical movement, lenses, or mirrors could change the very idea of what we consider a camera to be, according to the Caltech professor, who imagines OPA imagers that look like wallpaper, blinds, or wearable fabric.

The proof-of-concept device, he said, may even support new astronomy applications by enabling enormous yet ultra-light, ultra-thin flat telescopes on the ground or in space.

“If you can make these planar sheets that expand and deploy in space, you could have a very large system that would be looking at the sky, essentially a giant camera,” said Hajimiri. “And you could get the resolution that would not be possible from a regular space telescope, both from mass and size considerations.”

The team will work on scaling up the camera by designing chips that enable larger receivers with higher resolution and sensitivity.

“Once scaled up, this technology can make lenses and thick cameras obsolete,” said graduate student and fellow researcher Behrooz Abiri (MS ’12) in a press release.

What do you think? Could lenses and thick cameras become obsolete? Share your thoughts below.