The last thing you probably do when a fly buzzes toward you is marvel at its graceful wing and body kinematics. But that’s exactly what researchers at Cornell University’s Itai Cohen Group do on a daily basis.

One of the group’s ongoing areas of study is biolocomotion—in particular, how living organisms, from individual insects to groups of people, navigate space. Although the process is lost on most of us, flyswatter or rolled-up newspaper in hand, the flight of flapping insects is actually incredibly complex. Because of aerodynamic instabilities, stable flapping flight requires fast corrective actions. “If the fly isn’t constantly adjusting its pattern of flight, it will fall out of the sky quickly,” says Samuel Whitehead, a graduate student at the lab. “But flies beat their wings at just over 200 Hz, or 200 times per second. In order to observe any subtle changes to the fly’s orientation, we need fast and accurate quantitative measurements.”

That’s where Phantom high-speed cameras come in. The lab uses them to analyze how flies maintain control during flight and recover from mid-air stumbles. Not only does this research shed light on one of the animal kingdom’s most complex yet beautiful processes, but the results can also be applied to the development of future micro air vehicles.

Like Balancing A Stick on Your Finger

For flies, flapping creates the greatest instability along the roll axis, the axis of rotation running from the front of the fly’s body to the back—or, in aircraft terms, from nose to tail. According to the lab, if a fly did not actively control its roll angle, it would roll over and crash within just four wing-beats. “It’s like balancing a stick on your finger,” Whitehead explains. “Flapping flight is subject to many rapid instabilities that must be constantly controlled to achieve mid-air stability.”

But because insect flight and roll control occur faster than the human eye can see, the researchers utilized high-speed photography to film the fly’s corrective maneuvers and measure its wing and body kinematics. They found that not only do flies exhibit incredible roll control, but they’re able to perform extreme maneuvers to regain it even if they’re perturbed during flight—making their roll correction reflex the fastest in the animal kingdom. These perturbations could be a simple gust of wind or, in the case of the lab’s biolocomotion experiments, a magnetic pulse.

For the fruit fly experiments, the Itai Cohen Group used three older Phantom camera models. But the lab recently acquired a newer model, the Phantom VEO 710. This small yet powerful camera features a 1-megapixel, 12-bit CMOS sensor and over 7 gigapixels/second of throughput—translating to recording speeds of 7,400 frames per second (fps) at full 1,280 × 800 resolution, 680,000 fps at reduced resolutions and 1,000,000 fps with the FAST option. Housed in a compact, 5-inch cube, it is a versatile camera for laboratory settings. And with up to 72 GB of RAM, the VEO 710 can capture more frames than the lab’s previous high-speed cameras. In multi-cine mode, its internal memory can be segmented up to 63 times for fast, uninterrupted capture of shorter events.
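As a back-of-the-envelope check, the quoted full-resolution frame rate is consistent with the stated sensor throughput. The sketch below is simple arithmetic on the figures above, not vendor code:

```python
# Throughput check: pixels per frame x frames per second.
def throughput_gpx(width, height, fps):
    """Pixel throughput in gigapixels per second."""
    return width * height * fps / 1e9

# Full-resolution mode quoted above: 7,400 fps at 1,280 x 800.
full_res = throughput_gpx(1280, 800, 7400)
print(f"{full_res:.2f} Gpx/s")  # 7.58 Gpx/s, i.e. over 7 gigapixels/second
```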

In addition, this combination of fast frame rates and larger frame size—1,280 × 800 resolution versus 512 × 512 resolution—makes the VEO 710 a better choice for performing detailed motion analysis of faster, more complex subjects. “Having this new camera makes it possible to study new things like mosquitoes, which flap their wings four times faster than fruit flies,” Whitehead says.

Observing Roll Control

In a series of experiments, the researchers glued magnets to the backs of 15 common fruit flies and then released them into a transparent cubic chamber. To “trip” the insects in mid-air, they applied a 5-ms vertical magnetic pulse using two Helmholtz coils located on the floor and ceiling of the chamber. Using three Phantom cameras, the researchers recorded the process, including the mid-air stumble and recovery, at 8,000 frames per second and 512 × 512-pixel resolution.

The three Phantom cameras, which were positioned orthogonally around the test chamber, together provided a true 3D view of fly wing and body kinematics—but not without some advanced image processing work on the researchers’ part.

Phantom VEO 710 high-speed camera

To extract the 3D data from the cameras, the researchers first calibrated the cameras to get an estimate of the direct linear transformation (DLT) between pixel and space coordinates for each camera. Next, they used image processing software to extract fly features from the rest of the image for each video frame. The image processing step also binarized the images, separating fly wings and bodies into separate, color-coded images.
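The calibration step above can be sketched in a few lines. This is a generic illustration of the standard 11-parameter DLT mapping from space to pixel coordinates, not the lab’s actual code; the function name and the coefficient list `L` are hypothetical:

```python
def dlt_project(L, point):
    """Map a 3D point to 2D pixel coordinates using the standard
    11-parameter direct linear transformation (DLT).
    L: the 11 calibration coefficients estimated for one camera."""
    x, y, z = point
    denom = L[8] * x + L[9] * y + L[10] * z + 1.0
    u = (L[0] * x + L[1] * y + L[2] * z + L[3]) / denom
    v = (L[4] * x + L[5] * y + L[6] * z + L[7]) / denom
    return u, v

# With trivial coefficients the projection reduces to (u, v) = (x, y):
L = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 0]
print(dlt_project(L, (2.0, 3.0, 5.0)))  # (2.0, 3.0)
```

In practice, the 11 coefficients per camera are fit from images of a known calibration target; once they are known, the same equations can be inverted across multiple cameras to triangulate 3D positions.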

These binarized images were then used to create a 3D hull reconstruction of the fly bodies. This process involved scanning through voxel space associated with the trio of cameras and using the DLT estimates to project voxel coordinates onto each of the three camera views. Mapped voxels corresponding to an “on” pixel in all three images together defined the 3D hull reconstruction.
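The hull-carving loop described above can be sketched as follows. All names here are hypothetical, and a real implementation would scan a dense voxel grid using the calibrated DLT coefficients, but the logic is the same: keep only the voxels whose projections land on fly pixels in every view.

```python
def project(L, p):
    """11-parameter DLT projection of voxel center p, rounded to a pixel."""
    x, y, z = p
    d = L[8] * x + L[9] * y + L[10] * z + 1.0
    return (round((L[0] * x + L[1] * y + L[2] * z + L[3]) / d),
            round((L[4] * x + L[5] * y + L[6] * z + L[7]) / d))

def visual_hull(voxels, cameras):
    """cameras: one (dlt_coeffs, on_pixels) pair per view, where
    on_pixels is the set of 'on' (fly) pixels in that binarized image.
    A voxel joins the hull only if it projects onto an 'on' pixel
    in every camera view."""
    return [v for v in voxels
            if all(project(L, v) in on for L, on in cameras)]
```

For example, with two toy cameras, one viewing the x-y plane and one the x-z plane, only voxels seen as “on” by both survive the carving.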

Images obtained from three high-speed Phantom cameras depicting a roll correction maneuver. The three-dimensional-rendered fly represents the measured kinematics.

These 3D voxel reconstructions ultimately allowed the researchers to calculate the position and orientation of the flies—including the trajectory and attitude of the body and the flapping of the wings. After analyzing the high-speed footage and performing 3D reconstruction of the data, the researchers found that flies compensate for roll perturbations by flapping one wing harder than the other for 2 to 5 wing-beats, creating corrective torque. They also begin to respond to the perturbation within 5 ms—or one wing-beat—and can correct for large perturbations that roll them up to 100 degrees within 30 ms. “In other words, by the time you blink, the fly could have performed its correction maneuver 10 times,” Whitehead says.
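Those numbers follow from simple arithmetic on the wing-beat frequency. In the sketch below, the 300-ms blink duration is an assumed typical value, not a figure from the study:

```python
WINGBEAT_HZ = 200                  # fruit-fly flapping frequency
beat_ms = 1000 / WINGBEAT_HZ       # 5 ms per wing-beat
response_latency_ms = 5            # reported onset of the correction
correction_ms = 30                 # full correction of a ~100-degree roll
blink_ms = 300                     # assumed duration of a human blink

print(response_latency_ms / beat_ms)  # 1.0 -> response within one wing-beat
print(blink_ms / correction_ms)       # 10.0 -> ~10 corrections per blink
```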

The researchers even challenged the flies with extreme perturbations; instead of just one magnetic pulse, they exposed the insects to multiple pulses, causing them to spin multiple times mid-air. Remarkably, once the pulses stopped, the flies again regained roll control within three to four wing-beats. “We have yet to discover a perturbation from which the flies cannot recover,” Whitehead says. “They return to nearly zero roll angle regardless of the number of pulses and imposed revolutions along roll.”

From Neurons to Computer Circuits—and Beyond

Thanks to the high-speed footage of the fruit fly experiments, the lab now has a model of the input-output response of flies—what they experience mid-air and how they employ corrective maneuvers to regain roll control. “Using this data, we now understand how the fly is moving, adjusting its flight and solving its problems,” Whitehead says.

This data opens the door to studying the neurobiology of roll control as a model system for extremely fast reflexes. In particular, such information can be applied to the development of neuromorphic chips for micro air vehicles. “Flies aren’t computers, but they are computing things,” Whitehead says. “We know what the flies are doing. But how exactly do they do it? To answer these kinds of neuroscience questions, you first need a fine-tuned picture of what the animal is doing. Thanks to the cameras, we now have that.”

Micro Air Vehicles: Tiny Bots, Tiny Brains

Micro air vehicles are flying robots smaller than a penny. At these sizes, challenges like fluid mechanics, stabilization and power become all the more important for successful flight, and all the more difficult to address. That’s why engineers take their cues from nature.

The results of the Cornell team’s experiments on fruit flies are helping to inform the design of small, insect-like robots—including an 80-mg flying robot designed by researchers from the Harvard Microrobotics Lab. This robot integrates complex control algorithms to adapt to changing environments, including recovering from gusts of wind—a level of computing that requires the processing power of a desktop computer. Although the tiny bot is currently tethered to a power source, the researchers hope to devise a new kind of processor that mimics the way neurons fire electrically in the brain.

“There’s only so much power we can store on a robot the size of a fly,” Whitehead says. “But flies manage to do it. Devising control algorithms that mimic neural activity, instead of relying on traditional processors, could add very high computational power at low payloads and energy costs.”

This article was written by Doreen Clark, Senior Product Manager, AMETEK, Inc. (Berwyn, PA).


Photonics & Imaging Technology Magazine

This article first appeared in the May 2020 issue of Photonics & Imaging Technology Magazine.
