Cornell University engineers have been experimenting with a new type of programming that mimics the mind of an insect.

The sensors and algorithms under development may soon support autonomous, small-scale robots like Harvard University's RoboBee, an 80-milligram flier that could perform a variety of roles in agriculture and disaster relief.

A Better Chip

Even the most lifelike bug-bot could be thrown off by a gust of wind or a mid-air obstacle.

Cornell’s sensing system aims to steer a RoboBee around trouble, adjusting its flight to avoid a crash.

These adaptive capabilities, however, require a significant amount of computing power, and a bee can't carry a desktop computer on its back.

To lighten the load, Silvia Ferrari, professor of mechanical and aerospace engineering and director of Cornell's Laboratory for Intelligent Systems and Controls, and her team are turning to neuromorphic computer chips.

Unlike traditional chips that process binary code, neuromorphic chips function much like biological neurons, transmitting spikes of electrical current in complex combinations.

Though the approach has not yet been fully demonstrated in hardware, Ferrari says the neuromorphic sensors and chips will require less power than standard binary chips.

Spiking neural networks, according to the researcher, can learn to perform adaptive, reconfigurable control for small-scale vehicles like the RoboBee.

“The hope is that, as a result, the artificial sensorimotor controller will be capable of adapting more rapidly and more robustly than conventional controllers,” Ferrari told Tech Briefs.
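To make the contrast with binary processing concrete, the spiking behavior these chips emulate is often abstracted as a leaky integrate-and-fire (LIF) neuron: the unit accumulates input, leaks charge over time, and emits a discrete spike only when a threshold is crossed. The sketch below is purely illustrative; the constants and the function name are invented here and do not come from the Cornell work.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, a common abstraction of
# the spiking computation neuromorphic chips perform. All constants are
# illustrative placeholders, not parameters from the Cornell research.

def lif_spikes(input_current, v_thresh=1.0, v_reset=0.0, leak=0.9):
    """Return a 0/1 spike train for a sequence of input currents."""
    v = v_reset
    spikes = []
    for i in input_current:
        v = leak * v + i      # membrane potential leaks, then integrates input
        if v >= v_thresh:     # threshold crossed: emit a spike
            spikes.append(1)
            v = v_reset       # potential resets after spiking
        else:
            spikes.append(0)
    return spikes

train = lif_spikes([0.3, 0.3, 0.3, 0.3, 0.9, 0.0, 0.3, 0.3])
# → [0, 0, 0, 1, 0, 0, 1, 0]
```

Note that the output is not a binary encoding of a number but a pattern of events in time; networks of such units can, in principle, carry out the kind of adaptive control Ferrari describes while consuming far less power than conventional logic.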

Simulating Insect Flight

So how does a tiny insect stabilize itself mid-flight?

So far, the neuromorphic functions have been demonstrated through physics-based modeling. A simulator, developed by Cornell University doctoral student Taylor Clawson, predicts the aerodynamic forces on the RoboBee's wings.

Cornell engineers are developing new programming that will make RoboBees (shown) more autonomous and adaptable to complex environments. (Image Credit: Harvard)

To simulate an insect in flight, Clawson used a method originally devised by Cornell professor Jane Wang.

The simulations provide theoretical bounds on both the sensing rate and the delay time between sensing and actuation.

By interpreting the findings together with experimental results on fruit flies' reaction time and sensorimotor reflexes, Wang determined that fruit flies stabilize their flight by sensing their kinematic states every wing beat.
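The idea of once-per-wingbeat stabilization can be illustrated with a toy discrete-time feedback loop: a body angle drifts under a disturbance during each wing beat and receives one proportional correction per beat. The dynamics, gain, and disturbance below are invented for illustration and are not Wang's model.

```python
# Toy illustration of wingbeat-rate feedback: an angle drifts under a
# constant disturbance and is corrected only once per wing beat, using
# the state sensed that beat. Numbers are illustrative placeholders.

def stabilize(n_beats, disturbance=0.05, gain=0.8):
    """Apply one proportional correction per wing beat; return the angle history."""
    angle = 0.0
    history = []
    for _ in range(n_beats):
        angle += disturbance      # drift accumulated over one wing beat
        angle -= gain * angle     # correction from the state sensed this beat
        history.append(angle)
    return history

angles = stabilize(50)
```

With these numbers the angle settles near disturbance * (1 - gain) / gain = 0.0125 instead of growing without bound, which is the essential point: sampling the state once per beat is enough to hold the error small.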

Clawson’s model includes the aerodynamics from Wang’s original work, modeling the RoboBee flight in all three dimensions. Instead of computing aerodynamic forces based on an assumed trajectory, however, Clawson’s simulator computes the wings’ path based on the forces exerted on the wing during flight.

With the simulator, researchers can test the controller and autonomy algorithms under a broad range of operating conditions, scenarios, and control designs.

“The simulation has been used to accurately predict many of the key aspects of the RoboBee flight as observed in physical experiments,” Clawson told Tech Briefs.

The Next Step

Ferrari’s lab has teamed up with the Harvard Microrobotics Laboratory, creators of the lightweight RoboBee, a bot equipped with vision, optical-flow, and motion sensors.

With a four-year, $1 million grant from the Office of Naval Research, the Cornell/Harvard team is developing sensorimotor controllers and algorithms — testing them first in simulation.

The Cornell programming, Ferrari hopes, will help make the RoboBee more autonomous and adaptable, without significantly increasing the vehicle’s weight.

“We are excited about both fundamental questions on the control of flapping flight and very small-scale robots that are challenged by comparatively large external disturbances,” said Ferrari.

What do you think? Can small-scale robots learn to think like insects? Share your thoughts below.