Scientists from the University of Lincoln and Newcastle University in the U.K. have created a computerized system that enables mobile robots to navigate autonomously based on the locust's unique visual system. According to the research published today, the work could provide a blueprint for highly accurate vehicle collision sensors and surveillance technology, and could even aid video game programming.

Locusts have a distinctive way of processing information through electrical and chemical signals, giving them an extremely fast and accurate warning system for impending collisions. The insect has incredibly powerful data processing systems built into its biology, which can, in theory, be recreated in robotics.

The research began with a study of the anatomy, responses, and development of the circuits in the locust brain that allow the insect to detect approaching objects and avoid them, whether in flight or on the ground.

The researchers then created a visually stimulated motor control (VSMC) system, which consists of two types of movement detector and a simple motor command generator. Each detector processes images and extracts relevant visual cues, which are then converted into motor commands.
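To make the idea concrete, here is a minimal sketch in Python of how such a pipeline could be wired up: two simple movement detectors, one per half of the camera image, feed a motor command generator that slows down and steers away from the side showing more apparent motion. The frame-difference cue, the function names, and the threshold are illustrative assumptions rather than the authors' implementation, which uses detectors modelled on the locust's visual neurons.

```python
# Minimal sketch of a VSMC-style pipeline (illustrative assumptions,
# not the authors' implementation).
import numpy as np


def motion_cue(prev_frame: np.ndarray, frame: np.ndarray) -> float:
    """One movement detector: reduce an image pair to a scalar motion cue.

    Here the cue is just the mean absolute frame difference over the
    detector's field of view; the published system uses detectors
    modelled on the locust's visual neurons.
    """
    return float(np.abs(frame.astype(float) - prev_frame.astype(float)).mean())


def motor_command(left_cue: float, right_cue: float,
                  threshold: float = 10.0) -> tuple[float, float]:
    """Simple motor command generator: map the two detector outputs to
    (forward_speed, turn_rate), steering away from the busier side."""
    if max(left_cue, right_cue) < threshold:
        return 1.0, 0.0                      # nothing looming: drive straight
    turn = (left_cue - right_cue) / (left_cue + right_cue + 1e-9)
    return 0.3, -turn                        # slow down, turn away from motion


def step(prev_frame: np.ndarray, frame: np.ndarray) -> tuple[float, float]:
    """One control step: split the view into left and right halves,
    run a detector on each, and generate a motor command."""
    mid = frame.shape[1] // 2
    left = motion_cue(prev_frame[:, :mid], frame[:, :mid])
    right = motion_cue(prev_frame[:, mid:], frame[:, mid:])
    return motor_command(left, right)
```

In a real robot, `step` would be called once per camera frame and its output mapped to wheel speeds.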

The researchers were inspired by the way the locust's visual system interacts with the outside world, and by the potential to simulate such complex systems in software and hardware for a range of applications. The system is modelled on the locust's motion-sensitive interneuron, the lobula giant movement detector (LGMD). It was then used on a robot, enabling the machine to explore paths and interact with objects using visual input alone.
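LGMD-style networks have been described in the robotics literature as layered models: an excitation layer driven by frame-to-frame luminance change, a delayed and spatially spread inhibition layer, a summation stage, and a threshold that fires a collision warning as an object looms. The sketch below follows that general pattern; the class name, the constants, and the uniform filter used for lateral spread are assumptions, not the authors' code.

```python
# Sketch of an LGMD-like looming detector, loosely following published
# LGMD network models. Class name, constants and filter size are
# illustrative assumptions, not the authors' code.
import numpy as np
from scipy.ndimage import uniform_filter


class LGMDDetector:
    def __init__(self, frame_shape, inhibition_weight=0.7, threshold=12.0):
        self.prev_frame = np.zeros(frame_shape)
        self.prev_excitation = np.zeros(frame_shape)
        self.w_i = inhibition_weight
        self.threshold = threshold

    def update(self, frame: np.ndarray) -> bool:
        """Feed one grey-scale frame; return True when a collision is signalled."""
        frame = frame.astype(float)
        # Excitation layer: luminance change between consecutive frames.
        excitation = np.abs(frame - self.prev_frame)
        # Inhibition layer: the previous excitation, delayed by one frame
        # and spread to neighbouring pixels.
        inhibition = uniform_filter(self.prev_excitation, size=3)
        # Summation layer: excitation reduced by lateral inhibition, rectified.
        summed = np.maximum(excitation - self.w_i * inhibition, 0.0)
        # The LGMD cell's response: overall activity of the summation layer.
        potential = summed.mean()
        self.prev_frame = frame
        self.prev_excitation = excitation
        # A 'spike' (collision warning) when the response crosses the threshold.
        return bool(potential > self.threshold)
```

A pair of such detectors, one for each half of the visual field, could then drive the kind of motor command generator sketched earlier.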

The researchers now want to apply the approach to collision avoidance systems in vehicles. Existing systems do not always perform as well as they could, and they come at a high cost. This research offers insights into developing an in-car system that could improve performance to the point where human error is effectively eliminated.
