Robot Uses Honeybee-Like Artificial Brain to Navigate Environment

Using the relatively simple nervous system of the honeybee as a model, researchers from the Free University of Berlin (FU Berlin) have demonstrated a robot that perceives environmental stimuli and learns to react to them. They installed a camera on a small robotic vehicle and connected it to a computer running a program that replicated, in a simplified way, the sensorimotor network of the insect brain. The input data came from the camera, which received and projected visual information. The neural network, in turn, operated the motors of the robot's wheels and could thus control its direction of motion. "The network-controlled robot is able to link certain external stimuli with behavioral rules," says Martin Paul Nawrot, professor of neuroscience at FU Berlin. "Much like honeybees learn to associate certain flower colors with tasty nectar, the robot learns to approach certain colored objects and to avoid others."
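The associative scheme described above can be sketched in a few lines. This is a deliberately minimal toy model, not the researchers' actual program: a reward signal strengthens a plastic link between a perceived color and an "approach" response, so the reinforced color ends up driving approach while the other is avoided. The color names, learning rate, and threshold here are arbitrary assumptions.

```python
# Toy sketch (assumed, not the FU Berlin code) of reward-driven
# color-behavior association, as described in the article.

COLORS = ["red", "blue"]

# One plastic weight per color: how strongly that color drives "approach".
weights = {c: 0.0 for c in COLORS}
LEARNING_RATE = 0.5

def train(color, rewarded):
    """Nudge the color->approach weight toward 1 when reward co-occurs,
    toward 0 otherwise (simple delta-rule update)."""
    target = 1.0 if rewarded else 0.0
    weights[color] += LEARNING_RATE * (target - weights[color])

def behavior(color):
    """Approach only if the learned association is strong enough."""
    return "approach" if weights[color] > 0.5 else "avoid"

# Conditioning: pair blue with the reinforcement signal a few times.
for _ in range(5):
    train("blue", rewarded=True)
    train("red", rewarded=False)

print(behavior("blue"))  # approach
print(behavior("red"))   # avoid
```

After five pairings the weight for blue has converged close to 1 while red stays at 0, so the simulated robot approaches blue objects and avoids red ones.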



Transcript

00:00:06 Here I'm presenting a test where a spiking neural network model of conditioned behavior is tested on an autonomous robot. Our robot is a DFRobot rover equipped with a light sensor and a HaViMo embedded camera that does image processing on the chip. Our spiking neural network is simulated using iqr, which is an open-source spiking network simulator. The task we're

00:00:33 about to show is a simple associative learning task where the robot is presented with colored objects and one color is reinforced. The robot will then show its preference for that color by driving towards it. The neural network architecture is inspired by the model of olfactory learning in Drosophila and the honeybee. If

00:00:55 you go from left to right, the CRNs represent the color receptor neurons that are tuned either to the color red or to blue. Their output is projected to a group of projection neurons that excite the Kenyon cells in the mushroom body. The KC output converges onto a single group of extrinsic neurons whose synapses are plastic and form the association between the conditioned

00:01:21 stimulus and the unconditioned stimulus. In our setup, the camera input serves as the sensory input and the flashlight as the reinforcement signal. The output of the network is then translated into motor signals. References are listed in the credits at the end of this video. The video on the left-hand side shows the spiking of the motor neuron groups, where one group represents the

00:01:46 left motor and the other the right. The color bars show where the color was detected by the camera at any given moment.
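The layered architecture the transcript walks through (color receptor neurons → projection neurons → Kenyon cells → extrinsic neurons, with plasticity confined to the KC→EN synapses) can be sketched as a rate-based stand-in. This is an illustrative assumption, not the authors' spiking simulation: layer sizes, random weights, the sparse KC code, and the reward-gated update are all invented here to show the idea of a sparse expansion layer followed by a single plastic readout.

```python
import numpy as np

# Rate-based toy version (assumed) of the mushroom-body-style circuit:
# CRN -> PN -> KC -> EN, with learning only at the KC->EN synapses.

rng = np.random.default_rng(0)

N_PN, N_KC = 4, 50
W_crn_pn = rng.random((N_PN, 2))                           # fixed CRN->PN weights
W_pn_kc = (rng.random((N_KC, N_PN)) < 0.3).astype(float)   # sparse, fixed PN->KC
w_kc_en = np.zeros(N_KC)                                   # plastic KC->EN weights

def kc_activity(crn):
    """Feed a 2-channel CRN input (red, blue) through the fixed layers
    and keep only the most strongly driven Kenyon cells (sparse code)."""
    pn = W_crn_pn @ crn
    kc = W_pn_kc @ pn
    out = np.zeros(N_KC)
    out[np.argsort(kc)[-5:]] = 1.0
    return out

def learn(crn, reward, rate=0.2):
    """Reward-gated plasticity at the KC->EN synapses: active KCs that
    coincide with the reinforcement signal get strengthened."""
    global w_kc_en
    w_kc_en += rate * reward * kc_activity(crn)

def en_response(crn):
    """Extrinsic neuron drive; in the robot this would be mapped to the
    left/right motor neuron groups to steer toward the learned color."""
    return float(w_kc_en @ kc_activity(crn))

RED, BLUE = np.array([1.0, 0.0]), np.array([0.0, 1.0])
for _ in range(5):
    learn(BLUE, reward=1.0)   # blue paired with the flashlight (US)

print(en_response(BLUE), en_response(RED))
```

Because only the KC pattern evoked by the rewarded color is potentiated, the extrinsic neuron responds at least as strongly to blue as to red after conditioning, which is the associative preference the robot demonstrates by driving toward blue objects.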