Artificial Brain System with Touch and Vision Sensors for Smarter Robots
Most robots rely on visual processing alone for complex tasks such as grasping objects, which limits what they can do. To perform more demanding tasks, robots need a refined sense of touch and the ability to process sensory information quickly.

Engineers and computer scientists at the National University of Singapore (NUS) have developed a sensory-integrated artificial brain system that mimics biological neural networks and can run on a power-efficient neuromorphic processor, such as Intel's Loihi chip. The system combines artificial skin and vision sensors, allowing robots to draw accurate conclusions about the objects they grasp from the visual and tactile data the sensors capture in real time.
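The article does not detail how the two sensor streams are combined, but the core idea of multimodal fusion can be illustrated with a minimal sketch. The function name, score format, and weighting scheme below are invented for illustration and are not the NUS team's actual method:

```python
# Hypothetical sketch of multimodal sensor fusion: combine per-class
# evidence from a vision stream and a touch stream, then pick the most
# likely object class. All names and values here are illustrative.

def fuse_and_classify(vision_scores, touch_scores, alpha=0.5):
    """Weighted fusion of per-class scores from two sensor modalities.

    vision_scores, touch_scores: lists of per-class confidence scores.
    alpha: weight given to vision; (1 - alpha) goes to touch.
    Returns the index of the class with the highest combined score.
    """
    combined = [alpha * v + (1.0 - alpha) * t
                for v, t in zip(vision_scores, touch_scores)]
    return combined.index(max(combined))

# Vision alone is ambiguous between classes 0 and 1; touch evidence
# (e.g. texture or hardness) breaks the tie in favor of class 1.
vision = [0.48, 0.47, 0.05]
touch = [0.10, 0.80, 0.10]
print(fuse_and_classify(vision, touch))  # → 1
```

The point of the sketch is that touch data can disambiguate cases where vision alone is uncertain, which is the benefit the article attributes to integrating artificial skin with vision sensors.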