Visualization System Reads Robots' 'Thoughts' for Better Autonomous Vehicles
A new visualization system developed at MIT combines ceiling-mounted projectors with motion-capture technology and animation software to project a robot's intentions in real time. The researchers have dubbed the system "measurable virtual reality" (MVR): "a spin on conventional virtual reality that's designed to visualize a robot's perceptions and understanding of the world," says Ali-akbar Agha-mohammadi of MIT's Aerospace Controls Lab. The researchers say the system may help speed up the development of self-driving cars, package-delivering drones, and other autonomous, route-planning vehicles. The team initially conceived of the system in response to feedback from visitors to their lab: during demonstrations of robotic missions, it was often difficult for people to understand why the robots chose certain actions. The engineers mounted 18 motion-capture cameras on the ceiling to track multiple robotic vehicles simultaneously. They then developed software that visually renders 'hidden' information, such as a robot's possible routes and its perception of an obstacle's position, and projects it onto the ground in real time as the physical robots operate.
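The summary describes a closed loop: the motion-capture rig reports where each robot is, software renders the robot's hidden state (candidate routes, perceived obstacle positions), and ceiling projectors draw that state on the floor. The sketch below illustrates such a loop in Python; every name in it is invented for illustration, and the projector is stubbed out with print() so the sketch runs on its own. It is not the lab's actual software, which is not described at this level of detail.

```python
# Hypothetical sketch of the render loop described above: read each
# robot's pose from the motion-capture rig, fetch the "hidden" planner
# state (candidate routes, obstacle beliefs), and draw both on the lab
# floor. All names are assumptions; the projector is a stand-in.

class StubProjector:
    """Stands in for the ceiling projector in this sketch."""
    def draw_paths(self, routes, origin):
        print(f"  project {len(routes)} candidate route(s) from {origin}")
    def draw_markers(self, points):
        print(f"  project obstacle beliefs at {points}")

def render_frame(mocap_poses, planner_state, projector):
    # One frame of the overlay: per robot, show what it is considering
    # (routes) and what it currently believes it sees (obstacle positions).
    for robot_id, pose in mocap_poses.items():
        print(f"robot {robot_id}:")
        state = planner_state[robot_id]
        projector.draw_paths(state["routes"], origin=pose)
        projector.draw_markers(state["obstacles"])

# Made-up snapshot of one instant in the lab.
poses = {"quad1": (0.0, 1.0)}
state = {"quad1": {"routes": [[(0, 1), (1, 2)], [(0, 1), (1, 0)]],
                   "obstacles": [(1.5, 1.5)]}}
render_frame(poses, state, StubProjector())
```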
Transcript
00:00:09 Our lab is interested in rapid prototyping and testing of autonomous vehicles, which are vehicles that can complete a certain set of tasks without human intervention. So as researchers, we're very interested in seeing how our algorithms work in the real world, where sensory measurements are often imperfect or even distorted. In these scenarios, vehicles often form an estimate, or belief, of their best
00:00:32 perception of what the environmental model looks like, and they make decisions based on that. But this problem is magnified when you have a very complex system of agents interacting together, and it's very hard to understand why the algorithm behaves a certain way. What we'd really like to be able to do is read the minds of our autonomous agents and get some idea of how their decision-making processes work.
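To make the notion of a belief concrete: a vehicle that only receives noisy range readings can maintain a Gaussian belief (a mean and a variance) over, say, an obstacle's distance, and act on that belief rather than on any single raw measurement. The snippet below is a textbook one-dimensional Kalman-style update, shown purely as an illustration; it is not the lab's estimator, and all the numbers are made up.

```python
# A minimal sketch of the "belief" idea: the robot never sees the true
# obstacle distance, only noisy measurements, so it maintains a Gaussian
# belief (mean, variance) and decides based on that belief.

def update_belief(mean, var, measurement, meas_var):
    """Fuse one noisy measurement into the current Gaussian belief."""
    k = var / (var + meas_var)        # gain: how much to trust the measurement
    new_mean = mean + k * (measurement - mean)
    new_var = (1.0 - k) * var
    return new_mean, new_var

# Start highly uncertain about the obstacle's distance (meters).
belief = (5.0, 10.0)
for z in [7.2, 6.8, 7.1, 6.9]:        # imperfect, distorted readings
    belief = update_belief(*belief, measurement=z, meas_var=0.5)

mean, var = belief
# The decision is made on the belief, not on any single raw measurement.
action = "brake" if mean < 8.0 else "continue"
print(f"believed distance = {mean:.2f} m (var {var:.3f}) -> {action}")
```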
00:00:53 Additionally, we'd like to be able to test our algorithms in a variety of environments so we can robustify them. So this new system, which we refer to as "measurable virtual reality," basically combines a projection system with a motion capture system. We use the projection system to project a simulated environment, and we call it measurable virtual reality because we measure this projected scene
00:01:18 using actual sensors that are mounted on autonomous robots, say ground robots or aerial robots. At the same time, we have this motion capture system, which tells us where the physical system, the robot we are working with, is located in the 3-D environment. By combining the information we are getting from the motion capture system and the projection system, we basically enable fast prototyping of
00:01:44 cyber-physical systems, or in other words, the faster design of these learning, perception, and planning algorithms for autonomous systems. This work can be applied in multi-agent scenarios, where a single agent can have control of other agents on its team. For example, you can think of a scenario where an agent can communicate with nearby agents, but only within a certain radius of
00:02:09 communication. As this leader agent moves around its environment, it can link to other agents and give them tasks to do in real time. This work was previously very hard to convey to spectators from outside our lab, because it was difficult to ascertain when a communication link occurred. But now, using our system, we can see in real time when that link-up occurs.
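The linking behavior described here reduces to a distance test: as the leader moves, any agent inside the communication radius becomes reachable and can be handed a task. A minimal sketch of that test, with an assumed radius and invented agent names:

```python
# Hedged sketch of the multi-agent scenario above: the leader can only
# communicate within a fixed radius, so as it moves it links to whichever
# agents fall inside that radius. The distance test is the core idea;
# the radius value, agent names, and positions are made up.
import math

COMM_RADIUS = 2.0  # meters; assumed value for illustration

def agents_in_range(leader_pos, agent_positions, radius=COMM_RADIUS):
    """Return the ids of agents the leader can currently link to."""
    return [
        agent_id
        for agent_id, (x, y) in agent_positions.items()
        if math.hypot(x - leader_pos[0], y - leader_pos[1]) <= radius
    ]

agents = {"quad1": (1.0, 0.5), "quad2": (4.0, 4.0), "ground1": (0.5, -1.0)}
for leader_pos in [(0.0, 0.0), (3.5, 3.5)]:      # leader moving around
    for agent_id in agents_in_range(leader_pos, agents):
        print(f"leader at {leader_pos}: link to {agent_id}, assign task")
```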
00:02:32 One of the limitations these days in designing autonomous systems is the regulations out there in society. We cannot easily run autonomous cars or flying robots outdoors due to these regulations, so this system allows us to bring the outdoors in, basically, and have simulations of the world, and then, using the sensors, actually measure this projected scene as if the robot were flying
00:02:57 or driving outside in the real-world environment. So our system allows us to transform any indoor lab environment into a complete virtual reality simulation that is perceivable by any type of autonomous agent. So we're hoping that this system can become a future indoor environment in which private institutions can test and research their vehicles before deploying them into the real world.
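In the physical system, the robot's real sensors image the projected scene directly. As a purely software illustration of the same idea, one can compute what a range sensor would report from the motion-capture pose against the known virtual geometry. The circle obstacles, the ray-march step, and all names below are assumptions for illustration, not part of the MIT system.

```python
# A minimal sketch of "measuring the projected scene": the motion-capture
# rig supplies the robot's true pose, the projected environment is known
# virtual geometry, and the reading is whatever a real range sensor would
# see from that pose.
import math

def simulated_range(pose, heading, obstacles, max_range=5.0, step=0.01):
    """March a ray from the mocap pose until it hits a projected obstacle."""
    x, y = pose
    dx, dy = math.cos(heading), math.sin(heading)
    r = 0.0
    while r < max_range:
        px, py = x + r * dx, y + r * dy
        for (cx, cy, radius) in obstacles:      # projected virtual obstacles
            if math.hypot(px - cx, py - cy) <= radius:
                return r                        # "sensor" detects the obstacle
        r += step
    return max_range                            # nothing within sensor range

projected_obstacles = [(2.0, 0.0, 0.3)]         # drawn on the floor by MVR
print(simulated_range(pose=(0.0, 0.0), heading=0.0,
                      obstacles=projected_obstacles))
# -> ~1.7, the distance at which a real sensor would report the obstacle
```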