An augmented reality headset combines computer vision and wireless perception to automatically locate a specific item that is hidden from view, perhaps inside a box or under a pile, and then guide the user to retrieve it. (Image: Courtesy of the researchers, edited by MIT News)

MIT researchers have built an augmented reality (AR) headset that gives the wearer X-ray vision. The headset combines computer vision and wireless perception to automatically locate a specific item that is hidden from view, perhaps inside a box or under a pile, and then guide the user to retrieve it.

The system uses radio frequency (RF) signals, which pass through common materials like cardboard boxes, plastic containers, and wooden dividers, to find hidden items labeled with RFID tags; the tags reflect signals sent by an RF antenna.

The headset directs the wearer as they walk through a room toward the location of the item, which shows up as a transparent sphere in the AR interface. Once the item is in the user’s hand, the headset, called X-AR, verifies that they have picked up the correct object.

When the researchers tested X-AR in a warehouse-like environment, the headset could localize hidden items to within 9.8 centimeters, on average. And it verified that users picked up the correct item with 96 percent accuracy.

X-AR could aid e-commerce warehouse workers in quickly finding items on cluttered shelves or buried in boxes, or in identifying the exact item for an order when many similar objects are in the same bin. It could also be used in a manufacturing facility to help technicians locate the correct parts to assemble a product.

“Our whole goal with this project was to build an augmented reality system that allows you to see things that are invisible — things that are in boxes or around corners — and in doing so, it can guide you toward them and truly allow you to see the physical world in ways that were not possible before,” said Fadel Adib, associate professor in the Department of Electrical Engineering and Computer Science, director of the Signal Kinetics group in the Media Lab, and senior author of a paper on X-AR.

To create an augmented reality headset with X-ray vision, the researchers first had to outfit an existing headset with an antenna that could communicate with RFID-tagged items. Once the team had built an effective antenna, they focused on using it to localize RFID-tagged items.

They leveraged a technique known as synthetic aperture radar (SAR), which is similar to how airplanes image objects on the ground. X-AR takes measurements with its antenna from different vantage points as the user moves around the room, then it combines those measurements. In this way, it acts like an antenna array where measurements from multiple antennas are combined to localize a device.

X-AR uses visual data from the headset’s self-tracking capability to build a map of the environment and determine its own location within that map. As the user walks, the system computes the probability that the RFID tag is at each candidate location. That probability peaks at the tag’s true position, so the system uses it to zero in on the hidden object.
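The combination described above, self-tracked antenna poses plus a per-location probability, can be sketched as a SAR-style back-projection grid search. The following is a minimal 2-D illustration with simulated measurements: the function name, the 915 MHz UHF RFID wavelength, the measurement model, and all grid parameters are assumptions for illustration, not the X-AR implementation.

```python
import numpy as np

# Assumed UHF RFID carrier near 915 MHz (speed of light / 915 MHz)
WAVELENGTH = 0.328  # meters

def tag_probability_map(antenna_positions, measurements, candidates):
    """SAR-style back-projection: for each candidate tag location,
    coherently sum the measured channels after compensating the
    round-trip phase that location would produce at each antenna
    position, then normalize the combined power into a probability
    map over the candidates."""
    scores = np.empty(len(candidates))
    for i, c in enumerate(candidates):
        acc = 0j
        for pos, h in zip(antenna_positions, measurements):
            d = np.linalg.norm(c - pos)  # antenna-to-candidate distance
            # undo the round-trip phase this distance would impose
            acc += h * np.exp(1j * 4 * np.pi * d / WAVELENGTH)
        scores[i] = abs(acc) ** 2  # coherent power
    return scores / scores.sum()  # normalize to probabilities

# Simulated walk: headset antenna poses along a line, tag hidden at (1.0, 2.0)
tag = np.array([1.0, 2.0])
poses = [np.array([x, 0.0]) for x in np.linspace(-1.0, 1.0, 15)]
meas = [np.exp(-1j * 4 * np.pi * np.linalg.norm(tag - p) / WAVELENGTH)
        for p in poses]

# Search grid over the mapped region; the peak marks the likely tag location
grid = [np.array([x, y]) for x in np.linspace(0.0, 2.0, 21)
                         for y in np.linspace(1.0, 3.0, 21)]
probs = tag_probability_map(poses, meas, grid)
best = grid[int(np.argmax(probs))]
```

Only at the true tag location do all the phase-compensated measurements add in phase, which is why the probability map peaks there as the article describes.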

To test X-AR, the researchers created a simulated warehouse by filling shelves with cardboard boxes and plastic bins and placing RFID-tagged items inside.

They found that X-AR can guide the user toward a targeted item with less than 10 cm of error, meaning that on average the item was located less than 10 cm from where X-AR directed the user. Baseline methods the researchers tested had a median error of 25 to 35 cm.

They also found that X-AR correctly verified whether the user had picked up the right item 98.9 percent of the time, meaning it could catch nearly all mispicks before they propagate through an order. Even when the item was still inside a box, verification was 91.9 percent accurate.

Now that they have demonstrated the success of X-AR, the researchers plan to explore how other sensing modalities, such as WiFi, mmWave, or terahertz waves, could enhance its visualization and interaction capabilities. They also hope to extend the antenna’s range beyond 3 m and adapt the system for use by multiple, coordinated headsets.

For more information, contact Abby Abazorius at 617-253-2709.