In the future, autonomous drones could be used to shuttle inventory between large warehouses. A drone might fly into a semi-dark structure the size of several football fields, zipping along hundreds of identical aisles before docking at the precise spot where its shipment is needed.
Most of today’s drones would likely struggle to complete this task, since drones typically navigate outdoors using GPS, which doesn’t work in indoor environments. For indoor navigation, some drones employ computer vision or lidar, but both techniques are unreliable in dark environments or rooms with plain walls or repetitive features.
MIT researchers have introduced a new approach, MiFly, that enables a drone to self-localize, or determine its position, in indoor, dark, and low-visibility environments. Self-localization is a key step in autonomous navigation.
MiFly uses radio frequency (RF) waves, reflected by a single tag placed in its environment, to autonomously self-localize.
Here is an exclusive Tech Briefs interview, edited for length and clarity, with Co-Lead Authors and Research Assistants Maisy Lam and Laura Dodds.
Tech Briefs: What are some technical challenges you faced while developing MiFly and how did you overcome them?
Dodds: I can start by giving a high-level overview of the big technical challenge here, which is that we really wanted to enable drones to fly in challenging indoor environments. For example, in the future, drones might need to go indoors to dock and deliver your package or fly within a warehouse. But drones today cannot fly within challenging indoor environments like MIT's underground tunnels. The reason this is hard is that the GPS signals drones typically rely on today don't work indoors. Computer vision struggles too: it doesn't work in dark settings or in what we call featureless environments, where you have plain walls or repetitive visual features.
Our goal here was to overcome this and allow a drone to fly in these challenging environments. The way we did this was using what we call millimeter wave signals. Millimeter wave signals are used in 5G or self-driving cars. They're growing in popularity for a lot of reasons, but one of the big ones is that they can work in the dark. Our idea was that if we can use this millimeter wave for drone self-localization, we can enable it to fly in these challenging environments. Our high-level idea was we can place a millimeter wave sensor on the drone, and it can localize itself with respect to a sticker that we place on the wall, a millimeter wave tag. This would allow us to provide a localization system in these challenging environments with minimal infrastructure.
The way this works is we send a signal from our drone, which reflects off everything in the environment, but we need to be able to isolate the reflection that's coming only from the tag on the wall. In order to do this, we use a technique called modulation, where our tag essentially adds a modulation frequency on top of the signal. So, it receives the signal from the radar, adds an extra frequency on top, and reflects it back toward the radar. This allows us to separate the reflection from the tag from everything else in the environment. And this is how we can start to localize the drone with respect to this tag and overcome the fundamental challenge of localizing in these dark and challenging environments.
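The separation Dodds describes can be illustrated with a toy simulation. This is a hedged sketch, not MiFly's actual signal chain, and all sample rates, frequencies, and amplitudes below are assumed values: a tag that re-modulates the radar signal shows up at a known frequency offset, so its reflection can be pulled out of the static clutter with a simple FFT.

```python
# Toy sketch (assumed parameters, not MiFly's real processing): the tag's
# modulation shifts its reflection to a known frequency, letting a radar
# separate it from unmodulated environmental clutter.
import numpy as np

np.random.seed(0)
fs = 10_000          # sample rate of the received signal (Hz), assumed
f_mod = 1_000        # tag's modulation frequency (Hz), assumed
t = np.arange(2000) / fs

# Static environment clutter: strong reflections with no frequency shift.
clutter = 5.0 * np.ones_like(t) + 0.5 * np.random.randn(len(t))

# Tag reflection: same carrier, but shifted by f_mod by the tag. The phase
# stands in for the radar-to-tag range information (assumed value).
tag_phase = 1.2
tag = 0.3 * np.cos(2 * np.pi * f_mod * t + tag_phase)

received = clutter + tag

# Isolate the tag: look only at the FFT bin at the modulation frequency.
spectrum = np.fft.rfft(received)
freqs = np.fft.rfftfreq(len(t), d=1 / fs)
bin_mod = np.argmin(np.abs(freqs - f_mod))

# Clutter dominates DC; the tag dominates the f_mod bin.
print(f"energy at DC:    {abs(spectrum[0]):.1f}")
print(f"energy at f_mod: {abs(spectrum[bin_mod]):.1f}")
recovered_phase = np.angle(spectrum[bin_mod])
print(f"recovered tag phase: {recovered_phase:.2f} rad")
```

Even though the clutter is far stronger than the tag's reflection, the tag's phase is recovered cleanly because nothing else in the scene occupies the modulation frequency.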
Lam: I'd like to build off what Laura said about how the system operates. Once we reached the point where we had introduced the modulation, another challenge arose: OK, now we can separate the tag's response from the environment. But, at this point, the radars that we mounted on the drone are only able to range to the tag. This is a challenge because we want to get 3D localization for a drone. You need to be able to identify a drone's full six degree of freedom (DoF) pose in space, which is critical for performing complex movements like lateral translations and rotations. Another challenge is that a single tag is not enough to localize this way; think about GPS, where multiple satellites are needed to trilaterate. You need at least three. The problem here is that we don't want to just deploy a bunch of tags indoors; that requires too much infrastructure. We wanted to use only one tag, so that was another challenge.
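The multi-anchor baseline Lam contrasts against can be sketched in a few lines. This is a generic trilateration example with hypothetical anchor coordinates, not anything from MiFly: with ranges to three or more known anchors, the unknown position falls out of a least-squares solve, whereas a single range only constrains a sphere.

```python
# Hedged sketch of classic range-only trilateration (hypothetical anchor
# layout): subtracting one sphere equation from the others makes the
# problem linear in the unknown position x.
import numpy as np

anchors = np.array([          # known anchor positions (meters), assumed
    [0.0, 0.0, 0.0],
    [10.0, 0.0, 0.0],
    [0.0, 10.0, 0.0],
    [0.0, 0.0, 10.0],
])
true_pos = np.array([3.0, 4.0, 2.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)

# ||x - a_i||^2 - ||x - a_0||^2 = r_i^2 - r_0^2 is linear in x:
# 2 (a_i - a_0) . x = |a_i|^2 - |a_0|^2 - r_i^2 + r_0^2
a0, r0 = anchors[0], ranges[0]
A = 2 * (anchors[1:] - a0)
b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
     - ranges[1:] ** 2 + r0 ** 2)
est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(est)  # recovers [3, 4, 2]
```

This is exactly the infrastructure cost MiFly avoids: instead of deploying several anchors like this, it extracts more information from a single tag.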
The way we approached this was that, instead of adding more anchors, we wanted to see if we could look at the same anchor or the same tag that Laura was describing in a different way. So, how do you leverage one tag to get the full six DoF location of the drone?
A good analogy here is polarized sunglasses; that's the approach that we took. Polarized sunglasses receive a certain polarization of light that arrives in one direction, and they block out another polarization in another direction. We applied the same concept to millimeter waves. This is where we came up with the idea of dual polarization on top of the dual modulation. By leveraging vertically polarized and horizontally polarized signals, we are able to isolate the signals that are going back and forth between the radars on the drone and the tag on the wall.
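The sunglasses analogy can be made concrete with a toy polarization-filtering example. This is a loose sketch of the idea, not MiFly's antenna design, and the amplitudes and which path carries which polarization are assumptions: a linear analyzer passes the projection of the field onto its own axis, so cross-polarized signals are rejected.

```python
# Toy sketch of polarization filtering (the sunglasses analogy): a linear
# receive axis measures only the component of the field along that axis,
# so two orthogonally polarized signals separate cleanly.
import numpy as np

def analyzer(axis_deg):
    """Unit vector for a linear polarization axis at the given angle."""
    a = np.deg2rad(axis_deg)
    return np.array([np.cos(a), np.sin(a)])

vertical = analyzer(90)
horizontal = analyzer(0)

# Two overlapping reflections with orthogonal polarizations (amplitudes
# and path assignments are illustrative assumptions).
sig_v = 1.0 * vertical      # e.g., one radar-tag path, V-polarized
sig_h = 0.6 * horizontal    # e.g., the other path, H-polarized
mixture = sig_v + sig_h

# Projecting the mixed field onto each axis recovers each path alone.
print(mixture @ vertical)    # ~1.0: only the V-polarized component
print(mixture @ horizontal)  # ~0.6: only the H-polarized component
```

Because the two projections are independent, a receiver with two differently polarized antennas gets two separable measurements from the same scene, which is the extra leverage dual polarization buys on top of dual modulation.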
Tech Briefs: What was the catalyst for this project?
Lam: Our lab does a lot of research in wireless signals. Prior to this, Laura had worked on leveraging radio frequency signals on an AR headset. So, in our group, we're always inspired by the question, ‘How do you push the next generation of technology and push its boundaries using new sensing modalities?’
Tech Briefs: Do you have plans for further research, work, etc.?
Lam: We’re already working on it: autonomous integration leveraging these tag designs, and now building a drone. We want to leverage these estimates and show that the drone can actually use them in real-time navigation. So it doesn't rely on vision, doesn't rely on GPS, and simply uses these tags to perform fully autonomous navigation.
What we mean by that is allowing a drone to go from point A to point B in a space safely, meaning that it's able to determine its own position and path such that it constantly maintains an accurate signal to the tag. We are working on that and have already demonstrated preliminary success.
Dodds: On the tag design side, what we were focusing on really in this project was how to leverage a tag and design a tag to enable this six DoF localization. We weren't focused on enabling the highest range possible in this initial implementation of our system.
So, because there are already other works that show range estimates from much longer distances, we can incorporate their innovations to improve the range. We don't need to start from scratch to make a longer-range tag. Long range wasn't really a focus for this paper, but in the future, we would like to build a longer-range system. That's later down the pipeline.
Transcript
00:00:02 Researchers at MIT have built a new drone system, MiFly, that can accurately track its location indoors, including in dark and low-visibility settings. Instead of relying on GPS or cameras, which fail indoors and in the dark, MiFly relies on a new location technology that uses wireless signals. Using two radars mounted on the drone that send cross-polarized wireless signals, MiFly can track its own location with six degrees of freedom. The drone can fly through MIT’s network of dark underground tunnels. It does this using the lightweight radars mounted on the drone. As it flies, the radars continuously send wireless signals that are uniquely polarized. A single tag is deployed in the environment, similar to a sticker on the wall. The tag reflects these signals back towards the drone. Using the different polarizations, the drone can separate horizontal and vertical
00:00:56 signals to estimate distance to the tag. It combines these reflections with data from an onboard inertial measurement unit to locate itself. As shown in the bottom left, MiFly can precisely estimate its pose in six degrees of freedom with an accuracy of 7 centimeters. Beyond dark environments, MiFly can also function in indoor environments with few visual features, such as a plain hallway. In these settings, cameras struggle because there are few visual features to use for tracking. MiFly, however, uses wireless signals, which work in featureless environments. The researchers tested MiFly across a wide range of indoor environments, showing that it is accurate even as the drone moves and rotates in different ways. This technology could enable many new applications. For example, it could be used for improved automation and efficiency in warehousing. MiFly can also be used to enable precise docking
00:01:51 and delivery in scenarios that are challenging for GPS and cameras. Using wireless signals for drone localization has the potential to significantly improve the robustness of future drone technologies. For more information, check out our website: mmdrone.media.mit.edu
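The fusion step the transcript mentions, combining tag reflections with an onboard inertial measurement unit, can be sketched in one dimension. This is a simple complementary filter under assumed noise levels and gains, not MiFly's actual estimator: the IMU predicts motion at high rate, and occasional radar ranges to the tag correct the accumulating drift.

```python
# Hedged 1-D sketch of radar/IMU fusion (a basic complementary filter,
# not MiFly's real estimator; all noise levels and gains are assumed).
import numpy as np

dt, alpha = 0.01, 0.2        # IMU time step (s) and correction gain
pos, vel = 0.0, 0.0          # fused state estimate
true_pos, true_vel = 0.0, 0.0

rng = np.random.default_rng(0)
for k in range(1000):
    accel = 0.5                          # true acceleration (m/s^2)
    true_vel += accel * dt
    true_pos += true_vel * dt

    # Predict with a noisy IMU reading: integrating it alone drifts.
    imu_accel = accel + rng.normal(0, 0.2)
    vel += imu_accel * dt
    pos += vel * dt

    # Every 10 steps, correct with a noisy radar range to the tag.
    if k % 10 == 0:
        radar_pos = true_pos + rng.normal(0, 0.05)
        pos += alpha * (radar_pos - pos)

print(f"true {true_pos:.2f} m, fused estimate {pos:.2f} m")
```

The IMU alone would drift without bound, and the radar alone is noisy and low-rate; blending them keeps the estimate both smooth and anchored, which is the general motivation for pairing the tag ranging with inertial data.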

