When an area is deemed too dangerous for human exploration teams, the idea of a search-and-rescue drone — or a swarm of them! — is an intriguing concept.

After all, why put human lives at risk when you can survey a scene with camera-equipped unmanned fliers?

Creating packs of autonomous drones, however, requires a lot of hardware. To produce the three-dimensional maps that guide each aircraft, you need cameras, laser range finders, and substantial computational and memory resources.

A group of researchers has developed a more minimal, map-less approach – one that takes after the bee.

The swarm gradient bug algorithm (SGBA), created by teams from TU Delft, the University of Liverpool and Radboud University of Nijmegen, is a pared-down navigation solution. The software allows a swarm of tiny flying robots to autonomously explore an unknown environment.

As battery life diminishes, the commercial off-the-shelf (COTS) drones return to a set location. To come back to the departure point, the robots search for a "home" beacon: a 2.4 GHz radio transmitter.

“The main idea underlying the new navigation method is to reduce our navigation expectations to the extreme: we only require the robots to be able to navigate back to the base station,” said Guido de Croon, principal investigator of the project. “The swarm of robots first spreads out into the environment by having each robot follow a different preferred direction. After exploring, the robots return to a wireless beacon located at the base station.”
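To make that two-phase idea concrete, here is a minimal Python sketch of an explore-then-return mode switch. The mode names, the half-battery threshold, and the function are illustrative assumptions based on the behavior described in this article, not the team's published SGBA code.

```python
from enum import Enum, auto

class Mode(Enum):
    EXPLORE = auto()      # fly outward along this drone's preferred heading
    RETURN_HOME = auto()  # steer back toward the base-station beacon

# Assumed trigger: switch behavior once roughly half the battery is used.
RETURN_BATTERY_FRACTION = 0.5

def update_mode(mode: Mode, battery_fraction: float) -> Mode:
    """Switch from exploring to homing once the outbound battery budget is spent."""
    if mode is Mode.EXPLORE and battery_fraction <= RETURN_BATTERY_FRACTION:
        return Mode.RETURN_HOME
    return mode
```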

In a proof-of-concept test, tiny, camera-equipped drones were sent into an indoor office environment to find two human “dummies,” representing victims in a disaster scenario. Within six minutes, the swarm of six drones explored about 80% of the open rooms.

The team's research, financed by the Dutch Research Council (NWO) within the Natural Artificial Intelligence programme, appeared in the journal Science Robotics.

In addition to assistance from visual odometry, the robots navigate their environment by following walls. Four laser range sensors, pointing in the four directions of the aircraft's horizontal plane, let each drone estimate a wall's angle by checking whether a side range sensor is triggered in combination with the front one.
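As a rough illustration of that wall-following rule, the Python sketch below combines front and side range readings into a coarse turn decision. The trigger distance, turn angles, and function name are assumptions for illustration; the team's onboard controller is not spelled out in this article.

```python
# Hypothetical wall-following decision from the front and side range sensors.
WALL_NEAR_M = 0.5  # assumed trigger distance, in meters

def wall_follow_turn(front_m: float, left_m: float, right_m: float) -> float:
    """Return a coarse heading change in degrees (positive = turn left)."""
    front_blocked = front_m < WALL_NEAR_M
    left_near = left_m < WALL_NEAR_M
    right_near = right_m < WALL_NEAR_M

    if front_blocked and right_near:
        return 90.0    # wall ahead and to the right: turn left, keep the wall on the right
    if front_blocked and left_near:
        return -90.0   # wall ahead and to the left: turn right, keep the wall on the left
    if front_blocked:
        return 90.0    # wall ahead only: pick a default turn direction
    return 0.0         # wall alongside or open space: keep the current heading
```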

Moreover, the tiny aircraft avoid collision and maximize their search efficiency by "talking" with each other via wireless-communication chips.

The signal strength between the chips behaves much like the bars on your phone, which drop as you move away from your Wi-Fi router at home. By sensing this proximity, the drones keep their distance from one another and avoid crashes.
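For intuition, here is a small Python sketch of how received signal strength (RSSI) could be turned into a keep-apart command. The threshold and gain values are made up for illustration and are not from the published system.

```python
# Illustrative proximity logic: a stronger (less negative) RSSI means the
# other drone is closer, much like Wi-Fi bars filling up near the router.
TOO_CLOSE_DBM = -55.0  # assumed threshold; real tuning would differ

def peer_too_close(rssi_dbm: float) -> bool:
    """True when a peer's signal is strong enough to warrant keeping distance."""
    return rssi_dbm > TOO_CLOSE_DBM

def retreat_speed(rssi_dbm: float, gain_m_per_s_per_db: float = 0.02) -> float:
    """Hypothetical sideways speed command that grows as the peer gets closer."""
    if not peer_too_close(rssi_dbm):
        return 0.0
    return gain_m_per_s_per_db * (rssi_dbm - TOO_CLOSE_DBM)
```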

“The main advantages of this method are that it does not require extra hardware on the drone and that it requires very few computations,” said PhD student Kimberly McGuire, who carried out the project.

In an edited interview below, Kimberly McGuire tells Tech Briefs where the tiny swarms of drones are expected to take off from here.

Tech Briefs: What does the drone look like?

Kimberly McGuire: You have to imagine that these flying robots are very small quadrotors, which means that they stay in the air with four rotors. They’re so small that they can fit in the palm of your hand. They’re very lightweight — only 33 grams. They are like a smaller version of the bigger quadrotors that you can buy for making a video when you’re going on a hike outside.

Tech Briefs: Why is it important that the drones are so small?

Kimberly McGuire: Small drones are more useful for future applications that we have in mind: greenhouse inspection, warehouse inspection, and search-and-rescue scenarios. In these situations, when they have to actually fly around humans, it’s important that they are very lightweight. You wouldn’t feel comfortable if a three-kilo drone was flying over your head.

We really wanted to focus on making them small, because it’s safer for the humans around them. But when they are small, it’s also possible to use many of them at once, because they are quite cheap compared with their larger counterparts.

Tech Briefs: Take me through the proof-of-concept task that the drone achieved.

Kimberly McGuire: We wanted to place all the drones in one location where they could explore a building – where they take off, spread out, and try to cover as many rooms as possible.

While they’re flying and they come across each other, they can sense each other through inter-communication. They can exchange information. For instance, one drone can “say,” “My preferred direction is that way,” and the other drone says, “Okay, I will change my preferred direction to another way.” Of course, avoiding each other is also a very important task.
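The heading exchange she describes might look something like the Python sketch below, where two drones that meet compare preferred directions and one of them picks a new one so the pair spreads apart. The separation threshold and the ID tie-break are assumptions of this sketch, not the published algorithm.

```python
def resolve_heading(my_id: int, my_heading_deg: float,
                    peer_id: int, peer_heading_deg: float,
                    min_separation_deg: float = 90.0) -> float:
    """Return this drone's (possibly updated) preferred heading after a meeting."""
    # Smallest angular difference between the two preferred headings.
    diff = abs((my_heading_deg - peer_heading_deg + 180.0) % 360.0 - 180.0)
    if diff >= min_separation_deg:
        return my_heading_deg                   # already heading apart: keep course
    if my_id < peer_id:
        return my_heading_deg                   # lower ID keeps its direction
    return (peer_heading_deg + 180.0) % 360.0   # higher ID heads the opposite way
```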

Once they are about halfway through their battery life, they switch their behavior and try to return to their starting position. They try to find a way back by listening to the home beacon, which is a 2.4 GHz radio transmitter, and they navigate back by measuring the signal strength of that beacon. That’s how they find their way home again.
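Because signal strength alone gives no direction, one simple homing tactic, sketched in Python below, is to hold the current heading while the averaged beacon RSSI improves and to turn when it fades. The window size and turn angle are assumptions; the article does not spell out the team's exact controller.

```python
from collections import deque

class BeaconHoming:
    """Toy gradient-following on beacon signal strength (RSSI, in dBm)."""

    def __init__(self, window: int = 10, turn_deg: float = 60.0):
        self.readings = deque(maxlen=window)  # recent RSSI samples
        self.last_avg = None
        self.turn_deg = turn_deg

    def update(self, rssi_dbm: float) -> float:
        """Return a heading change in degrees for this control step."""
        self.readings.append(rssi_dbm)
        avg = sum(self.readings) / len(self.readings)
        if self.last_avg is None or avg >= self.last_avg:
            change = 0.0             # signal steady or improving: hold course
        else:
            change = self.turn_deg   # signal fading: try a new heading
        self.last_avg = avg
        return change
```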

If the drone is carrying a camera with an SD card reader, the user can just take that SD card and watch the videos that the drone recorded while it was on its way.

Tech Briefs: How does a drone understand its environment?

Kimberly McGuire: Currently, we really focused on the navigational purpose – simply going from A to B. Bigger vehicles and drones have a laser scanner or an HD camera, with which they make a pretty map of their environment, and they try to navigate and localize themselves in that 3D map.

We tried to look more to the animal kingdom and insect-inspired navigation. Bees, for example, never use a 3D map. They actually use very simple behaviors to explore the environment that surrounds their hive, and they are able to return home.

The tiny quadrotor detects obstacles with very lightweight laser range sensors. These are not sensors that spin around; they are fixed, pointing in different directions on the small quadrotor. The same sensors are also used to navigate farther into the building: the drones follow the walls.

Tech Briefs: Are they able to detect objects?

Kimberly McGuire: In our research, the drones are not programmed onboard to detect anything of interest. We hope that future researchers will do that, and that we have a navigational tool kit and platform for researchers to use and add sensors to.

Hardware and communication specifics. Parts A to D of this image show the tiny drones, including their components, weight, and battery consumption. Part E shows the communication scheme used for the six-drone experiment. Here, a counter regulates when each drone transmits a message (msg) to another drone (for counter 1: drone 1 to drone 2, drone 2 to drone 3, and so on). Within each counter period, a drone transmits its message at a time offset based on its ID. Six PAs were used for the six communication channels to receive logging from each drone; however, these can be replaced by one if no telemetry is required.
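One plausible reading of that counter-driven schedule is sketched in Python below: on each counter tick, drone i addresses its successor, and each drone transmits at a small time offset derived from its ID so the messages do not collide. The slot length, offset, and wrap-around rule are assumptions, not values from the paper.

```python
NUM_DRONES = 6
SLOT_MS = 20       # assumed duration of one counter tick
ID_OFFSET_MS = 2   # assumed per-ID transmit offset within a tick

def schedule(counter: int, my_id: int) -> dict:
    """Return this drone's receiver target and transmit time for one counter tick."""
    target = ((my_id - 1 + counter) % NUM_DRONES) + 1   # counter 1: 1 -> 2, 2 -> 3, ..., 6 -> 1
    tx_time_ms = counter * SLOT_MS + my_id * ID_OFFSET_MS
    return {"send_to": target, "transmit_at_ms": tx_time_ms}

# Example: at counter 1, drone 2 sends to drone 3 shortly after drone 1 sends to drone 2.
print(schedule(counter=1, my_id=2))
```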

Tech Briefs: What are the biggest challenges in creating a truly autonomous aircraft?

Kimberly McGuire: When I think of “full onboard autonomy,” I’m referring to control without the help of an external computer doing the calculations and sending commands. The biggest challenge to achieve full autonomy on smaller drones is that there is limited energy and limited computational capability.

If you want to design a navigational strategy for these small things, you really have to think about the minimal necessary requirements for them to move through such an environment completely by themselves. You have to think in a very minimal way. That’s why we tried to look at the insect kingdom.

Another big challenge: There is only a limited amount of sensing that you can put on these drones. When we were adding an extra camera, we had to make some adjustments to the sensing capabilities already onboard, just to make sure that the drone was still able to fly for the full eight minutes. Any sensing that you add on top of that will eat away at its battery life. That’s a balance we have to be very aware of while developing such a strategy: It’s not only the software; it’s also the hardware.

Tech Briefs: What’s next?

Kimberly McGuire: We showed a proof of concept, that this can be possible with a swarm of these tiny quadrotors weighing only 33 grams. We hope that people will take inspiration from this, and they’re able to make this very minimal navigational package even more robust.

Because it’s so computationally efficient, people can actually add more functionalities to it that are useful for certain missions. If users need to go into a building, for example, they can add something to detect humans who are in danger in collapsed buildings. We hope to have made a good base for other researchers to build upon.
