In July of 2019, we on the Tech Briefs podcast team found ourselves in the middle of a drone testing ground.

We spoke with the engineers competing in the AlphaPilot Innovation Challenge. Sponsored by Lockheed Martin, AlphaPilot asked nine teams to create software that could guide a drone around a course — autonomously. 

In this episode of Tech Briefs' original podcast series Here's an Idea, we explore the highs and lows of the developer summit, held at MIT, and talk to some of the engineers who are writing the code and designing the hardware to send drones through obstacle courses, all without a pilot.

Now, these teams are competing in the Artificial Intelligence Robotic Racing (AIRR) Circuit. And the winning team will take home $1 million in cash.

In the episode, we also speak with a lieutenant colonel from DARPA, who’s leading a competition of his own – a "digital dogfight."

A fast, pilotless aircraft has the chance to impact our everyday lives in ways that are both exciting and frightening.

Will we be able to trust autonomous drones? Can drones beat a human pilot? Why should we race drones? 

Listen to the episode below.

Subscribe or listen via your preferred podcast provider here.

Episode Highlights:

  • 0:17: What does a testing ground look like exactly? We bring you into the scene at MIT, where nine teams took their drones out for a spin.
  • 5:32: Ryan Gury, Chief Technology Officer of the Drone Racing League, takes us around the DRL headquarters.
  • 10:30: Lockheed Martin's Chelsea Sabo and Georgia Tech grad student and AlphaPilot team captain Manan Gandhi bring us to their workroom and describe their biggest challenges when designing A.I. for flight.
  • 17:24: When it comes to autonomous drone competitions, DRL isn't the only game in town. Lieutenant Colonel Dan "Animal" Javorsek's ACE (Air Combat Evolution) project uses simulation to pit one pilotless aircraft against another.

Here are excerpts from our interview with DRL's Ryan Gury and DARPA's Lt. Col. Javorsek.

Gury, on the joy of First-Person View (FPV):

"FPV is crazy. You wear a video screen, and you control a small racing machine that has absolute instant acceleration and can maneuver in any direction, and when you get good at it, it's just like flight. You forget that you're on the ground.

I ride a motorcycle, I love the feeling of speed, and when you're flying FPV there's nothing faster or greater than that. It's a huge rush, it's super fast, and you can't get hurt, so you can take all kinds of ridiculous risks while you're flying."

Gury, on what the Racer AI drone looks like:

"The front of the drone is a very large obtuse angle so it looks like a flying crossbow. It can generate 16 to 20 pounds of thrust; it uses nine inch propellers compared to the six or seven which is what we use; it has a canopy; it looks like a bat mobile. 

It's basically a massive computer jammed into a small size, and all it's doing is running Photoshop filters. It's perfect for computer vision. We're able to take something that would have been the size of a refrigerator ten years ago, smash it into a drone, and, in this case, try to contend with humans."

Gury, on why autonomous drones can't fail:

"... Even if a human is flying a drone and they crash it, you still have someone to blame. I don't think there's any forgiving nature about robots failing. I think in order for autonomous technologies to succeed, they have to be almost perfect."

Lt. Col. Javorsek, on the importance of autonomous aircraft:

"There may be some good value that you can get out of having an autonomous system that's able to offload a lot of the more mundane tasks: flying the airplane, maneuvering it to a position of advantage.

If we have autonomy that's able to handle these lower-level tasks, and can slowly get better at handling the more complicated ones, we preserve the limited cognitive resources that we have to handle these really challenging scenarios, right? And so, when it comes to, say, combat search and rescue, if I have an environment that is threatened, if the autonomy is able to automatically handle a lot of the basic flying tasks, or even some of the basic threat reactions, I as the human battle manager, can think at a level that allows me to maybe do longer range planning for how to recover the downed air crew or whatever is the actual mission for that day."

Lt. Col. Javorsek, on how autonomous an autonomous aircraft actually is:

"Really, it turns out that the manpower footprint that's required to operate an unmanned aerial vehicle today is quite high, on the order of 10 or so people. And when you start looking at the manpower requirement there, you start to recognize that we are in an area that is in bad need of autonomy. And it's not for technological limitations. Most of that is because we don't really trust the systems to work the way that they could do."

Lt. Col. Javorsek, on why he chose dogfighting to test A.I. in the air:

"The challenge-problem that we put new human pilots through is not because we want these future pilots to be able to handle the large number of dogfights that are going to happen in the future, because we don't really anticipate that. But rather it's kind of an entry gate that introduces them to a very dynamic environment that requires them to make decisions very quickly. But that isn't burdened with a lot of additional information, like complicated sensors, radars and other types of sensors, that can cause the decisions to be more complicated."

More Photos from the MIT Drone Testing Summit:

Tech Briefs editor Billy Hurley (right) with James Slider, director of special projects at DRL, testing out autonomous drones at MIT.
Engineers at MIT work on software to support autonomous aircraft.
AlphaPilot Challenge captain Manan Gandhi (left), with Hurley.

Sign up for the Here's an Idea newsletter.