The world is an uncertain place for drones, whether they are delivering packages in populated cities or weaving through the trees and obstacles of a forest.
Researchers from the Massachusetts Institute of Technology’s (MIT) Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed a new system that accounts for this uncertainty, allowing drones to consistently fly at 20 miles per hour through dense environments, including forests and warehouses.
The system—dubbed NanoMap—recognizes that the drone’s estimate of its own position drifts over time, and explicitly models and plans around that uncertainty.
“Overly confident maps won’t help you if you want drones that can operate at higher speeds in human environments,” graduate student Pete Florence, lead author of the study, said in a statement. “An approach that is better aware of uncertainty gets us a much higher level of reliability in terms of being able to fly in close quarters and avoid obstacles.”
NanoMap uses a depth-sensing system to piece together a series of measurements about the drone’s immediate surroundings, allowing it to not only make motion plans for its current field of view, but also anticipate how it should move around in the hidden fields of view that it has already seen.
“It’s kind of like saving all of the images you’ve seen of the world as a big tape in your head,” Florence said. “For the drone to plan motions, it essentially goes back into time to think individually of all the different places that it was in.”
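The idea above—keeping a rolling history of depth snapshots, each tagged with the pose it was captured from, and searching back through them when planning—can be sketched in miniature. The 2D class and names below are hypothetical illustrations, not NanoMap's actual implementation:

```python
from collections import deque
import math

class ViewHistory:
    """Toy 2D sketch of a view-history planner: store each depth
    snapshot with the pose it was taken from, then answer obstacle
    queries by walking back through past fields of view."""

    def __init__(self, fov_radius=5.0, max_views=50):
        self.fov_radius = fov_radius          # assumed sensor range
        self.views = deque(maxlen=max_views)  # newest view first

    def add_view(self, pose, obstacle_points):
        # pose: (x, y) where the snapshot was taken
        # obstacle_points: (x, y) obstacles measured from that pose
        self.views.appendleft((pose, obstacle_points))

    def nearest_obstacle(self, query):
        # "Go back in time": find the most recent view whose field
        # of view covered the query point, and report the closest
        # obstacle that view recorded.
        for pose, points in self.views:
            if math.dist(pose, query) <= self.fov_radius:
                if not points:
                    return None
                return min(points, key=lambda p: math.dist(p, query))
        return None  # point was never observed: unknown space

h = ViewHistory(fov_radius=5.0)
h.add_view((0.0, 0.0), [(2.0, 0.0)])    # older view
h.add_view((10.0, 0.0), [(12.0, 0.0)])  # current view
```

A query near (1, 0) falls outside the current field of view, so the planner falls back to the older snapshot rather than treating the region as unmapped.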
During initial testing, when NanoMap did not model uncertainty and the drone drifted just 5 percent away from where it was expected to be, the drone crashed more than once every four flights.
However, after accounting for uncertainty, the crash rate dropped to just 2 percent.
In recent years, computer scientists have worked on algorithms that allow drones to know where they are, what is around them and how to get from one point to another. Common approaches like simultaneous localization and mapping (SLAM) take raw data of the world and convert it into mapped representations.
The output of SLAM methods is not typically used to plan motions. To compensate, researchers often use methods like “occupancy grids,” where many measurements are incorporated into one specific representation of the 3D world.
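An occupancy grid fuses many range measurements into one fixed voxel representation, with each cell marked free or occupied. A minimal sketch, with hypothetical resolution and volume:

```python
import numpy as np

RESOLUTION = 0.5  # metres per cell (assumed)
# 20m x 20m x 5m volume; False = free, True = occupied
grid = np.zeros((40, 40, 10), dtype=bool)

def _cell(point):
    """Map a metric (x, y, z) point to its voxel indices."""
    return tuple(int(c / RESOLUTION) for c in point)

def mark_occupied(point):
    """Fuse one measured obstacle point into the grid."""
    grid[_cell(point)] = True

def is_occupied(point):
    """Collision check against the fused representation."""
    return bool(grid[_cell(point)])

mark_occupied((3.2, 7.9, 1.1))
```

Note that nearby measurements collapse into the same cell, which is exactly the fusion the article describes—and also why the grid is only as trustworthy as the position estimates behind those measurements.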
However, the data can be both unreliable and hard to gather quickly. At high speeds, computer-vision algorithms cannot make much of their surroundings, forcing drones to rely on inexact data from the inertial measurement unit (IMU) sensor, which measures things like the drone’s acceleration and rate of rotation.
NanoMap operates under the assumption that to avoid an obstacle you do not have to take 100 different measurements and find the average to figure out its exact location in space, but rather you can gather enough information to know that the object is in a general area.
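One simple way to act on "the object is in a general area" rather than an exact point is to keep a mean obstacle estimate plus an uncertainty radius, and demand clearance from the whole uncertain region. The function below is a hedged illustration of that style of check, not NanoMap's actual algorithm; the 3-sigma inflation factor is an assumption:

```python
import math

def safe_to_traverse(waypoint, obstacle_mean, obstacle_sigma,
                     clearance=1.0):
    """Uncertainty-aware clearance check (illustrative).

    waypoint, obstacle_mean: (x, y) positions in metres
    obstacle_sigma: standard deviation of the obstacle's
        estimated position (pose drift + sensor noise)
    clearance: minimum safe distance from the obstacle itself
    """
    # Inflate the obstacle by a few standard deviations of its
    # positional uncertainty before checking clearance.
    inflated = 3.0 * obstacle_sigma + clearance
    return math.dist(waypoint, obstacle_mean) > inflated
```

With this style of check, a sloppy but honest estimate (large sigma) simply forces a wider berth, instead of requiring 100 measurements to pin the obstacle down precisely.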
“The key difference to previous work is that the researchers created a map consisting of a set of images with their position uncertainty rather than just a set of images and their positions and orientation,” Sebastian Scherer, a systems scientist at Carnegie Mellon University’s Robotics Institute, said in a statement. “Keeping track of the uncertainty has the advantage of allowing the use of previous images even if the robot doesn’t know exactly where it is and allows for improved planning.”
This new technology could enable drones to be used for everything from search-and-rescue missions and defense to package delivery and entertainment, and could also benefit self-driving cars and other forms of autonomous navigation.