Scientists have developed a new virtual training ground that can help fine-tune fast-flying drones without a lot of messy cleanup or broken windows.
Researchers from the Massachusetts Institute of Technology have created a new virtual reality system that allows drones to see a rich virtual environment in an empty physical space.
The new system—dubbed “Flight Goggles”—could serve as a virtual testbed for a range of environments and conditions in which researchers want to test and train fast-flying drones.
“We think this is a game-changer in the development of drone technology, for drones that go fast,” Sertac Karaman, an associate professor of aeronautics and astronautics at MIT, said in a statement. “If anything, the system can make autonomous vehicles more responsive, faster, and more efficient.”
Flight Goggles comprises a motion-capture system, an image-rendering program, and electronics that enable the researchers to quickly process images and transmit them to the drone. The researchers lined the test space—a hangar-like gymnasium—with motion-capture cameras that track the orientation of the drones as they fly.
With the image-rendering system, the researchers can draw up photorealistic scenes, like a loft apartment or a living room, and beam the virtual images to the drone as it is flying through the empty facility.
“The drone will be flying in an empty room, but will be ‘hallucinating’ a completely different environment, and will learn in that environment,” Karaman said.
The drone can process the virtual images at a rate of about 90 frames per second—about three times faster than the human eye can see and process images—thanks to custom-built circuit boards that integrate a powerful embedded supercomputer with an inertial measurement unit and a camera. The hardware fits into a small, 3D-printed nylon and carbon-fiber-reinforced drone frame.
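The loop described above—track the drone's pose, render the virtual scene from that viewpoint, and beam each frame to the vehicle—can be sketched in a few lines. This is a minimal illustrative mock-up, not the actual Flight Goggles software: the names (`Pose`, `capture_pose`, `render_view`) and the scripted flight path are assumptions for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    """Tracked position and heading of the drone (illustrative)."""
    x: float
    y: float
    z: float
    yaw: float  # heading in radians

def capture_pose(t: float) -> Pose:
    """Stand-in for the motion-capture system: returns the drone's
    tracked pose at time t (here, a scripted path for demonstration)."""
    return Pose(x=2.0 * t, y=0.0, z=1.5, yaw=math.sin(t))

def render_view(pose: Pose, frame_id: int) -> dict:
    """Stand-in for the photorealistic renderer: produces a frame of the
    virtual scene (e.g. a living room) as seen from the drone's pose."""
    return {"frame": frame_id, "camera_pose": pose}

def run_pipeline(duration_s: float, fps: int = 90) -> list:
    """Render and 'transmit' one virtual frame per tick at the target rate."""
    frames = []
    for i in range(int(duration_s * fps)):
        t = i / fps
        pose = capture_pose(t)         # 1. track the drone in the empty room
        frame = render_view(pose, i)   # 2. render the scene from its viewpoint
        frames.append(frame)           # 3. beam the frame to the drone
    return frames

frames = run_pipeline(duration_s=1.0)  # one second of flight at 90 fps
print(len(frames))                     # → 90
```

The key design point is that the drone's perception runs entirely on rendered imagery while its physics stays real, which is why crashes in the virtual scene leave the physical room untouched.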
Traditionally, to train autonomous drones, researchers fly the vehicles in large, enclosed testing grounds, often rigging large nets to catch careening vehicles and setting up physical props like windows and doors, which makes testing an expensive process.
That kind of setup is especially impractical for fast-flying drones, which need to process visual information quickly as they move through an environment.
“The moment you want to do high-throughput computing and go fast, even the slightest changes you make to its environment will cause the drone to crash,” Karaman said. “You can’t learn in that environment. If you want to push boundaries on how fast you can go and compute, you need some sort of virtual-reality environment.”
In the new setup, the researchers conducted a set of experiments, including one in which the drone learned to fly through a virtual window about twice its size.
The window was set within a virtual living room, and as the drone flew through the actual, empty testing facility, the researchers beamed images of the living-room scene, from the drone’s perspective, back to the vehicle. The researchers tuned a navigation algorithm so the drone could learn on the fly.
Over 10 flights, the drone successfully flew through the virtual window 361 times at around five miles per hour, crashing into the window only three times.
For the final test, the researchers turned on the drone’s onboard camera to enable it to see and process its actual surroundings at the facility, which included an actual window.
Using the navigation algorithm that the researchers tuned in the virtual system, the drone, over eight flights, was able to fly through the real window 119 times, only crashing or requiring human intervention six times.
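As a rough check on the counts reported above, the virtual and real trials work out to success rates of about 99% and 95%. A short, purely illustrative calculation:

```python
# Illustrative arithmetic on the pass/failure counts reported in the article.
def success_rate(successes: int, failures: int) -> float:
    """Fraction of attempts that succeeded."""
    return successes / (successes + failures)

virtual = success_rate(361, 3)  # virtual window: 361 passes, 3 crashes
real = success_rate(119, 6)     # real window: 119 passes, 6 crashes/interventions

print(f"{virtual:.1%}")  # → 99.2%
print(f"{real:.1%}")     # → 95.2%
```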
“It does the same thing in reality,” Karaman said. “It’s something we programmed it to do in the virtual environment, by making mistakes, falling apart, and learning. But we didn’t break any actual windows in this process.”
Karaman said extreme robo-sports were his initial motivation for the study, and within the next two to three years he plans to enter a drone-racing competition with an autonomous drone in an attempt to beat the best human pilots.