MIT-Developed Flight Goggles Allow Drones to Navigate Around Intangible Objects

Researchers at MIT’s Laboratory for Information and Decision Systems and its Computer Science and Artificial Intelligence Laboratory have developed a virtual reality-based system that lets drones navigate rooms while avoiding virtual obstacles. The “flight goggles” system allows an unmanned aerial vehicle to “see” a complete virtual environment and navigate around intangible objects. It comprises a motion capture system, an image rendering program, and electronics that enable the team to quickly process images and transmit them to the drone. The drone can process the virtual images at a rate of about 90 frames per second – around three times as fast as the human eye can see and process images. The technology could significantly reduce the number of crashes that drones experience in actual training sessions, and could serve as a virtual test bed for any number of environments and conditions in which researchers might want to train fast-flying drones.
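The pipeline described above can be sketched as a simple loop: read the drone's pose from motion capture, render a virtual frame for that viewpoint, and transmit it back to the vehicle, repeating at roughly 90 Hz. The sketch below is purely illustrative; `capture_pose`, `render_virtual_frame`, and `transmit` are hypothetical stand-ins, not functions from MIT's actual system.

```python
import math

FRAME_RATE_HZ = 90               # rate quoted in the article
FRAME_PERIOD = 1.0 / FRAME_RATE_HZ

def capture_pose(t):
    """Stand-in for the motion-capture system: returns the drone's
    position (x, y, z) and yaw at time t (here, a toy circular path)."""
    return (math.cos(t), math.sin(t), 1.0), t % (2 * math.pi)

def render_virtual_frame(pose):
    """Stand-in for the image renderer: produces a 'frame' for the
    drone's current viewpoint. The real system renders full imagery."""
    position, yaw = pose
    return {"position": position, "yaw": yaw}

def transmit(frame):
    """Stand-in for the radio link that sends the frame to the drone."""
    return frame

def run_pipeline(n_frames):
    """Run the capture -> render -> transmit loop for n_frames steps."""
    frames = []
    for i in range(n_frames):
        t = i * FRAME_PERIOD            # timestamp of this frame
        pose = capture_pose(t)
        frames.append(transmit(render_virtual_frame(pose)))
    return frames

frames = run_pipeline(9)   # one tenth of a second of virtual imagery
```

Because the obstacles exist only in the rendered frames, a crash into a "window" costs nothing physically, which is the point of the approach.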

“We think this is a game-changer in the development of drone technology, for drones that go fast,” said Sertac Karaman, associate professor of aeronautics and astronautics at MIT. “If anything, the system can make autonomous vehicles more responsive, faster, and more efficient.”

Karaman was initially motivated by a new, extreme robo-sport: competitive drone racing, in which remote-controlled drones, piloted by human players, attempt to out-fly each other through an intricate maze of windows, doors, and other obstacles. He was also inspired by the study of birds flying through cluttered environments – such as dense forests – leading to his research into high-speed navigation through randomly generated obstacle fields.

High-speed navigation requires an entirely new training regimen. Currently, training autonomous drones is a physical task involving large, enclosed testing grounds fitted with nets and props such as windows and doors. Karaman says testing drones this way can work for vehicles that are not meant to fly fast, such as drones programmed to slowly map their surroundings, but fast-flying vehicles that must process visual information quickly require a different regimen.

“The moment you want to do high-throughput computing and go fast, even the slightest changes you make to its environment will cause the drone to crash,” Karaman says. “You can’t learn in that environment. If you want to push boundaries on how fast you can go and compute, you need some sort of virtual-reality environment.”