Virtual Image Synthesis and Transformation for Autonomy Creates Photorealistic World, Helps Autonomous Cars Navigate

Researchers at MIT have devised a simulation system for teaching autonomous vehicles how to drive. VISTA – Virtual Image Synthesis and Transformation for Autonomy – creates a photorealistic world with multiple steering possibilities, helping the cars learn to navigate various scenarios before driving down real streets.

VISTA uses a small dataset based on how human drivers operate their vehicles and synthesizes an enormous number of new viewpoints from trajectories that the vehicle could take in the real world. The control system, or “controller,” for autonomous vehicles is rewarded for the distance it travels without crashing, which requires it to learn by itself how to reach a destination safely. In doing so, the vehicle learns to safely navigate any situation it encounters, including regaining control after swerving between lanes or recovering from near crashes.
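The reward scheme described above can be sketched in a few lines. This is a hypothetical illustration, not MIT's actual code: it assumes a per-step reward proportional to the distance covered, with the episode terminating (and earning nothing further) on a crash.

```python
# Hypothetical sketch of a distance-based reward: the controller earns
# reward equal to the distance it covers each step, and the episode
# ends as soon as it crashes.

def step_reward(speed_mps: float, dt: float, crashed: bool) -> tuple[float, bool]:
    """Return (reward, done) for one simulation step."""
    if crashed:
        return 0.0, True          # no reward on a crash; episode terminates
    return speed_mps * dt, False  # reward = distance covered this step


def episode_return(steps) -> float:
    """Total distance-based return over a list of (speed, dt, crashed) steps."""
    total = 0.0
    for speed, dt, crashed in steps:
        reward, done = step_reward(speed, dt, crashed)
        total += reward
        if done:
            break
    return total
```

Under this scheme, maximizing return is the same as maximizing crash-free distance, which is why the controller must learn safe recovery maneuvers on its own rather than imitating recorded human steering.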

“This is equivalent to providing the vehicle with an infinite number of possible trajectories,” says Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). “Because when we collect physical data, we get data from the specific trajectory the car will follow. But we can modify that trajectory to cover all possible ways and environments of driving. That’s really powerful.”

The researchers reported in a paper published in IEEE Robotics and Automation Letters that a self-driving controller trained entirely in the VISTA simulator was able to navigate unfamiliar streets. When placed in situations that mimicked various near-crash scenarios, it was able to steer the car back onto a safe driving trajectory within a few seconds.

“It’s tough to collect data in these edge cases that humans don’t experience on the road,” said lead author Alexander Amini, from the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). “In our simulation, however, control systems can experience those situations, learn for themselves to recover from them, and remain robust when deployed onto vehicles in the real world.”

After the controller successfully drove 10,000 kilometers (~6,200 miles) in simulation, the researchers deployed the feedback-trained program on an actual autonomous vehicle and tested it in the real world. Next, the researchers hope to simulate all types of road conditions from a single driving trajectory, such as night and day, and sunny and rainy weather. They also hope to simulate more complex interactions with other vehicles on the road.