3 Ways AR/VR Are Improving Autonomous Vehicles
A lot of work still needs to be done before we start hailing automated Ubers and Lyfts and whizzing around cities in fully autonomous vehicles. And Virtual, Augmented, and Mixed Reality will play a large role in making the (safe) driverless vehicles of tomorrow a reality.
Virtual, Augmented and Mixed Reality (VAMR) aren’t just for games and entertainment, despite what many think. VAMR will touch every aspect of our lives, from how we shop for clothes to the way we consume media.
Autonomous vehicles, another of today's most exciting disruptive technologies, will also be shaped by VAMR. Here's a look at three ways VAMR is improving autonomous vehicles.
Simulated Testing Ground
Mixed Reality (MR) Prototyping will provide a safe testing ground for autonomous vehicles, which are yet to be perfected. The Mixed Reality Lab at the University of Southern California (USC) has been using MR Prototyping to explore human-machine teaming; as of this writing, the lab has successfully paired people with autonomous drones.
Traditional research, design, and development would put real drones and real people in very real danger while algorithms and other aspects are sorted out. Instead, the USC lab has successfully virtualized all aspects of the pairing within a game engine – virtual drones and virtual humans, together, in virtual environments.
Within this simulation, the level of “reality” can vary, and geo-specific terrain can be used if desired. Many of the design parameters and algorithmic problems associated with human-drone pairing can be solved in the virtual space. Afterwards, reality gets “mixed,” with some of the elements (e.g. autonomous drones) flying in both the virtual and physical space, and the humans remaining exclusively in the virtual space. More of the problems are resolved, and then the system can go fully physical when the algorithms are well-behaved.
Testing human-drone pairing in a VAMR space doesn't differ that much from testing autonomous vehicles in a wholly virtual space: engineers can test, tweak, and ultimately educate autonomous vehicles virtually, removing all danger to humans. Additionally, since the virtual and physical can be tied together in real time, remote collaboration and connection to other virtual systems become possible, further removing the danger to humans and improving testing outcomes.
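The staged workflow described above can be sketched in code. The sketch below is a hypothetical illustration, not the USC lab's actual system: every agent (drone or human) starts out purely virtual, individual agents are promoted to the physical world once the algorithms are well-behaved, and a "mix level" tracks how far testing has moved from fully virtual toward fully physical. All class and agent names are illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical sketch of the staged "mixing" workflow: agents begin in
# simulation and are individually promoted to physical operation.

@dataclass
class Agent:
    name: str
    physical: bool = False  # False = exists only in the virtual space

class MixedRealityTestbed:
    def __init__(self, agents):
        self.agents = list(agents)

    def promote(self, name):
        """Move one agent from the virtual space into the physical world."""
        for agent in self.agents:
            if agent.name == name:
                agent.physical = True

    def mix_level(self):
        """Fraction of agents operating physically (0.0 = fully virtual)."""
        return sum(a.physical for a in self.agents) / len(self.agents)

# Stage 1: everything is virtual -- drones and humans alike.
testbed = MixedRealityTestbed(
    [Agent("drone-1"), Agent("drone-2"), Agent("human-1")]
)
print(testbed.mix_level())  # 0.0

# Stage 2: the drones fly physically while the human stays virtual.
testbed.promote("drone-1")
testbed.promote("drone-2")
print(round(testbed.mix_level(), 2))  # 0.67
```

The same pattern applies to autonomous vehicles: the `mix_level` would climb from zero only after each subsystem proves itself in simulation, keeping humans out of harm's way until the end.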
Visual Displays Will Improve Situational Awareness
New forms of visual displays, combined with aural and haptic feedback, will be designed to improve driver situational awareness and increase safety. Combining these with other active systems based on computer vision, such as lane-departure warning and automatic braking, promises lower accident and fatality rates.
There is perhaps a romantic notion of the traditional sedan dashboard evolving into something akin to an F-35 cockpit, but that is not how autonomous vehicle interfaces should be designed. Pilots, especially those who fly fighter jets, are highly trained and a far cry from the average driver in reaction time, decision-making ability, and much more. For normal drivers, less is probably more.
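One way to express the "less is more" principle is alert arbitration: each event gets at most one feedback channel, chosen by urgency, and low-urgency events are suppressed entirely rather than cluttering the display. The thresholds, channel names, and event names below are illustrative assumptions, not a published design.

```python
# Hypothetical sketch of urgency-based alert arbitration for a driver
# interface: one modality per event, silence for everything minor.

def choose_feedback(event, urgency):
    """Return the single feedback modality for an event, or None for silence.

    urgency is assumed to be a score in [0, 1]; thresholds are illustrative.
    """
    if urgency >= 0.9:
        return "haptic"   # imminent danger: wheel/seat vibration demands action
    if urgency >= 0.6:
        return "aural"    # e.g. a lane-departure chime
    if urgency >= 0.3:
        return "visual"   # low-key dashboard or HUD indicator
    return None           # informational only: stay quiet

print(choose_feedback("forward_collision", 0.95))   # haptic
print(choose_feedback("lane_departure", 0.70))      # aural
print(choose_feedback("speed_limit_change", 0.10))  # None
```

The design choice here mirrors the point above: unlike a fighter cockpit, which assumes a highly trained operator who can absorb many simultaneous cues, an interface for average drivers routes only the most urgent information through the most attention-grabbing channel.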
Portable Environment for Riders
AI will end up being a major player in autonomous vehicles, likely combining highly optimized computer vision algorithms with next-generation path planning and traffic-flow monitoring and metering. In the case of fully autonomous vehicles, VAMR may end up as the "portable environment" for riders.
For example, as telepresence capabilities improve, the ability to "take a meeting" while riding to work may become viable. While we run the risk of personal isolation, VAMR combined with autonomous travel could provide the productivity increase technology has long promised, while also offering a bit of an escape from mundane commutes. Additionally, VAMR combined with autonomous travel could replace the traditional phone call taken while driving, which would significantly reduce accident and fatality rates.
Autonomous vehicles are upon the world, and there’s much to be excited about. But before we start hailing our automated Ubers and Lyfts, whizzing around cities in the backseat of a car with no driver, a lot of work needs to be done. VAMR will play a larger role than many think in making the (safe) driverless vehicles of tomorrow a reality.
About the Author
Todd Richmond, an IEEE member, is Director of USC's Mixed Reality Lab, where he works in a variety of areas related to emerging disruptive technologies tied to Augmented Reality, Virtual Reality, Mixed Reality, and Artificial Intelligence, and their implications and applications for training, learning, and operations. His team's focus areas include future environments for communication and collaboration, immersive technologies, interactive education, and visualization and analytics. Richmond is also research faculty at USC's School of Cinematic Arts, working to better understand VAMR.
Richmond earned a B.A. in chemistry from the University of San Diego and a Ph.D. in chemistry from Caltech, followed by a postdoctoral fellowship in protein engineering at U.C. San Francisco.