There’s nothing quite like the call of the open road. Whether it’s quality conversation with a friend or family member on an epic road trip, enjoying a podcast or your favorite album while driving solo, or something in between, just about everyone has some fond memories behind the wheel or in the passenger seat. But what if we could make our time spent in cars more productive, social, and entertaining than ever before?
That question is at the heart of our partnership with BMW. Announced in 2021, this research project explores how augmented and virtual reality could one day be integrated into smart vehicles to safely enhance the passenger experience. BMW has long been a leader in cutting-edge automotive technology (they don't call it the ultimate driving machine for nothing), and the BMW Research Group first began investigating the possibilities of integrating AR into its cars more than a decade ago. Teaming up with Reality Labs Research to tackle this problem and level up the future of travel was a natural next step.
Oculus Insight vs. the open road
At first blush, it may not be obvious why delivering quality AR and VR content to a car passenger is a problem. After all, modern VR headsets are equipped with an array of sensors. And since the 2019 commercial debut of Meta's Oculus Insight technology, which combined state-of-the-art computer vision and state estimation, we've been able to cut the cord and deliver a tracking system that accommodates the full range of a person's movements (known as six degrees of freedom, or 6DOF) while pinpointing the positions of the headset and both handheld controllers in space, all with sub-millimeter accuracy.
But moving vehicles pose a tricky challenge: Tracking technology like Oculus Insight uses both inertial measurement units (IMUs) and cameras to precisely estimate the headset's location and motion. However, in a moving environment (more precisely, a non-inertial reference frame), these two modalities come into conflict: the cameras observe motion relative to the inside of the car, while the IMUs measure acceleration and rotational velocity relative to the world. That mismatch means today's VR headsets can't display stable virtual content inside a vehicle that is turning or accelerating.
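To get a feel for the scale of this mismatch, consider a naive tracker during a gentle turn: the cabin cameras report no motion, while the headset IMU measures the car's centripetal acceleration. The numbers below are illustrative assumptions, not figures from the project:

```python
# Rough, illustrative numbers (assumed, not from the project): how far a
# naive 6DOF tracker would drift if it double-integrated the centripetal
# acceleration the IMU reports while the cameras see a static cabin.
v = 15.0  # vehicle speed in m/s (~54 km/h)
r = 50.0  # turn radius in m
t = 1.0   # integration window in s

a = v ** 2 / r            # centripetal acceleration measured by the IMU
drift = 0.5 * a * t ** 2  # apparent position error accumulated over t seconds

print(f"acceleration: {a} m/s^2")    # 4.5 m/s^2
print(f"apparent drift: {drift} m")  # 2.25 m
```

A couple of meters of apparent drift in a single second of a moderate turn is far beyond what any visual correction can smooth over, which is why the conflict has to be resolved at the sensor level rather than patched after the fact.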
At least, that is, until now.
To solve this problem, we collaborated with BMW to feed IMU data from the car's sensor array into the tracking system of our Project Aria research glasses in real time. This additional information lets the system calculate the glasses' location relative to the car. After transferring the tracking system to Meta Quest Pro, we were able to accurately anchor virtual objects to the moving car using a digital twin of the car, and we've demoed some compelling virtual and mixed reality passenger experiences in moving cars with this new tracking system. The next step is to add the car's location relative to the world, which would enable world-locked rendering.
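The post doesn't publish the fusion algorithm, but the core idea of using the car's IMU can be sketched: express both IMU readings in a common frame and subtract the car's motion, so the residual signal describes the headset's motion relative to the cabin, the same frame the cameras observe. The function name and the simple subtraction below are illustrative assumptions; a real tracker would also handle rotating-frame (Coriolis and centrifugal) effects, sensor biases, and clock alignment between the two IMUs:

```python
import numpy as np

def headset_motion_relative_to_car(headset_accel, headset_gyro,
                                   car_accel, car_gyro,
                                   R_car_to_headset):
    """Illustrative sketch (not the actual Oculus Insight / Project Aria
    code): remove the car's inertial motion from the headset IMU so the
    residual matches what the cabin-facing cameras observe.

    Inputs are 3-vectors; R_car_to_headset rotates car-frame vectors into
    the headset frame. Real systems must also model rotating-frame effects,
    biases, and time offsets between the two IMUs.
    """
    rel_accel = headset_accel - R_car_to_headset @ car_accel
    rel_gyro = headset_gyro - R_car_to_headset @ car_gyro
    return rel_accel, rel_gyro

# Toy check: if the headset and car report identical motion (same frame),
# the headset is stationary relative to the cabin.
accel = np.array([0.0, 2.0, 9.81])  # includes gravity
gyro = np.array([0.0, 0.0, 0.3])    # car yawing through a turn
rel_a, rel_g = headset_motion_relative_to_car(accel, gyro, accel, gyro,
                                              np.eye(3))
print(rel_a, rel_g)  # both ~[0, 0, 0]
```

Once the car's motion is removed, the compensated IMU signal and the cabin cameras finally agree on a single reference frame, which is what makes car-locked rendering stable.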
“Our research prototype shows that we can enable entertaining and comfortable passenger experiences that are anchored to the car itself, including VR and MR gaming, entertainment, productivity, and even meditation capabilities,” says Richard Newcombe, Vice President of Research Science, Reality Labs Research. “The technology has the potential to transform how we can safely interact with our environment while traveling, and as we progress into reliable world-locked content on the road to AR glasses, we’re hopeful it will be possible for passengers to see things like markers for landmarks, restaurants, places of interest and more.”
Initial research findings
Our work with BMW has demonstrated that, with the additional IMU measurements from the vehicle, we can display stable car-locked VR and MR content to passengers while the car is in motion, even during rapid turns, over road bumps, or under hard acceleration.
“It is too early to tell exactly how or when this technology will make it into customers’ hands, but we envision a number of potential use cases for XR devices in vehicles — from assisting the driver in locating their car in a crowded parking lot to alerting them to hazards on the road and surfacing important information about the vehicle’s condition,” notes Claus Dorrer, Head of BMW Group Technology Office USA in Mountain View. “The implications of future AR glasses and VR devices — for passengers as well as drivers — are promising. The research partnership with Meta will allow us to discover what immersive, in-vehicle XR experiences could look like in the future and spearhead the seamless integration of such devices into cars.”
The road ahead
Long term, we hope to continue working with BMW to further leverage the ever-growing machine perception capabilities of modern cars to enable future use cases. Access to the car’s precise 6DOF positioning system could enable rendering world-locked virtual content outside of the vehicle, like identifying landmarks and other points of interest. We expect this capability to be invaluable for future AR glasses and personalized AI assistants.
If we get it right, this technology could revolutionize travel in cars, trains, planes, and beyond, unlocking new forms of hands-free communication, entertainment, and utility — giving us far more value than the screens and instruments we’re used to seeing in vehicles today.
It’s an exciting vision, to be sure. And it’s a road we’re committed to traveling.
The metaverse won’t be built by any one company alone. It will take companies big and small working alongside developers, creators, academics, and policymakers. Our partnership with BMW is an example of this in Europe. Academic and industrial research institutions interested in participating in Project Aria may submit their proposals here.