Meta Connect 2022: Meta Quest Pro and the road to the metaverse
October 11, 2022
Last year at Meta Connect, Mark Zuckerberg shared our vision for the metaverse — a set of interconnected digital spaces that will let you do things you can’t do in the physical world and connect more deeply with the people who matter most. Today, he took the virtual stage alongside leaders from Reality Labs at Meta and some special guests to showcase the progress we’ve made toward that vision over the past year.
Today’s headliner was Meta Quest Pro — the first entry in our new high-end line of advanced headsets, built to expand what’s possible in VR. We’ve shared some details over the past year under the codename Project Cambria, and we’re excited to unveil the full package. It’s packed with innovative features like high-resolution sensors for robust mixed reality experiences, crisp LCD displays for sharp visuals, and eye tracking and Natural Facial Expressions to help your avatar reflect you more naturally in VR.
Among the headset’s standout specs:
- Two LCD displays that use local dimming and quantum dot technology to provide richer and more vivid colors
- Custom local dimming tech, powered by specialized backlight hardware and accompanying software algorithms, that can independently control more than 500 individual LED blocks, giving the displays 75% more contrast than Meta Quest 2
- 37% more pixels per inch than Meta Quest 2
- Re-engineered Touch Pro controllers with sensors of their own for more accurate tracking and a full 360-degree range of motion, our new TruTouch Haptics system for a wider and more precise range of feedback, and rechargeable batteries
The VR developer ecosystem is booming. To date, over $1.5 billion has been spent on games and apps in the Meta Quest Store. We now have 33 titles that have made over $10 million in gross revenue, and the number of apps that have made over $5 million in gross revenue has doubled since last year, now at 55. And that success isn’t limited to the top developers. Of the 400+ apps on the Meta Quest Store, roughly one-third are making revenue in the millions.
The Walking Dead: Saints & Sinners has surpassed $50 million in revenue on the Meta Quest Platform alone—nearly double its revenue on all other platforms. It took just 24 hours for Zenith: The Last City to make its first $1 million in revenue on the Meta Quest Store, while Resident Evil 4 made its first $2 million in its first 24 hours. Blade & Sorcery: Nomad cleared its first $1 million in revenue in two days, while a number of other devs have hit the same milestone in as little as three days.
Of course, part of building a healthy and sustainable VR ecosystem is ensuring that developers have multiple ways to reach their audiences. That’s why we introduced App Lab, which since its launch last year has grown to over 2,000 apps. And some great titles like Smash Drums, Ancient Dungeon, and Puzzling Places got their start on App Lab.
We’ve also seen a shift in people’s behavior: When Meta Quest 2 launched, the majority of people’s time in VR was spent alone. Today, the majority of their time is spent in multiplayer and social apps — hanging out, playing games, learning new hobbies, and more.
Meta Horizon Worlds
We believe that the metaverse will bridge VR headsets, phones, laptops and desktops, and even devices that don’t exist yet. At Connect today, we showed off work we’re doing to bring Meta Horizon Worlds to other platforms, so you can eventually pick up your phone or laptop and visit friends who are hanging out in VR, and vice versa. If you’re watching a comedy show, for example, you’ll be able to send the link to your friends so they can join without a headset, no matter where they are. Making these virtual worlds accessible from any device takes their ability to connect people to another level. Worlds on the web will be the first way a lot of people around the world experience a virtual world. And while the web doesn’t offer the full VR experience, it opens Worlds to an entirely new population of developers, creators, and people.
We’re testing a way for you to take a video in Worlds and easily share it straight to Instagram as a Reel, helping creators connect with their audiences and share their work in a new way. We’re also working with some of the best creators on Instagram to see what kind of worlds they can build, so keep your eyes on your timeline for that.
We’re also making it easier than ever to build for Worlds. You used to have to build everything in VR. Now, we’re seeing incredible worlds being built by people outside VR in Crayta, a game and world building platform available on Facebook. And we’re continuing to expand the creative toolkit beyond VR. You’ll be able to use TypeScript, a powerful scripting language, to make more dynamic and interactive worlds. And you’ll be able to import tri-mesh items into Worlds and build parts of your worlds using 3D content creation tools like Maya, 3ds Max, Blender, and Adobe Substance 3D.
These are high-quality tools that professionals use to build in 3D. One of the main benefits of supporting them is the graphics quality they produce. It will take a while to integrate all of these, but the result will be things looking a whole lot better. And we’re working with Epic Games to bring Creative Commons-licensed content from the Sketchfab library to Meta Horizon Worlds.
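Meta hasn’t published the Worlds scripting API in detail yet, so the snippet below is only a hypothetical sketch of the kind of event-driven logic TypeScript scripting makes possible — an interactive door that reacts to nearby players. Every type and method name here is invented for illustration and does not come from the real Worlds toolkit.

```typescript
// Hypothetical sketch of event-driven world scripting in TypeScript.
// All names are stand-ins, not the real Meta Horizon Worlds API.

type WorldEvent = "playerEnter" | "playerLeave" | "grab";

class InteractiveDoor {
  private open = false;
  private nearbyPlayers = 0;

  // Dispatch a world event to this object and return its new open/closed state.
  handle(event: WorldEvent): boolean {
    switch (event) {
      case "playerEnter":
        this.nearbyPlayers += 1;
        break;
      case "playerLeave":
        this.nearbyPlayers = Math.max(0, this.nearbyPlayers - 1);
        break;
      case "grab":
        // Only toggle the door when at least one player is nearby.
        if (this.nearbyPlayers > 0) this.open = !this.open;
        break;
    }
    return this.open;
  }
}

const door = new InteractiveDoor();
door.handle("playerEnter");
console.log(door.handle("grab")); // true: the door opens
```

The point of the sketch is simply that scripted objects hold state and respond to runtime events, which is what separates a dynamic, interactive world from a static 3D scene.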
We talked about your personal space in Worlds — think of it as your home in the metaverse. It should be a social space where you can invite your friends over, no matter what devices they’re using. Most importantly, you should feel a sense of ownership of it and feel the same level of comfort as you do in your home in the physical world.
We also announced that next year we’ll begin a multi-year collaboration with NBCUniversal, bringing iconic comedy and horror experiences into the metaverse. The Office, Blumhouse, Universal Monsters, and Halloween Horror Nights will all come to Worlds, and you’ll be able to immerse yourselves in these IPs like never before via VR. You’ll also be able to catch up on your favorite NBCUniversal programming when the Peacock app comes to Meta Quest next year.
We announced new social innovations we’re working on with the YouTube VR team. YouTube is about to feel a lot more like a shared experience on Meta Quest. If you’re hanging out with friends in Meta Horizon Home, you’ll soon be able to bring up YouTube and watch videos together, just as if you were watching together in-person.
You’ll also be able to multitask, keeping a YouTube video up while you work or browse the web in Home.
And we’re working with the YouTube team to make the experience even more flexible in the future. If you’re using Meta Quest Pro, you’ll soon be able to take your YouTube videos (via a 2D panel) into other VR apps. Imagine watching a video about the construction of Mont-Saint-Michel while simultaneously assembling it in Puzzling Places, or taking boxing lessons while playing The Thrill of the Fight — or simply watching music videos while waiting your turn in Walkabout Mini Golf.
It feels groundbreaking today, but we expect this sort of multitasking will become common in the future, and we are thrilled to have YouTube VR as an early partner along the way.
VR isn’t just a great place to try out new types of games and immersive experiences — it’s fantastic for fitness, too. Whether you’re following along with a trainer or just working up a sweat without even realizing it, there’s a wide variety of apps available for Meta Quest that’ll get you moving. And at Connect, we announced several new products and updates coming this year that can help you make real progress in your VR fitness journey.
The Meta Quest 2 Active Pack is our first fitness accessory bundle, and we’re excited to reveal that it’ll be out on October 25 for $69.99 USD — and you can pre-order it from the Meta Store starting today! The bundle includes a new wipeable facial interface for your headset, an extra set of wrist straps, and new adjustable knuckle straps so you can keep throwing uppercuts without any fear of dropping your Touch controllers. When you use them together, these accessories can help you power through those sweaty workouts.
We also introduced the Made for Meta program, formerly Oculus Ready, through which we’re working with leading hardware manufacturers to bring more accessories to Meta Quest, starting next year.
If you’re a fan of Within’s Supernatural, you’re already familiar with the hit subscription-based fitness app’s Flow and Boxing workouts. Now the app is adding a new type of exercise: knee strikes, an explosive way to activate your core and lower body and develop balance and coordination. You’ll be able to find them alongside the other full-body movements in select Flow and Boxing workouts later this month.
Also coming to the Meta Quest Platform is Gym Class - Basketball VR, a physics-based basketball game that you can play with up to six people online. You can take your time practicing shooting and dribbling, or leap into the air to perform amazing dunks no matter your height or skill level. Gym Class - Basketball VR was a breakout hit on App Lab with over a million downloads, and we’re excited that IRL Studios will be bringing it to the Meta Quest Store this fall.
Finally, we revealed that we’ll be releasing a beta of our new Fitness API to select developers this fall, making it easier for them to build fitness content for the Meta Quest Platform. Once implemented, the Fitness API will give people the option to share their real-time data from Move (like total calories burned and minutes spent exercising) with developers. This can lead to new personalized experiences, like a customized stats page or new levels that unlock based on your physical progress.
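To make the "unlocking new levels based on your physical progress" idea concrete, here is a hypothetical sketch of how a game might turn shared Move stats into progression. The `MoveStats` shape, field names, and threshold rule are all invented for illustration; the real Fitness API may expose entirely different fields.

```typescript
// Hypothetical sketch: turning shared Move-style fitness stats into
// game progression. The payload shape and names are invented.

interface MoveStats {
  caloriesBurned: number; // total calories the player opted to share
  activeMinutes: number;  // total minutes spent exercising
}

// Illustrative progression rule: one level per 100 calories, but also
// require at least 5 active minutes per level so idle time doesn't count.
function unlockedLevel(stats: MoveStats): number {
  const byCalories = Math.floor(stats.caloriesBurned / 100);
  const byMinutes = Math.floor(stats.activeMinutes / 5);
  return Math.min(byCalories, byMinutes);
}

console.log(unlockedLevel({ caloriesBurned: 450, activeMinutes: 30 })); // 4
```

Because the data is opt-in, a real app would also need a fallback path for players who choose not to share their stats.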
And if you’re the kind of person who thrives with some external motivation, next year we’ll also release a new way to share your fitness progress with selected friends, so they can support you on your VR fitness journey.
Evolving the future of work
Of course, VR is more than just fun and games (and gains). It can also help you be more productive and get work done. Meta Quest Pro was designed with productivity in mind and will be a major upgrade for those who use VR as a tool for work — but hardware is only part of the equation. We’re equally focused on creating software that improves the way you work and collaborate, both on Meta Quest Pro and Meta Quest 2.
To that end, we have multiple major updates coming to Meta Horizon Workrooms in the coming months.
First, a much-requested feature: You’ll soon be able to join Workrooms via Zoom. We’re also adding breakout groups, letting you seamlessly transition from large-group presentations into smaller and more intimate discussion groups, while still remaining together in the same room. It’s great for facilitating conversation and brainstorming. And for designers, architects, and others who work in three dimensions, we’re working on a way to review 3D models within Workrooms.
We’re also lowering barriers between VR and more traditional platforms. We call this project Magic Room, and it’s the future of meetings. The goal is to make collaborators feel equally present in a shared space, no matter where they are or what tech they’re using. We’re working on an update for Workrooms that will let people join a virtual office from whatever device they happen to have handy, and we want the experience to be great whether someone is using Meta Quest Pro or joining via their phone (though Magic Room will also work great with Meta’s other main work products, Meta Portal and Workplace).
And if you start using Meta Quest Pro later this month, you’ll be able to unlock the full potential of Workrooms with your new hardware. The included stylus tips for the new controllers give you greater precision and control for whiteboarding, while Natural Facial Expressions will let you make eye contact during a presentation — or smile as you wrap up on Friday and head into the weekend!
We’ve also made it clear that we can’t build the metaverse alone, and nowhere is that more obvious than people’s work day. Every company works differently — and uses different tools.
Microsoft CEO Satya Nadella joined us during Connect to announce that a new version of Microsoft Teams immersive meeting experiences is coming to Meta Quest. Teams connects hundreds of millions of people around the world and is an essential part of how they meet, call, chat, and do business. Bringing Teams to Meta Quest will enable them to work together in ways that simply aren’t possible on a 2D screen. We’re also exploring the ability to support Meta avatars and Microsoft avatars so you can collaborate in Teams immersive experiences.
There’s more to this announcement: Soon you’ll be able to join Teams immersive meeting experiences from inside Workrooms. We’re also bringing Microsoft Windows 365 to Meta Quest, so you can stream the Windows experience on Meta Quest Pro and Meta Quest 2 devices. Additionally, you’ll be able to access the Microsoft 365 suite, so you can interact with SharePoint and other Microsoft productivity apps whenever and wherever you’d like.
Lastly, both Meta Quest and Meta Portal will support Microsoft Intune and Azure Active Directory. As companies begin integrating Meta Quest 2 and Meta Quest Pro into their day-to-day work, it’s crucial that they’re able to manage and secure their headsets the same way they would laptops and phones. Microsoft Intune and Azure Active Directory will provide companies the security and management options they both need and expect.
At last year’s Connect, we introduced Presence Platform, our suite of machine perception and AI capabilities — including Passthrough, Spatial Anchors, and Scene Understanding — that let developers build more realistic mixed reality, interaction, and voice experiences that seamlessly blend virtual content with the physical world. Today, we shared the newest addition to Presence Platform: Movement SDK.
Movement SDK lets avatars mimic expressions in real time using Meta Quest Pro’s inward-facing sensors. While today’s avatars rely on spatial audio and immersive graphics to help create a sense of social presence, eye contact and facial expressions will be game changers for remote meetings and virtual performances in the metaverse.
With the introduction of Movement SDK, we’re able to support social presence for third-party character embodiment. Working with an internal team of artists, we created an alien character we call Aura. She can wink, puff her cheeks, move her mouth from side to side, and more. Aura’s facial movements, skin tones, and eye movements all help make her socially engaging. In addition to facial movements, the blend shapes can trigger other actions. For instance, the markings on her hair light up when she’s happy — an important social cue on her planet — and she’ll also flare her hair when she’s angry.
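The "blend shapes can trigger other actions" idea can be sketched as a simple mapping from expression weights to secondary effects, in the spirit of Aura’s light-up hair. The blend shape names, the `HairState` type, and the threshold are all hypothetical — they are not the Movement SDK’s actual output names.

```typescript
// Hypothetical sketch: driving secondary character effects from blend
// shape weights. Names and thresholds are invented for illustration.

type BlendShapes = { smile: number; browLower: number }; // weights in [0, 1]

interface HairState {
  glow: boolean;   // hair markings light up when the character is happy
  flared: boolean; // hair flares when the character is angry
}

// Fire an effect whenever the corresponding expression weight crosses
// a threshold, so strong expressions trigger visible reactions.
function hairFromExpression(shapes: BlendShapes, threshold = 0.6): HairState {
  return {
    glow: shapes.smile >= threshold,
    flared: shapes.browLower >= threshold,
  };
}

console.log(hairFromExpression({ smile: 0.8, browLower: 0.1 }));
// { glow: true, flared: false }
```

A real character rig would smooth the weights over time before triggering effects, so a momentary tracking glitch doesn’t make the hair flicker.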
Hundreds of developers are already building VR apps that take advantage of Presence Platform — unlocking a new generation of more powerful and immersive experiences on Meta Quest. And even better, anything developers create for Meta Quest 2 using Presence Platform will also run seamlessly on Meta Quest Pro. Check out the Meta Quest Developer Blog to learn more.
Avatars and identity in the metaverse
Your avatar is you, or at least your digital representation. And that means our avatars need to provide you with near-infinite ways to express yourself in the metaverse and across Meta’s apps. Earlier this year we launched avatar options including cochlear implants, over-the-ear hearing aids, and wheelchairs, and we have more improvements to representation coming, including additional body types and shaders for more realistic skin.
What you wear is important too. Today we announced that the Avatar Store is launching in VR later this year. You’ll be able to shop for virtual clothing from some of your favorite brands. We’re working with partners across sports, entertainment, and more to ensure that you can find clothes that fit your personal style. And we’re hoping that this will kickstart a marketplace for interoperable digital goods — meaning if you buy a sweater, you can wear it on your avatar no matter what app you’re using.
What else? Legs, of course. “I think everyone has been waiting for this,” Zuckerberg joked during today’s keynote. No more floating from the waist up!
It may sound like we’re just flipping a switch behind the scenes, but this took a lot of work to make happen. When your digital body renders incorrectly — in the wrong spot, for instance — it can be distracting or even disturbing, and take you out of the experience immediately. And legs are hard! If your legs are under a desk or even just behind your arms, then the headset can’t see them properly and needs to rely on prediction.
We spent a long time making sure Meta Quest 2 could accurately — and reliably — bring your legs into VR. Legs will roll out to Worlds first, so we can see how it goes. Then we’ll begin bringing legs into more experiences over time as our technology improves.
Next year we’ll also enable developers to start implementing custom Avatar actions and behaviors into their games and apps.
And remember, avatars are still evolving — to be better-looking, more capable and expressive, more customizable. Our next generation of Meta Avatars, previewed today at Connect, will be more expressive and detailed than what’s available today.
Each step brings us a bit closer to photorealistic avatars. Those are still a few years off, but we’re steadily improving both the technology and our understanding of how people want to show up in VR. We’re working on AI that will design an accurate avatar for you, so you don’t need to spend time tinkering with the pieces yourself (unless you want to). In the future, you might also have multiple avatars for different occasions — a serious photorealistic representation of yourself for work meetings, and a more cartoonish version for hanging out. You could even show up to the group hang as a movie character, or a dragon. Who’s to say?
The road to augmented reality
While we’ve made a lot of progress on virtual and mixed reality, two important windows on the metaverse, a lot of work remains to be done in augmented reality, where you see digital objects overlaid perfectly on the world around you. There are still a few years to go before the fundamental technology is advanced enough to support great AR glasses. We’ll need to see progress across the stack — compute, graphics, displays, sensors, AI, basically everything — before the dream of AR glasses is fully realized.
We’re investing in those areas (more on that below), but for glasses, the form factor is critical — all this tech has to be built into something lightweight and comfortable enough to wear casually. So we’re focused on building as much of the AR experience as we can fit into a normal pair of glasses that can blend in with your daily life.
We took our first step forward last year when we partnered with EssilorLuxottica to introduce Ray-Ban Stories, our first-generation smart glasses. In just a year, we’ve introduced big improvements: doubling video capture length from 30 to 60 seconds and making it easier to upload your content from the glasses to Instagram.
Hands-free functionality is a core part of Ray-Ban Stories, letting you stay connected and stay in the moment without having to fumble around for a phone. So soon you’ll also be able to call or text hands-free on Ray-Ban Stories from your phone number.
We also know people love listening to music on Ray-Ban Stories, so we’ll soon start rolling out Spotify Tap playback. You’ll just tap and hold the side of your glasses to play Spotify, and if you want to hear something different, tap and hold again and Spotify will recommend something new.
And we shared an update on Meta Spark. Starting today, we’re giving creators the ability to build interactive 3D objects using Spark Studio and begin testing them in mixed reality through the Meta Spark Player. We’re also building tools that will let creators better understand how gaze and spatial awareness can contribute to 3D content layered on top of the physical world.
At Reality Labs, we’re inventing a new computing platform — one built around people, connections and the relationships that matter. There’s a lot of ground to cover there, so check out our post on today’s Reality Labs Research news to learn more.
The research and products we showed today are part of a roadmap that extends far into the future — and we can’t build the metaverse alone. We want to bring the best developers, engineers, artists and others together to make this future a reality. Thank you for joining us on this journey. It promises to be a pretty epic ride.