Reality Labs

On the big screen: An immersive mixed-reality centerpiece for Meta’s first store

May 18, 2022

The doors are now open at the Meta Store, our first-ever brick-and-mortar shop, in Burlingame, California. The store stocks the latest Meta products, including Quest 2, Ray-Ban Stories, and various Portal models, but the real magic happens when guests experience them in person for the first time. These experiences are available to anyone who visits the store, thanks to a giant digital wall that displays the virtual gameplay of a guest wearing a Quest 2 headset alongside actual footage from a nearby camera, creating a real-time mixed-reality experience. The demo leverages state-of-the-art segmentation, high-spec broadcasting tools, and a range of custom-built systems that help deliver a seamless experience.

“One of the superpowers of our store is offering guests highly immersive and experiential demos,” says Jason Fruy, Meta Customer Experience Design Lead. “Spectators will feel like they’re in the headset without actually putting it on. We tried our best to make that a spectacular experience.” 

Alex Meland from Between Realities playing Beat Saber in the Meta Store's mixed-reality demo area.

Hannah Dubrow, Interactive Producer for the Meta Store, says that it’s hard to appreciate what people are experiencing in VR when you aren’t there with them. While the store could have relied on real-time point-of-view feeds from the headsets, that can be a bit jarring due to a disconnect between the live player’s actions and gestures and what’s happening onscreen. That can make the experience feel less real and more like watching something on film. 

“To accurately show what it’s like to play a game in VR to others, capturing it in mixed reality is essential,” says Dubrow. That’s because mixed reality displays the player merged into the virtual game environment, giving viewers who aren't in the headset the ability to see real-time interaction as if they were also in VR.

Delivering this unique, seamless experience was a special technological challenge. Meta’s designers and engineers partnered with specialized studios and developers to build custom software solutions, including mixed-reality segmentation and broadcast programs and custom apps, to turn their ambitious vision into reality. “The sheer number of systems that were harnessed to make this demo as seamless and magical as possible is pretty special,” says Dubrow. 

“In this journey, we were constantly problem-solving and navigating tech limitations,” says Fruy. “But our team worked closely with our partners to drive decisions, guide solutions, and maintain the vision for the best possible Quest 2 experience for our guests.” 

State-of-the-art segmentation

The digital wall is a curved TV screen that’s 23 feet wide and 7 feet tall. “It creates this larger-than-life experience, at almost one-to-one scale, in the ‘real’ space,” says Dubrow. 

Green screens, a visual effects technique commonly used in movies and television, were an option for the demo space, but the team knew they wanted something more immersive. “We didn’t want the store to look like a film studio; we wanted the experience to feel like magic,” says Dubrow.

To achieve that magic, they needed technology that could perform real-time background removal, like a green screen, while also doing something more challenging: cleanly separating the person playing the game live from the broadcast of them in the game on the screen behind them. That’s because their mixed-reality demo would composite the outline of the player, positioned in front of the digital wall, into the virtual content.  

That called for state-of-the-art image segmentation, a computer vision technique used to understand what is in a given image at a pixel level. Among other abilities, the system would have to accurately gauge the depth of spaces and execute deep body and gesture analysis of human silhouettes. “We needed to build a program that could tell the difference between two very similar objects in the same camera field — the human and the cast of the human that it is seeing,” says Dubrow. “But we were smart, right off the bat,” she adds. “From the very beginning, we partnered with or brought in people who specialized in all the areas we needed to make this work.”

They joined forces with specialist studio Scout House to develop a custom segmentation program that leverages image recognition and deep learning systems to analyze the depth of the demo area and accurately identify human forms to separate them from other objects. For example, the program can distinguish between the actual human standing on the floor and the image of the human on the screen because the camera is calibrated for two modes of operation. “One is for the empty stage, so it’s looking for the differences such as the addition of the player, and the other is for isolating a certain depth range, excluding objects outside that range,” says Dubrow. The system can also infer the outline of people’s legs and remove the space in between the limbs, rather than capturing the whole silhouette. 
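The depth-range mode Dubrow describes can be illustrated with a minimal sketch: keep only the camera pixels whose measured depth falls inside the stage area, so that the image of the player on the screen behind them (which sits beyond the range) is excluded. This is a simplified stand-in, not Scout House's actual program; the function name, depth bounds, and array shapes are all assumptions.

```python
import numpy as np

def segment_player(depth_map, color_frame, near=0.5, far=3.0):
    """Toy depth-range segmentation, loosely modeled on the demo's
    second calibration mode: isolate a certain depth range and exclude
    everything outside it.

    depth_map:   (H, W) array of per-pixel depths in meters
    color_frame: (H, W, 3) RGB frame from the stage camera
    near, far:   assumed bounds that bracket the physical demo stage
    """
    # Pixels on the digital wall lie beyond `far`, so the on-screen
    # "cast" of the human is rejected even though it looks human.
    mask = (depth_map >= near) & (depth_map <= far)
    cutout = np.zeros_like(color_frame)
    cutout[mask] = color_frame[mask]
    return mask, cutout
```

The real system adds deep-learning body analysis on top of this, which is what lets it carve out the space between a player's legs rather than keeping the whole silhouette.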

“The program also does all of this processing very quickly, compositing the human into the endgame content in just a fraction of a second,” says Dubrow. “That gives us very sharp segmentation with a low lag time,” she adds.

Moving pieces 

The segmentation of the real-time camera feed was just one piece of the broadcasting puzzle. “There’s a lot happening all at the same time during a cast, and so many individual pieces that all have to come together,” says Dubrow. She reels off a list: The headset’s IP address connects to the mixed-reality capture (MRC) server via a purpose-built demo associate (DA) iOS app, the Open Broadcaster Software composites the gameplay and real-time feeds, and a media processor layers the composited feed over an HTML page. Simultaneously, that HTML page is pulling the game title information from the session data and is all ready to animate in sequence behind the Demo Lead-In countdown video that reveals the MRC. And this all happens in just three seconds, says Dubrow. 

Getting all these pieces moving in unison and developing the systems that needed to interface with one another required collaboration with a few highly specialized companies. “If you were to look at kind of a flat map chart of everyone involved, there were at least five or six other groups we worked with beyond our own team,” says Fruy. “This was a first-of-its-kind demo experience, so it required several pieces of custom software, apps, and integrations.” In addition to Scout House, the team partnered with ForwardXP to develop the DA app that controls the headset as well as the headset app launcher, and with Systems Innovation to develop the demo’s broadcasting tools.

These systems also had to integrate with both our own custom software and third-party systems to enable players to schedule an appointment, set their preferences, and receive a clip of their demo afterward. For example, a feature called Meta Store Connect allows guests to book their demo remotely. They can also link their Facebook account to the demo if they want to share their demo clip for their friends or family to see.

The Meta Store Connect app allows guests to book their demo remotely and gives them the option to receive a takeaway clip.

That appointment booking system interfaces with a tablet onboarding app developed by Deeplocal, which allows guests to select from a list of games to play, including Beat Saber, GOLF+, Real VR Fishing, and Supernatural. The system also enables guests to set their preferences, including the ability to select a private experience that’s not broadcast to the store or to book an appointment with only their first name rather than their full login. Meanwhile, the DA app periodically checks in with the OA server for updated appointments, names, game selections, and preferences. 
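The check-in behavior described above amounts to a polling loop: the DA app periodically fetches the latest appointments, names, game picks, and preferences, and acts only when something has changed. A minimal sketch, assuming a generic fetch callable in place of the real server API (whose interface isn't public):

```python
import time

def poll_appointments(fetch, apply_update, interval=5.0, rounds=3):
    """Hypothetical DA-app polling loop.

    fetch:        callable returning the current appointment data,
                  e.g. an HTTP GET against the appointment server
    apply_update: callable invoked only when the data has changed
    interval:     seconds between checks (assumed value)
    rounds:       number of polls (the real app would loop indefinitely)
    """
    seen = None
    for _ in range(rounds):
        update = fetch()
        if update != seen:          # skip redundant updates
            apply_update(update)
            seen = update
        time.sleep(interval)
```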

After their demo, guests can request a 30-second mixed-reality clip of their experience, which consists of themselves augmented in their gameplay. This digital takeaway can be sent via Messenger from Meta Store Connect or as a downloadable link sent via SMS. Store Connect can also continue the guest’s dialog with Meta Store and hand them off for more Quest, Portal, and Ray-Ban Stories product learning and experiences.

An example of the digital takeaway visitors can receive.

“Many of these systems were innovative because they needed to creatively integrate with so many signals and calls,” says Dubrow. She cites the example of the DA app, which scans a near-field communication chip hidden inside the headset’s strap clip that contains the last two digits of the fixed IP address assigned to the device.
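Since the NFC chip stores only the last two digits of the headset's fixed IP, the DA app presumably joins them with a known subnet prefix. A tiny illustrative sketch, where the subnet prefix is entirely made up:

```python
def headset_ip_from_nfc(tag_digits, subnet="10.0.12."):
    """Rebuild a headset's fixed IP from the two digits read off the
    NFC chip in the strap clip. `subnet` is an invented example prefix;
    the store's actual addressing scheme isn't described in the article."""
    if not (tag_digits.isdigit() and len(tag_digits) == 2):
        raise ValueError("expected exactly two digits from the NFC tag")
    return subnet + str(int(tag_digits))   # "07" -> "...7", "42" -> "...42"
```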

The granular attention to detail on the demo experience also applied to sound, explains Dubrow, who has a background in theater design and production. The team decided to compose much of the store’s background music to ensure a seamless sensory experience. But they also wanted to avoid sound from the store’s various exhibits bleeding into their demo arena and detracting from the experience. “We worked with sound designers to find the right hardware and directional speakers, and to help us be extra thoughtful about the levels,” says Dubrow. “That allows sound and music to actually complement each other as you move from one zone to the next.”

Nothing was left to chance. The team spent a fortnight before the opening of the store evaluating and user-testing replicas of their new systems and ironing out glitches. While testing went smoothly, the team encountered some eleventh-hour problems shortly before the Meta Store opened its doors, with around one in five demos unexpectedly crashing. It required around 200 hours of troubleshooting and updating that continued right up until the Sunday before the store opening. 

“We had some issues with the MRC plugin, but luckily the Meta VR Camera team, who created the plugin, were in the building across the way,” explains Dubrow. “They were quick to write us a patch and update the Quest OS with no time to spare before the opening.”  

Magical moments

The team’s fastidious attention to detail has paid off. “The moment when the video introduces the cast, and it seamlessly comes up on the screen is very special, almost magical,” says Dubrow. These transportive moments were a goal of the Meta team, who kept a particular group of guests in mind while designing the store: people who might feel reluctant, anxious, or intimidated about using VR. “Giving those people a spectacular experience is the best way of overcoming that hesitancy,” explains Dubrow. 

As we work toward that future and help build the metaverse, the store will need to grow and adapt to continue being the best place to try out our latest products and truly experience what is possible. And the store has been built with this in mind: “We’ve designed the store to be modular, scalable, and extensible, and prepared to grow and change over time,” says Fruy. 

“We know that people can only fully understand Meta’s products once they’ve experienced them,” says Fruy. “We obviously want guests to purchase Meta products and make them part of their lives, but we also want to show them what’s possible with our products today, while giving a glimpse into the future as the metaverse comes to life.”
