Our most-read tech stories of 2021

December 21, 2021

What will our digital worlds look like 10 to 15 years from now? And what future technologies will people need to interact seamlessly with these worlds? These are questions our researchers, engineers, and designers have been focused on answering. It’s complex work that’s part of our vision for the metaverse, which will transform the way we live, work, and play. 

Many of this year’s most popular reads were about the work we’re doing to prepare for what comes next:  

Inside the Lab: Expanding connectivity by sea, land, and air

We’re dramatically expanding people’s access to high-speed internet using new connectivity technology. We lay the foundation with subsea fiber-optic cables, which carry more than a thousand times the bandwidth of other communications technologies, to make fast and reliable internet possible for up to 3 billion people in even the farthest reaches of the world. On land, we’re tackling the challenge of delivering bandwidth closer to people’s homes with Bombyx, a robot that climbs medium-voltage power lines and wraps fiber around them. And in the air, we’re extending the reach of fiber with Terragraph, our technology that beams multi-gigabit wireless connectivity.

How does News Feed predict what you want to see?  

In the seconds it takes to load the News Feed on Facebook, our machine learning (ML) ranking system sifts through trillions of posts and thousands of signals to predict what each person most wants to see. To score all these posts for a variety of factors, ML models run in parallel (and all in real time) on multiple machines, called predictors. Here, we share details about the many layers of our complex ranking system and the challenges of building it.  
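
To make the shape of that pipeline concrete, here is a minimal, hypothetical sketch of multi-model scoring: several per-event predictors score each candidate post in parallel, and their predictions are combined into a single ranking score. The model names, features, and weights below are invented for illustration and are not Meta’s production system.

```python
# Illustrative sketch only: parallel per-event predictors feeding one ranking score.
# All model logic, feature names, and weights here are assumptions for illustration.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    features: dict  # e.g., author affinity, recency, prior engagement


# Hypothetical per-event predictors (stand-ins for trained ML models).
def p_like(post: Post) -> float:
    return 0.8 * post.features.get("affinity", 0.0)


def p_comment(post: Post) -> float:
    return 0.1 * post.features.get("prior_comments", 0.0)


def p_share(post: Post) -> float:
    return 0.05 * post.features.get("link_quality", 0.0)


PREDICTORS = {"like": p_like, "comment": p_comment, "share": p_share}
WEIGHTS = {"like": 1.0, "comment": 2.0, "share": 3.0}  # assumed value weights


def score(post: Post) -> float:
    """Run every predictor in parallel and combine predictions into one score."""
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fn, post) for name, fn in PREDICTORS.items()}
        predictions = {name: f.result() for name, f in futures.items()}
    return sum(WEIGHTS[name] * p for name, p in predictions.items())


def rank(candidates: list[Post]) -> list[Post]:
    """Order candidate posts by descending combined score."""
    return sorted(candidates, key=score, reverse=True)
```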

Connect 2021: Our vision for the metaverse

At Connect 2021, we shared our vision of the metaverse as a world of interconnected digital spaces. We introduced Horizon Home, our early vision for a home base in the metaverse; Messenger capabilities that will eventually let you travel to VR destinations with friends; and advances in gaming and fitness. But our metaverse developments aren’t all fun and games. Soon we’ll begin testing Quest for Business, a new suite of features that lets you collaborate with coworkers and integrate services you already use, like Slack and Dropbox, in Horizon Home. We also announced a growing suite of developer tools, our next-generation VR hardware, and a $150 million educational outreach initiative. 

Ray-Ban and Facebook introduce Ray-Ban Stories, first-generation smart glasses 

We continued to build tools that help people feel connected, anytime, anywhere, with Ray-Ban Stories. The smart glasses let you capture photos and video, listen to music, take phone calls, and easily share your adventures with others while staying in the moment. The components were carefully engineered to fit the iconic Ray-Ban silhouette.

Inside Facebook Reality Labs: The next era of human-computer interaction

Imagine a wearable computer that senses, learns, and acts in concert with you as you go about your day. Outlandish as this may seem today, Reality Labs (RL) has been tackling this multidisciplinary challenge, working toward a 10-year vision of human-computer interaction that is as seamless as it is personalized. Our ultimate goal is to build an interface that accurately adapts to you and meets your needs. And while this system is years off, our recently launched Project Aria is helping move us closer to this goal.

Inside Facebook Reality Labs: Wrist-based interaction for the next computing platform

We started imagining the ideal input device for AR glasses six years ago — one that could be used by anyone in myriad situations throughout the day. It needed to be built with privacy, security, and safety in mind, to give people meaningful ways to personalize and control their AR experience, and to operate through an intuitive, always-available, unobtrusive interface. This system is still many years off, but we’re looking at electromyography (EMG) as a possible solution, using a wristband to record the motor nerve signals that travel through the wrist and translate them into digital commands.
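
As a rough illustration of what translating those signals into digital commands could look like, here is a toy sketch that maps a short window of multi-channel EMG samples to a discrete command using per-channel signal energy. The channel count, sampling rate, threshold, and command mapping are all assumptions; a real decoder would rely on a learned model trained on labeled gestures, not a fixed threshold.

```python
# Illustrative sketch only: decoding a multi-channel EMG window into a discrete
# command with a simple energy threshold. Channel count, sampling rate, and the
# command mapping are assumptions, not details of the actual wristband.
import numpy as np

SAMPLE_RATE_HZ = 1000   # assumed sampling rate
N_CHANNELS = 8          # assumed number of sensors around the wrist
ACTIVATION_THRESHOLD = 0.5  # assumed per-channel energy threshold


def decode_window(emg_window: np.ndarray) -> str:
    """Map a (samples, channels) window of EMG samples to a command.

    Uses per-channel root-mean-square energy; which channel fires hardest
    stands in for which gesture was made.
    """
    rms = np.sqrt(np.mean(emg_window ** 2, axis=0))  # energy per channel
    if rms.max() < ACTIVATION_THRESHOLD:
        return "rest"
    return ["click", "scroll", "back", "select"][int(rms.argmax()) % 4]


# Usage with synthetic data: 100 ms of noise plus a burst on channel 2.
window = 0.1 * np.random.randn(SAMPLE_RATE_HZ // 10, N_CHANNELS)
window[:, 2] += 1.0
print(decode_window(window))  # likely "back" under this toy mapping
```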

Inside Reality Labs Research: Meet the team that’s working to bring touch to the digital world

How do we touch the virtual world? Reality Labs (RL) Research Director Sean Keller and team have been exploring this question for years. “We use our hands to communicate with others, to learn about the world, and to take action within it,” says Keller. “We can take advantage of a lifetime of motor learning if we can bring full hand presence into AR and VR.” RL has spent years inventing a lightweight haptic glove to do this very thing. It requires hundreds of actuators (tiny motors) all over the hand, moving in concert to make the wearer feel like they’re touching a virtual object. Today, each glove is handmade by skilled engineers, but the goal is to one day invent a new manufacturing process. In other words: We’re just getting started.

How one woman survived 9/11 — and shared her story through VR

Genelle Guzman-McMillan was the last survivor pulled from the rubble at Ground Zero. A young immigrant from Trinidad, she came to New York to follow her dreams, landing a job on the 64th floor of the World Trade Center just months before 9/11. Released by TARGO this fall, Surviving 9/11: 27 Hours Under the Rubble is a VR documentary centered on her experience. In this VR account of Guzman-McMillan’s story, you can walk through World Trade Center Plaza and take in the skyline as it looked before the towers fell.  

Ten women tech leaders you should know 

When her father went missing one night in Pune, India, Shilpa Lawande mobilized her social network to search for him, all from her home in Cambridge, Massachusetts. Within 60 hours, he was found — nearly 62 miles from home. For Lawande, Meta’s Engineering Director of Core Data, this story is just one example of the power of social networks to bring the world closer. Lawande is one of 10 women leaders helping shape the future of the industry. We sat down with them to discuss their backgrounds, what inspires them, and why they love working here. 

New research shows the potential of brain-computer interfaces for restoring speech communication

Earlier this year, as part of our collaboration with UCSF’s Chang Lab, researchers at UCSF published results that mark an important milestone for the field of neuroscience: for the first time, someone with severe speech loss was able to type out what he wanted to say in real time, simply by attempting to speak. It was the first time in over 16 years (since experiencing near-full paralysis of his vocal tract) that he’d been able to communicate without a cumbersome head-mounted apparatus for typing out his words. This research not only offers a way to give someone back the ability to communicate — using tech that’s able to decode brain signals — but also carries important implications for the future of assistive technology.

As all these stories show, our leaders, researchers, engineers, and designers are creating groundbreaking new ways for us to connect. We’re deep in the trenches, exploring how future technology can enhance our lives and help us all connect more deeply. 
