
Inside Reality Labs Research: Meet the team that’s working to bring touch to the digital world

November 16, 2021

In March, we began a three-part series exploring the future of human-computer interaction (HCI). First, we laid out our 10-year vision of a contextually aware, AI-powered interface for augmented reality (AR) glasses that can use the information you choose to share to offer proactive assistance, allowing us to look up and stay present with those around us. Next, we dove into some nearer-term research: wrist-based input combined with usable but limited AI, which dynamically adapts to you and your environment. Today, we conclude the series with a look at our haptic glove research and the advances in soft robotics, microfluidics, hand tracking, haptic rendering, and perceptual science that this work entails.

This demo reel shows a series of virtual interactions with Reality Labs’ haptic glove research prototype, including the manipulation of virtual objects, such as throwing and catching a virtual ball; multiplayer interactions, such as a thumb war and a handshake; and multiplayer games, such as virtual Jenga.

There is a team deep in Reality Labs (RL) Research tasked with inventing the future of interaction in augmented and virtual reality. They aren’t just looking a couple of years down the road. They’re casting a vision — based on their expertise in highly technical fields — for what our digital worlds will look like in 10 to 15 years. Their job is to then create the future technology people will need for frictionless interaction with those worlds.

The work these researchers, engineers, and designers are doing is long-term research and development. In fact, it’s so novel that they are — in some cases — inventing entirely new domains of scientific research. The resulting technologies have the potential not only to fundamentally alter the course of augmented and virtual reality but also to influence fields as diverse as medicine and space travel.

This is the epicenter of the next era of human-computer interaction — a bold research project designed to tackle one of the central challenges of the metaverse: How do we touch the virtual world?

Touching the digital world with haptic gloves

Imagine working on a virtual 3D puzzle with a friend’s ultra-realistic 3D avatar. As you pick up a virtual puzzle piece from the table, your fingers automatically stop moving as you feel it within your grasp. You feel the sharpness of the cardboard’s edges and the smoothness of its surface as you hold it up for closer inspection, followed by a satisfying snap as you fit it into place.

Now imagine sitting down to work at a café and having a virtual screen and keyboard appear in front of you. The virtual keyboard conforms to the size of your hands and the space you have available, and it’s easily personalized to suit your preferences. You can feel the click of each keystroke, as well as the edges of the virtual keys on your fingertips, making it as easy as typing on a perfectly sized physical keyboard.

How would these experiences enhance your connection to the virtual world? What would they do for your ability to be productive or perform any action in the metaverse?

The closest experience we have to this today is hand tracking on Quest, which lets you see a digital version of your hands in VR and manipulate virtual objects, but without actually feeling them in your hands. While this ability to use your hands directly in VR is a vast improvement over Touch controllers, without haptic feedback, we simply can’t be as dexterous in the virtual world as in the real world. The goal of this research is to change that.

Sean Keller leads AR/VR interaction and input research at Reality Labs Research. His multidisciplinary team is focused on building interfaces for the next era of computing, driving research across the domains of haptics, EMG input, soft robotics, design, perceptual science, applied machine learning, and more.

“The value of hands to solving the interaction problem in AR and VR is immense,” explains RL Research Director Sean Keller, who started the team and grew it from one person to hundreds of world-class experts in the span of seven years. “We use our hands to communicate with others, to learn about the world, and to take action within it. We can take advantage of a lifetime of motor learning if we can bring full hand presence into AR and VR. People could touch, feel, and manipulate virtual objects just like real objects — all without having to learn a new way of interacting with the world.”

Keller’s goal is to invent soft, lightweight haptic gloves that address both sides of the AR/VR interaction problem — helping the computer to accurately understand and reflect the wearer’s hand movements, and reproducing a range of complex, nuanced sensations for the wearer such as pressure, texture, and vibration to create the effect of feeling a virtual object with your hands. To succeed, these gloves would need to be stylish, comfortable, affordable, durable, and fully customizable. You’d be able to pair them with your VR headset for an immersive experience like playing in a concert or poker game in the metaverse, and — eventually — they’d work with your AR glasses too. Far more than simply a peripheral device, these gloves would make the virtual world tangible.

But a truly new haptic glove will require groundbreaking developments across many scientific and engineering disciplines. Keller’s team is up to the challenge.

“We’re creating almost everything about this discipline from scratch,” Keller says. “We’re learning how people perceive the sensations of touch and how they complete tasks. We’re figuring out how to fit the whole variety of human hand shapes and sizes, while maintaining mechanical coupling to the user. We’re pushing the boundaries of what’s possible with soft robotics and instrumented tracking systems. And we’re inventing entirely new soft materials and manufacturing technologies — it’s a clean break from the past.”

Faced with this monumental challenge, Keller and his team started by asking what it would take to create believable haptic sensations on the hand.

An early-stage haptic glove research prototype from Reality Labs Research.

To deliver a realistic sense of touch, a haptic glove needs hundreds of actuators (tiny motors) all over the hand, moving in concert in a way that makes the wearer feel like they’re touching a virtual object. But existing mechanical actuators create too much heat for such a glove to be worn comfortably all day. They’re also too big, rigid, expensive, and power-hungry to render realistic haptic sensations.

Two years after getting started, the team had run up against the limits of traditional electrical, metallic components. They hypothesized that they could replace mechanical actuators with soft, pliable ones — made from entirely new materials — that could change shape in response to the wearer’s movements. But these actuators simply didn’t exist yet.

“You categorically can’t set 1,000 little motors and wires on the hand,” explains RL Research Hardware Engineering Director Tristan Trutna. “You can’t do it even if you have infinite resources to accomplish it — physically, it doesn’t fit. There’s simply too much mass and too much heat. If you need thousands of tangible forces in different locations at different distances, you either need pneumatics, hydraulics, or high-density electroactive actuators.”

The team turned to the emerging fields of soft robotics and microfluidics — technologies commonly used in prosthetic limbs and point-of-care diagnostic devices, respectively.

In the last two years, they’ve made significant breakthroughs in both pneumatic actuators, which use air pressure to create force, and electroactive actuators, which change shape or size in the presence of an electrical field.

The team has made progress with miniaturizing pneumatic actuators (pictured), which use air pressure to create force.

To control these new soft actuators, they’re building the world’s first high-speed microfluidic processor — a tiny microfluidic chip on the glove that controls the airflow moving the actuators by telling the valves when and how far to open and close.

“What makes our work different from the broader field of microfluidics in general is that we have this emphasis on making things very lightweight, wearable, and fast,” notes RL Research Scientist Andrew Stanley. “For a haptic interaction, the actuator needs to pressurize against the fingertip very quickly as some event happens in virtual or augmented reality. Most microfluidics processes, like the ones used in chemical analysis, happen on the order of seconds, whereas we’re looking at the order of milliseconds. We can get a faster response time with air. With our fluidic logic circuits, we’re able to eliminate heavy electromechanical components in the system by reducing the number of electromechanical valves that we need to control a large number of actuators.”
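To make that control problem concrete, here is a minimal sketch of the kind of loop such a controller might run, written in Python for illustration. Everything in it is an assumption: the names (PneumaticActuator, control_step), the proportional-control scheme, and the kilohertz rate are invented to illustrate the idea of driving many valves toward target fingertip pressures on a millisecond budget, not a description of Reality Labs’ actual processor.

```python
# Hypothetical sketch of a per-actuator pressure control loop.
# Names, gains, and units are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class PneumaticActuator:
    target_kpa: float = 0.0     # pressure the haptic renderer asked for
    measured_kpa: float = 0.0   # pressure reported by an inline sensor
    valve_opening: float = 0.0  # 0.0 = closed, 1.0 = fully open

def control_step(actuators, gain=0.05):
    """One millisecond-scale tick: nudge each valve toward its target.

    Simple proportional control: the valve opens in proportion to the
    pressure error, so large errors fill quickly and small ones settle.
    """
    for act in actuators:
        error = act.target_kpa - act.measured_kpa
        act.valve_opening = min(1.0, max(0.0, act.valve_opening + gain * error))

# The renderer updates target_kpa as virtual contact events occur.
# Running this loop at roughly 1 kHz reflects the millisecond budget
# Stanley describes, versus the seconds typical of lab microfluidics.
```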

Haptic rendering: Building an accurate picture of the virtual environment

But even with a way to control air flow, the system would need to know when and where to deliver the right sensations. That called for advanced hand tracking technology that could enable the computer to know precisely where your hand is in a virtual scene, whether you’re in contact with a virtual object, and how your hand is interacting with the object.

It also called for a new type of rendering software that could send the right instructions to the actuators on the hand at precisely the right time, based on the hand’s location and an understanding of the virtual environment, including the texture, weight, and stiffness of the virtual objects in it.

Justin Clark and Forrest Smith are software engineers building haptic rendering tools at Reality Labs Research for next-generation AR/VR interfaces.

“People generally think of ‘rendering’ as visuals,” says RL Research Software Engineer Forrest Smith. “We also use the word ‘render’ for haptics. What we’re doing here is taking the state of this virtual world and your interactions with it and rendering it to the actuators so that you feel the corresponding sensation.”

“To render real-time interactions with objects, we need to simulate the corresponding physics,” notes RL Research Engineer Justin Clark. A physics engine (the software used to simulate object interactions in video games) determines the direction, magnitude, and location of the forces your hand should experience while interacting with virtual objects. Haptic rendering algorithms then combine this information with the characteristics of the haptic device (such as the locations and properties of its individual actuators) in order to send the right instructions to the device.
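As a rough illustration of that pipeline, the Python sketch below routes simulated contact forces to a glove’s actuators. All names and numbers here are hypothetical; the point is only the structure Clark describes: the physics engine supplies the forces, and the renderer maps them onto whatever actuators the device actually has.

```python
# Toy haptic-rendering step: route physics-engine contact forces to
# the nearest glove actuators. All names and values are hypothetical.

import math

def nearest_actuator(contact_pos, actuator_positions):
    """Index of the actuator closest to a contact point (positions in metres)."""
    return min(
        range(len(actuator_positions)),
        key=lambda i: math.dist(contact_pos, actuator_positions[i]),
    )

def render_haptics(contacts, actuator_positions, max_force_n=5.0):
    """Turn physics-engine contacts, given as (position, force in newtons)
    pairs, into per-actuator drive levels in [0, 1], summing forces that
    land on the same actuator."""
    drive = [0.0] * len(actuator_positions)
    for pos, force_n in contacts:
        i = nearest_actuator(pos, actuator_positions)
        drive[i] = min(1.0, drive[i] + force_n / max_force_n)
    return drive
```

A real renderer would run at haptic rates of hundreds of updates per second or more and model each actuator’s dynamics, but the routing-and-scaling step would look structurally similar.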

“One of the challenges is building software that works with different types of actuators and supports a wide range of haptic experiences,” adds RL Research Software Engineer Andrew Doxon. “Ultimately, we’ll also need to build tools that allow people to create haptic content in the same way they would create visual or audio content.”

Combining auditory, visual, and haptic feedback

As the team continued its work into its fourth year, a third challenge emerged: In order to make the textures and sensations work, they’d have to model the physics of touch in a way that mimicked reality, but without being able to fully recreate the physics of the real world. While haptic gloves can provide valuable feedback, they can’t perfectly stop your fingers from closing as you attempt to grasp a virtual object, or stop your hands from passing through a virtual table as you rest them on the surface, for example.

They turned to perceptual science and multisensory integration — the study of how human senses work together to build our understanding of the world.

UX Research Science Manager Sophie Kim explains how the team is leveraging human perceptual capabilities to create sensations that are compelling enough to be believable. “Our brains are really good at taking a little bit of haptic signal, a little bit of visual signal, a little bit of auditory signal, and fusing it all together to really feel the sensation and be convinced that there’s an object that exists in your hand,” she says.

RL Perception Research Scientist Jess Hartcher-O’Brien describes how manipulating a cube can give us an idea of what that sensory integration might feel like in AR and VR. “If I pick up a cube, I already have assumptions about the type of material it is and how heavy it might be,” she notes. “I grasp it, I verify the material, so I’m combining the visual cues about its material properties and the haptic feedback that’s coming just from that first moment of impact. When I go to manipulate the object, my brain recognizes frictional forces and inertia and can work out how dense or heavy this object is. My visual system is updating based on how it sees my arm move. Proprioception tells me where my arm is in space, how quickly it’s moving, and what my muscles are doing.”
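One standard way perceptual science formalizes this kind of fusion is reliability-weighted averaging, in which each sense contributes an estimate weighted by the inverse of its variance. The sketch below shows that textbook model only; the article doesn’t say which model the team uses.

```python
# Textbook cue-combination model (reliability-weighted averaging).
# Illustrative only; not attributed to the Reality Labs team.

def fuse_cues(estimates, variances):
    """Combine sensory estimates of one property (e.g., object weight).

    Each cue is weighted by its reliability (1/variance), so the most
    trustworthy sense dominates the fused percept.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    return sum(w * x for w, x in zip(weights, estimates)) / total

# Example: vision suggests the cube weighs ~200 g (noisy), haptics
# suggests ~250 g (sharper); the fused percept lands nearer haptics.
fused = fuse_cues([200.0, 250.0], [400.0, 100.0])  # -> 240.0 g
```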

A haptic glove can even convince the wearer’s perceptual system that it’s feeling an object’s weight, by gently pulling on the skin of the wearer’s fingers with the actuators to mimic the tug of gravity on a held object. But it all has to be timed exactly right.
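A minimal sketch of that weight trick, under the deliberately simple assumption that the tangential tug scales linearly with the object’s weight, might look like this (the names, mapping, and clamp are all invented for illustration):

```python
# Hypothetical weight-illusion command: pull the fingertip skin
# tangentially in proportion to the held object's weight.
# The linear mapping and the 1.5 N clamp are assumptions.

G = 9.81  # gravitational acceleration, m/s^2

def skin_stretch_command(object_mass_kg, grasped, max_shear_n=1.5):
    """Tangential shear force for the fingertip actuators while an
    object is held. It must drop to zero the instant the grasp is
    released: as noted above, the illusion depends on exact timing."""
    if not grasped:
        return 0.0
    weight_n = object_mass_kg * G      # weight of the virtual object
    return min(max_shear_n, weight_n)  # clamp to actuator limits
```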

In one experiment from late 2017, shown in the following video, the team used a single vibrotactile device on a fingertip to provide haptic feedback as a series of virtual spheres made from different materials — wood, marble, foam — fell from the sky in VR. Each sphere had unique visual, audio, and haptic feedback cues associated with it, as it fell onto the tip of the subject's virtual finger.

In this 2017 experiment, each sphere carried unique visual, audio, and haptic cues as it fell onto the tip of the subject’s virtual finger, demonstrating the importance of combining audio, visual, and haptic feedback to create sensations that are compelling enough to be believable.

“All of the timing and engineering was just right for this audio-visual-haptic experience,” Keller says. “You could feel that it was a piece of foam or wood or marble. You could experience the sensation of these materials hitting your finger gently as they fell. When I experienced that, it was remarkable.”
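The structure of that experiment maps naturally onto a simple event handler: each material carries its own vibration profile, fired at the moment of impact so the haptic, audio, and visual cues land together. The sketch below is a reconstruction under invented names and values (the actual frequencies and amplitudes weren’t published):

```python
# Illustrative per-material vibrotactile cue, fired on impact.
# Frequencies, amplitudes, and decay times are invented placeholders.

MATERIAL_CUES = {
    "wood":   {"freq_hz": 150, "amplitude": 0.6, "decay_s": 0.08},
    "marble": {"freq_hz": 300, "amplitude": 0.9, "decay_s": 0.04},
    "foam":   {"freq_hz": 60,  "amplitude": 0.3, "decay_s": 0.15},
}

def on_sphere_impact(material, play_vibration, play_audio, show_visual):
    """Trigger all three cues from one collision event so the brain
    fuses them into a single, believable contact."""
    cue = MATERIAL_CUES[material]
    play_vibration(cue["freq_hz"], cue["amplitude"], cue["decay_s"])
    play_audio(material)   # impact sound matched to the material
    show_visual(material)  # e.g., a bounce or deformation to match
```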

Comfort meets customization with smart textiles

As the program matured, the team began to tackle glove comfort and the challenge of integrating the sensors and robotic actuators into the material of the glove itself. It was easy to see that a rigid, heavy, or otherwise uncomfortable glove — or one that easily fell apart — would immediately take the wearer out of any virtual experience. To avoid that, the glove would need to be lightweight, soft, and highly durable.

“We realized we needed to miniaturize novel technologies and to design systems to be multi-functional,” explains RL Research Process Engineer Katherine Healy. “Doing this would allow us to fit more, and do more, in less space, which is essential to enabling a comfortable form factor.”

Kristy Jost, Research Science Manager, and Kate Healy, Research Process Engineer, work on smart textiles and materials science on the haptic glove project at Reality Labs Research.

The materials group began inventing new, inexpensive polymers — flexible materials like plastics and silicone — that are comfortable and stretchable but customized at the molecular level to yield new functionalities (“smart textiles,” as they’re known, are also used in high-performance athletics). This required entirely new manufacturing technologies to turn those new materials into really fine fibers that could then be sewn, knitted, or woven into gloves.

“But conductive yarns alone just don’t have all the functionality we need for interactions in VR, which is why we’re exploring how to build multiple functions — including conductive, capacitive, and sensing functions — into the same fiber or fabric and enable a much slimmer, more wearable form factor,” adds RL Research Science Manager Kristy Jost.

Building a slim, lightweight haptic glove is one challenge. Customizing those gloves to fit billions of people is another.

That’s why the materials group is also exploring manufacturing techniques that could enable each glove to be custom-fitted for maximum haptic precision and comfort. Doing this means developing new methods of designing and building tiny actuators and creating new knitting and embroidery processes to precisely embed them in the gloves.

“Today the gloves are made individually by skilled engineers and technicians who manufacture the subsystems and assemble the gloves largely by hand,” says Healy. “We use semi-automated processes where we can, but manufacturing these gloves at scale will require the invention of new manufacturing processes.”

The materials team at Reality Labs Research is exploring new manufacturing techniques that could one day enable each haptic glove to be custom-fitted for maximum haptic precision and comfort. Achieving this will require developing new methods of designing and building tiny actuators and creating new knitting and embroidery processes to precisely embed them in the gloves.

Inventing the future: RL is just getting started

Some of the technologies needed to deliver believable haptic experiences in virtual and augmented reality don’t exist yet, but RL Research keeps pushing the state of the art forward, creating new breakthroughs to make haptic gloves a reality.

“I believe that haptics will be critical in the coming human-computer interaction revolution of AR/VR and the metaverse,” says RL Research Science Manager Nicholas Colonnese. “In the future, we might be able to render a ‘haptic click’ that closes the sensorimotor loop when you press a virtual button in VR, or provide real-time training guidance for your sport of choice in AR, or share custom ‘haptic emoji handshakes’ when you greet your friends in the metaverse.”

RL’s haptic glove project started as a moonshot, but it grows more feasible with every research breakthrough the team makes across dozens of disciplines.

“When we started the haptic glove project, we asked ourselves whether we could build a mass-producible, affordable consumer device that lets people experience any tangible interface anywhere,” Keller says. “We couldn’t do it — not without inventing new materials, new sensors and actuators, new methods of integration and systems, new rendering algorithms, new physics engines, the list goes on. It just wasn’t possible, but we’ve forged a path that is plausible and could allow us to get there.”

Over the last seven years, Keller and his team have braved the unknown, pioneering new techniques, new technologies, and new disciplines. And they’re just getting started.

“The possibilities for this research are immense,” says Trutna. “While we’re focused on building a haptic glove, the breakthroughs we’re making in fluidic switching and control — not to mention soft robotics — could lead to radical advances for the medical industry in lab-on-chip diagnostics, microfluidic biochemistry, and even wearable and assistive devices.”

“Reality Labs Research is a forge for innovations,” adds Keller. “And it’s one of the perfect places to be an applied researcher because you have a feedback loop — you have a way to assess the impact of the things you’re discovering, learning, and building. Ultimately, you get to see how these things you create actually impact people and people’s lives. It’s very rewarding work, and you get to see that through to the end in a place like Reality Labs.”