Reality Labs

Boz to the Future Episode 7: The Future of Input for the Next Computing Platform with Thomas Reardon

January 4, 2022

Welcome back for the seventh episode of Boz to the Future, a monthly podcast from Reality Labs (RL). In today’s episode, our host, Head of RL, and incoming Meta CTO Andrew “Boz” Bosworth is joined by RL Director of Neuromotor Interfaces Thomas Reardon to talk about the more intuitive inputs we’re building for future computing platforms.

Reardon is a computational neuroscientist, software technologist, and CEO Emeritus of CTRL-labs, which he founded with fellow scientists from Columbia University. Following CTRL-labs’ acquisition by Meta in 2019, Reardon now serves as Director of Neuromotor Interfaces at Reality Labs. Prior to his PhD research at Columbia, Reardon had a storied career in software development. He’s perhaps best known as the creator of the Internet Explorer project at Microsoft. And as a founding member of the W3C, Reardon contributed widely to the early architecture, protocols, and standards of the web.

Together, Reardon and Bosworth take a look at the future of human-computer interaction and the role that wrist-based electromyography (EMG) input may play in future augmented reality (AR) glasses.

For generations, humans have adapted to machines and learned how to use whatever inputs were available. Take the QWERTY keyboard as an example: It had a learning curve, and although it's not really intuitive for mobile devices, it's the paradigm we've come to know and accept. In the future, technologies like EMG at the wrist will help decode the electrical signals produced by our muscles when we make very small movements of our hands or fingers, and eventually when we merely intend to perform a gesture. Those signals can then be used to translate physical actions that we've already decided to perform into digital commands that directly control a device. It's a much faster way to act on the instructions you already send to your device today when you tap to select a song on your phone, click a mouse, or type on a keyboard.
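To make that idea concrete, here is a minimal, purely illustrative sketch of how a generic wrist-EMG decoder could turn raw electrode signals into a discrete command. This is not Reality Labs' pipeline; the sample rate, channel count, window length, features, classifier, and the synthetic "recordings" are all assumptions chosen for readability.

```python
# Illustrative sketch only: a generic surface-EMG gesture-decoding pipeline,
# NOT Reality Labs' actual system. All parameters below are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LogisticRegression

FS = 1000          # sample rate in Hz (assumed)
N_CHANNELS = 8     # number of electrodes around the wrist (assumed)
WINDOW = 200       # 200 ms analysis window at 1 kHz

def bandpass(x, low=20.0, high=450.0, fs=FS):
    """Keep the typical surface-EMG band and reject drift and noise."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=0)

def features(window):
    """Simple per-channel features: RMS amplitude and zero-crossing count."""
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([rms, zc])

def synthetic_trial(gesture, n=WINDOW):
    """Fake EMG: noise whose per-channel amplitude depends on the gesture.
    Stands in for real recordings, which we don't have here."""
    rng = np.random.default_rng()
    gains = 0.5 + np.roll(np.linspace(0.2, 2.0, N_CHANNELS), gesture)
    return rng.normal(0.0, gains, size=(n, N_CHANNELS))

# Build a toy training set for two hypothetical gestures ("pinch" vs. "rest").
X, y = [], []
for gesture in (0, 1):
    for _ in range(100):
        X.append(features(bandpass(synthetic_trial(gesture))))
        y.append(gesture)

clf = LogisticRegression(max_iter=1000).fit(np.array(X), np.array(y))

# At "runtime", each incoming window of signal becomes a digital command.
new_window = bandpass(synthetic_trial(1))
print("decoded gesture:", clf.predict([features(new_window)])[0])
```

Real research systems work with far richer neuromotor signals and continuous outputs, but the basic shape of the problem is similar: filter the signal, extract features over short windows, and map those features to a command.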

Longer term, EMG may let you do things you can't accomplish through movement alone. Imagine being able to type with extra fingers on each hand that are just as skillful as your real fingers, or to perform actions at a distance, summoning and dismissing virtual tools and objects in AR as if by "The Force." That's the kind of technology Reardon's team is building: not just to deliver superhuman abilities, but to help people do more with less. It's exciting work that could transform the way we interact with computers in the future. We shared some of our latest EMG input research at Connect 2021, showing how wrist-based EMG input could one day allow you to send a message in augmented reality with your hand resting comfortably at your side. While it's early days yet, we're committed to innovating responsibly. That's why we're talking about this work early, considering both the people who will use this technology and those who won't, and thinking through how to build it inclusively for people across many different communities.

The pair also spoke about Reardon's nonlinear career path, from a long, impressive tenure at Microsoft, where he worked on internet protocols with Sir Tim Berners-Lee, to pursuing a degree in classics years later. Though it has taken him off the beaten path, Reardon's rich career has profoundly impacted technologies we use every day, and he continues to work on problems that have the potential to greatly improve the way we interact with machines. His advice to those with a curious mind: Don't worry about selecting what you're going to work on; focus instead on finding awesome people to do it with.

You can tune in to Boz to the Future on Apple Podcasts, Spotify, and Facebook — or right here on Tech@Facebook. We’ll see you next month for a new episode.

You can follow Bosworth on Instagram and Twitter @boztank.

