Facebook Reality Labs has outlined its plan for extended reality (XR): comfortable, wearable devices that use the human body and senses as the interface, including a neural input that harnesses motor signals travelling from the spinal cord to the hand to control devices.
The development would combine “contextually-aware AI” for navigating real-world environments through wearable AR glasses with a soft wristband and haptic gloves to communicate with the AI interface in a hybrid XR experience.
In a statement, Facebook commented: “As you enter the cafe, your Assistant asks, ‘Do you want me to put in an order for a 12-ounce Americano?’ Not in the mood for your usual, you again flick your finger to click ‘no.’
“You head to a table, but instead of pulling out a laptop, you pull out a pair of soft, lightweight haptic gloves. When you put them on, a virtual screen and keyboard show up in front of you and you begin to edit a document. Typing is just as intuitive as typing on a physical keyboard and you’re on a roll, but the noise from the cafe makes it hard to concentrate.
“Recognising what you’re doing and detecting that the environment is noisy, the Assistant uses special in-ear monitors (IEMs) and active noise cancellation to soften the background noise. Now it’s easy to focus. A server passing by your table asks if you want a refill. The glasses know to let their voice through, even though the ambient noise is still muted, and proactively enhance their voice using beamforming. The two of you have a normal conversation while they refill your coffee despite the noisy environment — and all of this happens automatically.”
Facebook believes that future users may gesture with a hand, issue voice commands or select items from a menu by looking at them, enabled by hand-tracking cameras, a microphone array and eye-tracking technology, as well as potential neural input options.
Facebook commented: “This approach uses electrical signals that travel from the spinal cord to the hand, in order to control the functions of a device based on signal decoding at the wrist. The signals through the wrist are so clear that EMG can detect finger motion of just a millimeter. That means input can be effortless — as effortless as clicking a virtual, always-available button — and ultimately it may even be possible to sense just the intention to move a finger.”
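The decoding pipeline Facebook describes (reading electromyography, or EMG, signals at the wrist and mapping them to input events) can be illustrated with a deliberately simplified sketch. The function names, the rectify-and-smooth envelope method and the fixed threshold below are illustrative assumptions, not Facebook's actual decoder, which would involve far more sophisticated machine-learning models:

```python
import numpy as np

def emg_envelope(signal, window=50):
    """Rectify a raw EMG trace and smooth it with a moving average.

    This is a classic, simple way to estimate muscle activation level;
    it is an illustrative stand-in for real EMG decoding.
    """
    rectified = np.abs(signal)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def detect_click(envelope, threshold=0.5):
    """Report a 'click' if activation crosses a fixed threshold."""
    return bool(np.max(envelope) > threshold)

# Simulated wrist EMG: low-amplitude resting noise plus a burst of
# activation standing in for a small finger movement.
rng = np.random.default_rng(0)
signal = rng.normal(0, 0.05, 1000)          # resting baseline
signal[400:500] += rng.normal(0, 1.0, 100)  # muscle activation burst

env = emg_envelope(signal)
print(detect_click(env))
```

In this toy version, the burst pushes the smoothed envelope well above the resting baseline, so the threshold fires; the quote's point about detecting millimetre-scale motion, or even intent, reflects how much cleaner the signal is at the wrist than a naive threshold scheme would suggest.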
Photo credit: Facebook