Meta Shows Off Wrist-worn XR Controller Prototype to Ray-Ban Parent EssilorLuxottica

Meta CEO Mark Zuckerberg showed off a prototype of the company’s wrist-worn XR controller this week to EssilorLuxottica, the Italian parent company behind the Ray-Ban Stories camera glasses and a host of other conventional luxury eyewear brands. Although outwardly it appears similar to what the company revealed last year, it’s definitely a vote of confidence by Meta in the tech’s potential as the basis of a future XR input device.

Posting on Instagram, Zuckerberg showcased the prototype to EssilorLuxottica Chairman Leonardo Del Vecchio, who can be seen pinching two fingers together in what is likely a clicking-style selection gesture.

Image courtesy Meta, Mark Zuckerberg

Here’s what Zuckerberg had to say about the trip:

“Great to be back in Milan to discuss plans for new smart glasses with Leonardo Del Vecchio and the EssilorLuxottica team. Here Leonardo is using a prototype of our neural interface EMG wristband that will eventually let you control your glasses and other devices.”

Zuckerberg hasn’t said as much; however, the device looks awfully familiar, albeit a little rougher than what the company revealed in March 2021.

The wrist controller, which exists thanks to the acquisition of CTRL-Labs in 2019, is based around an array of electromyography (EMG) sensors that detect electrical signals which control the muscles in your hands. When first unveiled a year ago, Meta researchers said they were also looking into a number of haptic technologies that would help the user feel input too.
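To make the idea concrete: Meta hasn’t published its actual signal pipeline, but a very rough, hypothetical sketch of how an EMG-based ‘click’ might be detected is to rectify the raw sensor signal, smooth it into an activation envelope, and fire when that envelope crosses a threshold. Every function and threshold below is illustrative, not Meta’s implementation:

```python
import numpy as np

def emg_envelope(raw, alpha=0.1):
    """Rectify raw EMG samples and smooth them with a simple
    exponential moving average to estimate muscle activation."""
    env = np.zeros_like(raw, dtype=float)
    level = 0.0
    for i, sample in enumerate(raw):
        level = alpha * abs(sample) + (1 - alpha) * level
        env[i] = level
    return env

def detect_pinch(raw, threshold=0.5):
    """Return True if the activation envelope ever crosses the
    threshold -- a stand-in for a pinch/'click' gesture."""
    return bool(np.any(emg_envelope(raw) > threshold))

# Synthetic signal: quiet baseline, then a burst of muscle activity.
quiet = np.random.default_rng(0).normal(0, 0.05, 200)
burst = np.concatenate([quiet, np.sin(np.linspace(0, 40, 200)) * 2])
```

A real system would of course contend with multi-channel sensor arrays, electrode noise, and per-user variation, which is where the machine learning work comes in.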

A smaller, more refined version of the neural input tech could prove valuable since the prospective AR glasses user wouldn’t need to use cumbersome controllers on the daily, or rely entirely on optical hand tracking either, which critically doesn’t provide haptic feedback.

Late last year the Italian eyewear company co-released Ray-Ban Stories, the first in a line of smarter glasses that Meta hopes will pave the way toward the future of smart and stylish AR devices. Ray-Ban Stories, which is something of a cross between Bose Frames and the first three generations of Snap’s Spectacles (read: not AR glasses), is one of the few products featured in the newly opened Meta retail stores, which include demo areas for Meta Quest 2 and Portal.

As Zuckerberg said, the companies are still engaged in planning for new smartglasses, so a close hardware partner with the cachet of EssilorLuxottica is certainly worth impressing with visions of the future.

Meanwhile, Meta says it will be releasing four new VR headsets by 2024, which ought to give us plenty of XR devices to look forward to in the years to come.

The post Meta Shows Off Wrist-worn XR Controller Prototype to Ray-Ban Parent EssilorLuxottica appeared first on Road to VR.

Snap Acquires Brain-Computer Interface Startup NextMind

Snap announced it’s acquired neurotech startup NextMind, a Paris-based company known for creating a $400 pint-sized brain-computer interface (BCI).

In a blog post, Snap says NextMind will help drive “long-term augmented reality research efforts within Snap Lab,” the company’s hardware team that’s currently building AR devices.

“Snap Lab’s programs explore possibilities for the future of the Snap Camera, including Spectacles. Spectacles are an evolving, iterative research and development project, and the latest generation is designed to support developers as they explore the technical bounds of augmented reality.”

Snap hasn’t detailed the terms or price of the NextMind acquisition, saying only that the team will continue to operate out of Paris, France. According to The Verge, NextMind will also be discontinuing production of its BCI.

Photo captured by Road to VR

Despite increasingly accurate and reliable hand and eye-tracking hardware, input methods for AR headsets still aren’t really a solved problem. It’s not certain whether NextMind’s tech, which is based on electroencephalography (EEG), was the complete solution either.

NextMind’s BCI is non-invasive and slim enough to integrate into the strap of an XR headset, something that companies like Valve have been interested in for years.

Granted, there’s a scalp, connective tissue, and a skull to read through, which limits the kit’s imaging resolution; that meant NextMind’s device was only capable of basic inputs like simple UI interaction—very far off from the sort of ‘read/write’ capabilities that Elon Musk’s Neuralink is aiming for with its invasive brain implant.

Snap has been collecting more companies to help build out its next pair of AR glasses. In addition to NextMind, Snap acquired AR waveguide startup WaveOptics for over $500 million last May, and LCOS maker Compound Photonics in January.

Snap is getting close too. Its most recent Spectacles (fourth gen) include displays for real-time AR in addition to integrated voice recognition, optical hand tracking, and a side-mounted touchpad for UI selection.

The post Snap Acquires Brain-Computer Interface Startup NextMind appeared first on Road to VR.

Facebook Acquires BCI Startup CTRL-Labs to Develop Neural Input Device

Facebook today announced plans to acquire CTRL-Labs, a brain-computer interface startup developing hardware and software for decoding electrical activity from the brain to be used for computer input. Facebook says that with the company’s expertise, it plans to develop a wrist-worn neural input device.

Facebook’s VP of AR and VR, Andrew “Boz” Bosworth, announced the planned acquisition of CTRL-Labs via his Facebook page today. Bosworth said that the company will join the Facebook Reality Labs team, Facebook’s VR and AR research and development group. The price of the acquisition was not announced; founded in 2015, CTRL-Labs had previously raised $67 million in funding over three rounds of investments, according to Crunchbase.

Bosworth said that Facebook will continue to pursue CTRL-Labs’ effort to build a wrist-mounted neural input device which can detect and decode both muscle activity and electrical activity from the brain, allowing for both motion-based tracking and control as well as ‘intention-based’ input.

“We know there are more natural, intuitive ways to interact with devices and technology. And we want to build them […] we hope to build this kind of technology, at scale, and get it into consumer products faster,” Bosworth wrote. “Technology like this has the potential to open up new creative possibilities and reimagine 19th century inventions in a 21st century world. This is how our interactions in VR and AR can one day look. It can change the way we connect.”

CTRL-Labs was already building a dev kit of such a device called CTRL-kit. The wrist-worn device is described as a “non-invasive neural interface platform that lets developers reimagine the relationship between humans and machines with new, intuitive control schemes.” The kit was “in preview for select developers” but not yet openly available at the time of the acquisition.

In addition to the hardware itself, a major part of the company’s focus has been using machine learning to decode signals detected in the wrist and turn them into useful digital input. The company believes that achieving that goal will allow users to one day have seamless control over electronic devices, which Facebook clearly believes could extend into the AR and VR realms too.
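As a hedged illustration of the decoding problem (CTRL-Labs’ actual models are proprietary and unknown), a classic approach in the EMG literature is to slice the signal into windows, extract simple time-domain features, and classify each window into a gesture. The feature set, the toy nearest-centroid classifier, and the gesture labels below are all assumptions for the sake of the sketch:

```python
import numpy as np

def features(window):
    """Classic time-domain EMG features for one analysis window:
    mean absolute value, zero-crossing count, and waveform length."""
    mav = np.mean(np.abs(window))
    zc = np.sum(np.diff(np.sign(window)) != 0)
    wl = np.sum(np.abs(np.diff(window)))
    return np.array([mav, zc, wl])

class NearestCentroid:
    """Toy gesture decoder: store the mean feature vector per gesture,
    then label new windows by the closest centroid."""
    def fit(self, windows, labels):
        feats = np.array([features(w) for w in windows])
        self.classes_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[[l == c for l in labels]].mean(axis=0)
             for c in self.classes_])
        return self

    def predict(self, window):
        dists = np.linalg.norm(self.centroids_ - features(window), axis=1)
        return self.classes_[int(np.argmin(dists))]

# Synthetic training data: low-amplitude 'rest' vs. high-amplitude 'pinch'.
rng = np.random.default_rng(1)
rest = [rng.normal(0, 0.05, 100) for _ in range(20)]
pinch = [rng.normal(0, 0.8, 100) for _ in range(20)]
clf = NearestCentroid().fit(rest + pinch, ["rest"] * 20 + ["pinch"] * 20)
```

Production systems would replace the centroid classifier with deep models trained across thousands of users, which is presumably where Facebook sees its machine learning expertise paying off.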

If you’re thinking some of CTRL-Labs’ work looks and sounds familiar, it isn’t déjà vu.

The Myo armband input device | Image courtesy Thalmic Labs

Startup Thalmic Labs was working on a similar premise for a wrist-worn input product called Myo. Though the company did take their product to market, they ultimately made a hard pivot away from wrist-worn wearables, discontinuing Myo and rebranding as ‘North’ ahead of launching their ‘Focals’ smartglasses in late 2018. The company was apparently so certain it no longer wanted to pursue the wrist-mounted approach that it sold related patents to CTRL-Labs earlier this year.

As with Myo before it, the big question for the CTRL-Labs Facebook wrist-worn input device is how precise it can potentially be. While Myo allowed for coarse gestures suitable for the likes of ‘play, pause, next track, etc’, its inputs were nowhere near the fidelity needed for active AR or VR input. However, with mostly passive smartglasses likely to be the first step toward active AR devices, even coarse input would pair well with early use-cases.

SEE ALSO
Digital Frontier: Where Brain-computer Interfaces & AR/VR Could One Day Meet

From the outside, it looks like Facebook is betting that its expertise in machine learning will pair well with CTRL-Labs’ vision of one day delivering precise and reliable inputs from a wrist-worn device… one which would ultimately serve as a primary input device for Facebook’s upcoming AR headset.

The post Facebook Acquires BCI Startup CTRL-Labs to Develop Neural Input Device appeared first on Road to VR.

Researchers Electrically Stimulate Muscles in Haptic Designed for Hands-free AR Input

Researchers at The Human Computer Interaction Lab at Hasso-Plattner-Institut in Potsdam, Germany, published a video recently showing a novel solution to the problem of wearable haptics for augmented reality. Using a lightweight, mobile electrical muscle stimulation (EMS) device that provides low-voltage to arm muscles, the idea is to let AR headset-users stay hands-free, but also be able to experience force-feedback when interacting with virtual objects, and feel extra forces when touching physical objects in their environment too.

Using a HoloLens headset, researchers show their proposed solution in action, which is made up of a backpack, a laptop computer running Unity, a battery-powered EMS machine, electrode pads, and visual markers to better track hand gestures. The researchers say their system “adds physical forces while keeping the users’ hands free to interact unencumbered.”

image courtesy Hasso-Plattner-Institut

Both HoloLens and the upcoming Magic Leap One include a physical controller; HoloLens has a simple ‘clicker’ and ML One has a 6DoF controller. While both systems admittedly incorporate gestural recognition, there’s still no established way for AR headset users to ‘feel’ the world around them.

According to the paper, which is being presented at this year’s ACM CHI Conference in Montréal, the EMS-based system actuates the user’s wrists, biceps, triceps, and shoulder muscles with low voltage to simulate a sort of ‘virtual pressure’. This perceived pressure can be triggered when you interact with virtual objects such as buttons, and even physical objects like real-world dials and levers, to create an extra sense of force on the user’s arms.
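The per-user calibration the researchers describe suggests a simple control idea: map the simulated contact force onto a stimulation amplitude bounded by each wearer’s calibrated range. The function below is a hypothetical sketch, not the paper’s actual controller; the parameter names, units, and linear mapping are all assumptions:

```python
def ems_intensity(force_newtons, user_min=4.0, user_max=18.0, force_cap=10.0):
    """Map a simulated contact force to an EMS pulse amplitude (mA).

    user_min / user_max stand in for the per-user calibration step:
    the lowest amplitude the wearer can feel and the highest that is
    comfortable. Forces are clamped so an extreme physics value can
    never push the output past the calibrated ceiling.
    """
    f = max(0.0, min(force_newtons, force_cap))
    return user_min + (user_max - user_min) * (f / force_cap)
```

Clamping at `force_cap` matters for comfort and safety: a physics glitch producing a huge force should saturate the output at `user_max` rather than scale past it.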


There are some trade-offs when using this sort of system though, making it somewhat less practical for long-term use as it’s configured now. Two of the biggest drawbacks: it requires precise electrode placement and per-user calibration before each use, and it can also cause muscle fatigue, which would render it less useful and probably less comfortable.

But maybe a little muscle stimulation can go a long way. The paper discusses using EMS sparingly, playing on the user’s keen sense for plausibility while in a physical (and not virtual) environment.

“In the case of [augmented reality], we observed users remarking how they enjoyed nuanced aspects of the EMS-enabled physics, for instance: “I can feel the couch is harder to move when it is stopped [due to our EMS-based static friction]”. As a recommendation for UX designers working in MR, we suggest aligning the “haptic-physics” with the expected physics as much as possible rather than resorting to exaggerations.”

It’s an interesting step that could prove effective in a multi-pronged approach to adding haptics to AR wearables, whose users would want to stay hands-free when going about their daily lives. Actuator-based gloves and vests have been low-hanging fruit so far, and are quickly becoming a standard go-to for VR haptics, but still seem too much of a stretch for daily AR use. Force-feedback exoskeletons, which physically stop movements, are much bulkier and are even more of a stretch currently.

There’s no telling what the prevailing AR wearable will be in the future, but whatever it is, it’s going to have to be both light and useful—two aspects EMS seems to nail fairly well out of the gate.

The post Researchers Electrically Stimulate Muscles in Haptic Designed for Hands-free AR Input appeared first on Road to VR.