Facebook Reveals Latest Wrist-worn Prototypes for XR Input & Haptics

Facebook today revealed some of its latest research and vision for a wrist-worn input device that the company expects to form the basis of AR and VR interactions and haptics in the future. The device, still in a research prototype phase, senses electrical signals in the user’s arm to detect intentional inputs. In addition to functioning as a simple ‘button’ input, the company says the device can even enable accurate, keyboard-less typing, and more.

In a media briefing this week, researchers from Facebook Reality Labs Research shared some of their latest work in developing new input technology which the team believes will form the foundation of interactions for XR devices of the future. The group shared a concept video of what it believes will be possible with the technology.

Beyond the concept video, the researchers also discussed the work happening to bring it to fruition.

Input on the Wrist

Image courtesy Facebook

The Facebook researchers seem increasingly convinced that a wrist-worn controller is their best bet as an ‘always on’ wearable input device that can enable “ultra low friction” interactions for XR experiences.

Facebook has continued to build atop the wrist-wearable input technology it picked up in an acquisition of CTRL-Labs in 2019.

The heart of the wrist controller is electromyography (EMG) sensors which can detect the electrical signals that control the muscles in your hands. Rather than just coarse movements, the researchers say that EMG can be used to sense individual finger movements with precision down to one millimeter. In a video shared by the company, Facebook says the movements of the hand shown below are detected entirely with EMG:

While the near-term use-case for this kind of technology could be an ‘always available button’ that users can press to confirm choices presented to them by contextually relevant AR systems, the researchers say that further out it could be used to manipulate virtual interfaces and objects, and even to type without a keyboard.
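Facebook hasn’t published how its EMG decoding works, but the general shape of such a pipeline is well established in the literature: window the multi-channel signal, extract simple features, and classify the intended movement. The sketch below illustrates that idea only; the channel count, window size, features, and classifier are assumptions, not details of the CTRL-Labs system.

```python
# Minimal sketch (not Facebook's implementation): classifying finger movements
# from windowed multi-channel EMG. Channel count, features, labels, and the
# classifier choice are all illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

N_CHANNELS = 16          # assumed number of wrist EMG electrodes
WINDOW = 200             # samples per analysis window (e.g. 100 ms at 2 kHz)

def emg_features(window: np.ndarray) -> np.ndarray:
    """Classic per-channel EMG features: mean absolute value, RMS, zero crossings."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, rms, zc])

# Fake training data standing in for labeled recordings (label = which finger moved).
rng = np.random.default_rng(0)
X = np.stack([emg_features(rng.normal(size=(WINDOW, N_CHANNELS))) for _ in range(500)])
y = rng.integers(0, 5, size=500)   # 5 classes: thumb..pinky

clf = RandomForestClassifier(n_estimators=50).fit(X, y)

# At runtime: slide a window over the live signal and predict the intended finger.
live_window = rng.normal(size=(WINDOW, N_CHANNELS))
print("predicted finger:", clf.predict([emg_features(live_window)])[0])
```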

“It’s highly likely that ultimately you’ll be able to type at high speed with EMG on a table or your lap — maybe even at higher speed than is possible with a keyboard today. Initial research is promising,” the company writes. “In fact, since joining FRL in 2019, the CTRL-labs team has made important progress on personalized models, reducing the time it takes to train custom keyboard models that adapt to an individual’s typing speed and technique.”

The researchers shared what is purported to be a live demo of this personalized keyboard model in action, using the wearable prototype to enable reasonably fast typing without a keyboard:

Beyond typing, the researchers say that being able to read finger movements from the wrist could also allow users to manipulate objects. Another purportedly real demo shared shows this in action:

Further in the future, the team suggests, users may be able to train themselves to issue some of these commands without any physical movement at all.

Although the company says its wrist-worn device is a “neural” input device, it draws a distinction between neural input and “mind reading.”

“This is not akin to mind reading. Think of it like this: You take many photos and choose to share only some of them. Similarly, you have many thoughts and you choose to act on only some of them. When that happens, your brain sends signals to your hands and fingers telling them to move in specific ways in order to perform actions like typing and swiping,” the company writes. “This is about decoding those signals at the wrist—the actions you’ve already decided to perform—and translating them into digital commands for your device.”

Always-on Haptics

Image courtesy Facebook

As a device that could comfortably be worn all day, the researchers say, a wrist-wearable is also a great place to deliver haptics.

To that end, the company has been experimenting with different types of haptic technologies for the wrist.

One prototype, called ‘Bellowband’, lines the inside of the device’s wristband with quarter-sized bladders which can lie flat or be inflated to put pressure on the user’s wrist. Different haptic effects can be achieved by using different combinations of the bladders or by pulsing them at different rates.

Another prototype, called ‘Tasbi’ (short for Tactile and Squeeze Bracelet Interface), uses six vibrating actuators around the wrist, along with a sort of tension-based, wrist-squeezing mechanism which can dynamically tighten and put pressure on the user’s wrist.
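Facebook hasn’t described any public interface for Bellowband or Tasbi, but the way such prototypes are typically driven can be sketched as timed intensity patterns across a ring of actuators. The example below is purely illustrative; the frame format, actuator count, and the send_frame stand-in are assumptions rather than anything from FRL.

```python
# Illustrative only: representing a wrist haptic effect as timed intensity frames
# across a ring of actuators (bladders or vibrotactile motors). The frame rate,
# actuator count, and send_frame() transport are assumptions, not a real API.
from dataclasses import dataclass
from typing import List
import time

@dataclass
class HapticFrame:
    intensities: List[float]   # one value in [0, 1] per actuator
    duration_s: float

def play(effect: List[HapticFrame], send_frame=print) -> None:
    """Step through an effect, handing each frame to some hardware transport."""
    for frame in effect:
        send_frame(frame.intensities)
        time.sleep(frame.duration_s)

# A simple "pulse traveling around the wrist" for a 6-actuator band (Tasbi-like count).
travel = [HapticFrame([1.0 if i == k else 0.0 for i in range(6)], 0.05) for k in range(6)]
play(travel)
```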

The researchers say that prototypes like these help the company find out which kinds of haptic feedback technology may be worth pursuing.

Contextual AI

A major part of Facebook’s vision for AR, and of its “ultra low friction” input approach, is AI that can deeply understand the user’s context.

“The underlying AI has some understanding of what you might want to do in the future. Perhaps you head outside for a jog and, based on your past behavior, the system thinks you’re most likely to want to listen to your running playlist. It then presents that option to you on the display: ‘Play running playlist?’ That’s the adaptive interface at work,” writes FRL Research Science Manager Tanya Jonker. “Then you can simply confirm or change that suggestion using a microgesture. The intelligent click gives you the ability to take these highly contextual actions in a very low-friction manner because the interface surfaces something that’s relevant based on your personal history and choices, and it allows you to do that with minimal input gestures.”

That’s largely conceptual for now. While today’s smartphones and smartwatches can leverage clues like time, location, and connected accessories to infer which actions might be relevant to you, the sort of contextual AI suggestions Facebook is envisioning will require both an advance in AI and sensor-laden peripherals that can build a real-time understanding of the user’s immediate environment.
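To make the ‘intelligent click’ idea a bit more concrete, here is a toy sketch of the kind of context-to-suggestion lookup Jonker describes: a few context signals plus a record of past behavior surface a single action for the user to confirm with a microgesture. Every field and rule in it is invented for illustration.

```python
# Toy illustration of the "intelligent click" idea described above: combine simple
# context signals with past behavior to surface one suggested action, which the
# user then confirms or dismisses with a single gesture. All fields and rules
# here are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Context:
    location: str        # e.g. "outdoors", "kitchen"
    activity: str        # e.g. "running", "idle"
    hour: int

# Stand-in for learned personal history: context -> most likely desired action.
HISTORY = {("outdoors", "running"): "Play running playlist?"}

def suggest(ctx: Context) -> Optional[str]:
    # Look up what the user usually does in a similar context.
    return HISTORY.get((ctx.location, ctx.activity))

suggestion = suggest(Context(location="outdoors", activity="running", hour=7))
if suggestion:
    print(suggestion, "-> confirm with a pinch microgesture, or ignore")
```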

More Questions Than Answers on Privacy

Facebook says its goal in building the far future of XR is to build technologies where “the human is the absolute center of the entire experience.”

Achieving the company’s vision will require hardware and software with an intimate understanding of both the user and their environment.

Facebook maintains that it’s committed to transparency throughout the development of these technologies, but admits that it isn’t equipped to assess the broader questions on its own.

“Understanding and solving the full extent of ethical issues [raised by these technologies] requires society-level engagement,” says FRL Research Science Director Sean Keller. “We simply won’t get there by ourselves, so we aren’t attempting to do so. As we invent new technologies, we are committed to sharing our learnings with the community and engaging in open discussion to address concerns.”

Indeed, the company says that a major reason why it’s sharing this information today is to engage the broader tech community on these questions before it moves to take the technology out of the lab and into the market.

“[…] we support and encourage our researchers to publish their work in peer-reviewed journals—and [that’s] why we’re telling this story today. We believe that far before any of this technology ever becomes part of a consumer product, there are many discussions to have openly and transparently about what the future of HCI can and should look like.”


Oculus Announces Two Sponsored Sessions at GDC Showcase Next Week

Ahead of a larger Game Developers Conference planned for July, the organization is hosting a virtual event next week, called GDC Showcase, from March 15th to 19th. Oculus, which has historically used GDC for major announcements and developer outreach, announced it will participate in two sponsored sessions.

While we don’t know if Oculus is planning to drop any significant news during the GDC Showcase, the company confirmed this week it’s supporting the event with two sponsored sessions:

Future of Gaming: Quality and Connection

Speakers: Jason Rubin (VP of Play, Facebook) | Michael Verdu (VP of Content, Facebook Reality Labs) | Denny Unger (CEO, Cloudhead Games) | Michael Carter (CEO, PlayCo)
Moderator: Dean Takahashi (Lead Writer, GamesBeat at VentureBeat)
Date: March 17th, 9:00–10:00AM PT

Quality Games and Player Connections Drive Healthy And Robust Gaming Ecosystems – Join Michael Verdu and Jason Rubin as they discuss the Oculus Quest Store and Facebook Instant Games catalog philosophies. They’ll share the evolution of both platforms and how each supports developers in creating high quality games. They’ll be joined by Denny Unger, CEO of Cloudhead Games, creator of Pistol Whip, and Michael Carter, CEO of PlayCo, creator of CatLife. Unger and Carter will discuss the shift in their studios’ focuses as they create quality games and social gameplay features for these platforms and others. This fireside chat will be moderated by Dean Takahashi, lead writer for GamesBeat at VentureBeat.

Squad Up! – Social and Multiplayer in VR with POPULATION: ONE

Speakers: Chia Chin Lee (Co-founder and CEO, BigBox VR) | Gabe Brown (Co-founder and CTO, BigBox VR) | Omid Yazdanshenas (Game Producer, Oculus)
Date: March 17th, 2:40–3:10PM PT

BigBox VR brought the battle royale genre to Quest for the first time last fall with POPULATION: ONE, a game where up to 18 players can team up and face off in an immersive VR battleground. Now, after just a few months, POP1 has surpassed $10M in Quest revenue, with ongoing updates and events to bring their community closer together. Join BigBox VR co-founders Chia Chin Lee and Gabe Brown for a conversation with Oculus Games Producer, Omid Yazdanshenas, that explores multiplayer social games in VR and how they architected the success of their VR mega-hit, POPULATION: ONE.

And, though not Oculus specifically, the developers of The Walking Dead: Saints & Sinners are planning a postmortem of their hit game in a session during GDC Showcase:

Making Great VR Games: A Postmortem on Skydance Interactive’s ‘The Walking Dead: Saints & Sinners’

Speakers: Chris Busse (Head of Studio, Skydance Interactive) | Guy Costantini (Vice President of Global Interactive Marketing, Skydance Media)
Date: March 18th, 10:05–10:35AM PT

In this discussion, Skydance Interactive will offer insight on creating an innovative VR gameplay experience and the challenges of working in a market that is still relatively nascent. They will also speak to their experiences working with Skybound Entertainment to create a compelling, original story set in The Walking Dead universe, bringing a comic book franchise to new life in VR, and to how the team is working to deliver even more great content for their best-selling VR game.

All sessions will be streamed, and you can watch them with a free registration for the GDC Showcase.


Facebook’s AR/VR Head Calls for “Big Shift” in How It Deals with User Privacy

Facebook Reality Labs head Andrew Bosworth released an internal memo, entitled “The Big Shift,” which underlines why the company needs to start building products now that better balance user privacy and user experience.

Even before Facebook moved to require all new Oculus users to sign in with Facebook, Oculus headset users were rightfully worried about the company’s treatment of user privacy. Facebook has a long track record of privacy scandals, including the Cambridge Analytica debacle, mass surveillance, and the amplification of misinformation (aka ‘fake news’). There’s more, but the list is comically long.

Virtual and augmented reality, though, open new and more intimate windows into user behavior, with biometric data obtained from VR/AR devices offering important vectors for understanding what makes each individual tick. It’s a treasure trove of user data which has largely gone untapped (and unleaked, as far as we know), but it won’t always be that way.

Now, Andrew Bosworth, the head of Facebook’s AR/VR Reality Labs team, is calling on his colleagues to put user privacy at the core of the company’s products. The ‘Big Shift’ memo, seen in part below, was obtained by Big Technology and first reported by OneZero.

“Starting in January we are changing the way we approach product development in FRL. Instead of imagining a product and trimming it down to fit modern standards of data privacy and security we are going to invert our process. We will start with the assumption that we can’t collect, use, or store any data. The burden is on us to demonstrate why certain data is truly required for the product to work. Even then I want us to scope it as aggressively as we can, holding a higher bar for sending data to the server than we do for processing it locally. I have no problem with us giving users options to share more if they choose (opt-in) but by default we shouldn’t expect it.”

In the memo, which was released December 22nd, Bosworth says he doesn’t simply aim to meet today’s expectations for user privacy, but wants to “differentiate our products on the basis of privacy. Let other companies scramble to keep up with us.”

Andrew Bosworth | Image courtesy Facebook

Bosworth, a 15-year Facebook veteran, joined Microsoft back in 2004. It wasn’t the same Microsoft we know today: the company was only then shifting to prioritize user security after a long history of ostensibly leaving virus and malware protection for third parties to sort out, a problem Bosworth attributes in the memo to “decades of buffer overruns and unchecked dereferences in a sprawling code base.”

Bosworth left Microsoft after a one-and-a-half-year stint as a software designer; he says public criticism subsequently pushed the company to reprioritize security, which helped make it the trusted leader in the field that it is today.

“Today Microsoft is considered perhaps the most trustworthy software vendor in the world. It is trusted by an overwhelming majority of enterprise companies. Having been on the outside since 2005 it was impressive to watch their persistence yield a gradual but definitive shift in their reputation. I think this is a model for us at Facebook. We should become the undisputed leaders in providing privacy aware software.”

Bosworth disagrees with the view that Facebook doesn’t care about balancing privacy and user experience, but he says that due to a recent shift in public sentiment, the company must “consider the consumer experience holistically rather than at optimizing for each individual feature.”

Facebook now offers a new set of privacy functions which reveal what data the company is collecting when you use its VR devices. That’s a move in the right direction; however, it’s clear the company as a whole still isn’t operating on the same wavelength. This month alone Facebook has faced a major backlash due to its mishandling of WhatsApp user privacy.

“The next step is for the priority of privacy to permeate the entirety of our culture, we’ve made inroads here but we have a long ways to go. Privacy Review should become a simple housekeeping exercise unless we detect further shifts in public attitudes towards privacy.”

Whether it was intentional or not, Bosworth’s memo strikes at the heart of the matter: companies of this size simply don’t act in your best interests when given free rein, and users need to prioritize privacy over user experience if they want to push entities like Facebook in that direction. It’s supposed to be a ‘Big Shift’ in the way Facebook currently operates, and we can see why.

“With new culture and new tools, [and] a concerted effort to revisit old products, we are on a long road to redemption. We shouldn’t expect any credit before it is due and perhaps not even until a good while afterwards. But we should nonetheless feel proud of the shift we are undertaking and confident in our ability to see it through.”

Facebook declined OneZero’s request to comment on the contents of the memo.


We’ve included the majority of ‘The Big Shift’ in this piece. You can check out the whole, unedited version here.


Oculus Quest 2 Surpasses Original Quest in Monthly Active Users

In a look back at 2020, Facebook Reality Labs head Andrew Bosworth revealed that Quest 2 quietly celebrated a few new milestones shortly after its October 13th launch.

In the blogpost, Bosworth says that despite the need for social distancing in 2020, it’s actually been a pretty great year for the company in terms of growing virtual reality.

“VR had a tremendous year. Oculus Quest 2 is our fastest-growing VR headset, thanks to the convergence of leading VR form factors and the content built by our developer community.”

Bosworth didn’t mention any hard figures, but he says that Quest 2 “surpassed the original Quest’s monthly active people in less than 7 weeks,” an impressive feat.

A Facebook spokesperson told Road to VR that this is based on concurrent measurements between the two headsets. However Quest 2 has also since surpassed Quest 1’s record high number of monthly active users, the company tells us.

Photo by Road to VR

Considering the original Quest managed to generate $150 million of revenue in app and game sales—calculated from its 2019 launch until shortly before Quest 2 was released—there’s no telling what Quest 2 will be able to accomplish in the same time frame.

Additionally, Bosworth says there are now more women using Quest 2 than any of the company’s previous headsets—which Facebook says is based on overall percentage, not simply a raw number.

None of this really comes as a surprise though; company CEO Mark Zuckerberg reported shortly after launch that Quest 2 had generated five times as many pre-orders as the original, so Quest 2 was well positioned to be a hit. Granted, it’s had something of a captive audience, with stay-at-home orders affecting many people; however, Bosworth says the company is also focusing on providing work-from-home solutions to help fill in the obvious gaps.

“This is the year we take steps to make immersive experiences more social with Facebook Horizon,” he says, referring to the company’s still-in-beta social VR platform. “And as the office concept evolves, we’re building out our capacity for meaningful social presence in virtual work spaces.”

What does 2021 hold for Facebook? Outside of eventually releasing Horizon for regular users, it’s hard to tell. We know the company is gearing up to one day deliver AR glasses thanks to its research done via Project Aria, a sensor-rich pair of glasses which the company will use to train its AR perception systems and assess public perception of the technology. That’s not likely happening any time soon though, so we’re interested to see just where the company goes in the meantime.


Facebook: Quest 2 Is Selling ‘Faster Than Quest’ And ‘Beyond’ Expectations

According to Chris Pruett, Director of Content Ecosystem at Oculus, speaking in an interview with Protocol Gaming, the Oculus Quest 2 is selling “faster than Quest” and is exceeding sales expectations.

Oculus Quest 2 Selling ‘Faster Than Quest’

This should come as no surprise to anyone who’s been following the industry at all over the last several weeks, as Facebook’s latest standalone VR headset has proven to be quite popular. We dubbed it the new king of VR in our review, with the massive caveat that you need to link an active and legitimate Facebook account to the device to use it.

Again, that’s a pretty massive string attached since it’s already resulting in several users getting locked out entirely, turning their shiny new VR headset into an expensive paperweight.

Read More: Facebook’s Account Verification Leaves Some Quest 2 Buyers With ‘Paperweight’

In the Protocol Gaming interview, Pruett said: “We really couldn’t be happier. The device is selling quite well…faster than Quest did…maybe a little beyond what we expected.”

Previously we’d heard from the developers of popular titles such as Pistol Whip, Apex Construct, Waltz of the Wizard, and more that they’re all seeing big bumps in sales now that Quest 2 is out. The Walking Dead: Saints & Sinners was one of the only big new launch titles for the headset (all existing Quest games also run on Quest 2), and Population: One, a VR battle royale shooter, drops on Quest and PC VR tomorrow with full crossplay.

We still don’t have any hard sales figures for any Oculus headsets at all, other than the Gear VR collaboration with Samsung from years ago, so it’s hard to tell what Pruett’s quote means in the grand scheme of things. That being said, we can clearly tell the original Quest sold well, and if this one is selling faster and beyond expectations, that’s certainly a good sign for overall VR adoption in mainstream society.

Let us know what you think down in the comments below!

Facebook Develops Hand Tracking Method to Let You Touch Type Without Needing a Keyboard

Some of the most basic questions surrounding AR/VR tech aren’t entirely solved yet, like making text input a comfortable and familiar experience. Facebook Reality Labs (FRL) today revealed new research into hand tracking which aims to bring touch typing to AR/VR users, all without the need for a physical keyboard.

There’s already basic hand tracking on Quest which lets you navigate system UI, browse the web, and play supported games like Waltz of the Wizard: Extended Edition (2019) without the need for Touch controllers, instead letting you reach out with your own two hands to cast spells and manipulate objects.

As interesting and useful as those use cases may be, we’re still very much in the infancy of hand tracking and its potential uses for virtual reality. Using your fingers as glorified laser pointers on a virtual keyboard reveals just how much of a gap is left in natural VR input methods. On that note, Facebook researchers have been trying to extend hand tracking to even more useful applications, and their most recent effort is aimed at solving one of the most frustrating things for VR/AR headset users to this day: text input.

VR keyboards haven’t evolved beyond this | Image courtesy Virtual Desktop

Facebook today revealed that its FRL researchers used a motion model to predict what people intended to type despite the erratic motion of typing on a flat surface. The company says its tech can isolate individual fingers and their trajectories as they reach for keys—information that simply doesn’t exist on touchscreen devices like smartphones and tablets.

“This new approach uses hand motion from a marker-based hand tracking system as input and decodes the motion directly into the text they intended to type,” FRL says. “While still early in the research phase, this exploration illustrates the potential of hand tracking for productivity scenarios, like faster typing on any surface.”

One of the biggest barriers to overcome was “erratic” typing patterns. Without the benefit of haptic feedback, researchers looked to other predictive fields in AI to tackle the issue of guessing where fingers would logically go next. FRL says its researchers borrowed statistical decoding techniques from automatic speech recognition, essentially substituting hand motion for phonemes in order to predict keystrokes—that’s the short of it anyway.

“This, along with a language model, predicts what people intended to type despite ambiguous hand motion. Using this new method, typists averaged 73 words per minute with a 2.4% uncorrected error rate using their hands, a flat surface, and nothing else, achieving similar speed and accuracy to the same typist on a physical keyboard,” the researchers say.
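FRL hasn’t published its models, but the decoding idea it describes—fusing a motion model’s per-keystroke likelihoods with a language model, borrowed from speech recognition—can be illustrated with a toy example. The probabilities below are made up, and a real system would use beam search over far larger models rather than the exhaustive search shown here.

```python
# Toy sketch of the decoding idea described above: combine per-keystroke
# likelihoods from a motion model with a character-level language model and
# search for the most probable text. The probabilities below are invented;
# FRL's actual models and decoder are not public.
import math
from itertools import product

# Motion model output: for each keystroke, P(key | observed finger motion).
motion_probs = [
    {"c": 0.6, "v": 0.4},
    {"a": 0.7, "s": 0.3},
    {"t": 0.5, "r": 0.5},
]

# Tiny character "language model": P(next_char | previous_char).
lm = {("c", "a"): 0.9, ("c", "s"): 0.1, ("a", "t"): 0.8, ("a", "r"): 0.2,
      ("v", "a"): 0.5, ("v", "s"): 0.5, ("s", "t"): 0.5, ("s", "r"): 0.5}

def score(candidate: str) -> float:
    """Log P(text) = log P(motion) + log P(language), as in ASR-style decoding."""
    s = 0.0
    for i, ch in enumerate(candidate):
        s += math.log(motion_probs[i].get(ch, 1e-9))
        if i > 0:
            s += math.log(lm.get((candidate[i - 1], ch), 1e-3))
    return s

# Exhaustive search over this tiny space; real decoders use beam search instead.
candidates = ["".join(chars) for chars in product(*[d.keys() for d in motion_probs])]
print(max(candidates, key=score))   # -> "cat"
```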

With its insights into hand tracking, Facebook is undoubtedly preparing for the next generation of AR headsets—the ‘always on’ sort of standalone AR headsets that you might wear in the car, at work, at home and only take off when it’s time to recharge. Using Quest 2 as a test bed for AR interactions sounds like a logical step, and although the company hasn’t said as much, we’re hoping to see even more cool hand tracking tech pushed out for experimental use on the new, more powerful standalone VR headset.


Sniper Elite VR Launching On Oculus Quest Alongside PC VR And PSVR

Today during the Facebook Connect digital event the Oculus Quest shooter library got a little bit beefier. Rebellion and Oculus revealed that Sniper Elite VR is officially coming to the Oculus Quest. 

Previously showcased as a PSVR exclusive, Sniper Elite VR is now taking aim at other platforms. Rebellion’s other VR projects, like Battlezone, eventually made their way to PC, so the expansion beyond PlayStation doesn’t come as a complete surprise.

The first time we got our hands on Sniper Elite VR was at E3 2019 and then again at PAX East earlier this year, but we’ve only tried it on PSVR using the PS Aim Controller both times. Admittedly, it plays so well with that controller peripheral that it was hard to imagine ever playing it any other way — but Rebellion eventually confirmed support for PC VR headsets too. It’s still quite surprising to imagine it could run so well on a standalone device like the Oculus Quest, but games like Onward and Phantom: Covert Ops have already made it happen. Now that we’ve seen the latest footage in the trailer embedded above, it’s looking like a really capable shooter.

According to the press release, it includes a full campaign, free smooth locomotion across all levels, the iconic X-ray kill cam redesigned for VR, and authentic weapon interactions.

For those worried about playing a game that’s just a collection of slow-paced sniper missions, it doesn’t look like you have much to be concerned with at all. The trailer is full of action, showcases a wide variety of weaponry, and looks like it could rival the likes of both Blood & Truth and Medal of Honor: Above and Beyond for delivering a breezy action-packed VR FPS campaign. 

We still don’t have a release date yet other than ‘Coming Soon’ but now Quest owners can also look forward to Sniper Elite VR right alongside PSVR and PC VR players. Let us know what you think of the game and its latest trailer down in the comments below!

Facebook Researching VR/AR ‘Hear-Through’ Technology For ‘Perceptual Superpowers’

Facebook researchers are investigating AR glasses featuring “hear-through” technology powered by specialized in-ear monitors for “enhanced hearing.”

The technology “would be able to recognize different types of events happening around you: people having conversations, the air conditioning noise, dishes and silverware clanking. Then using contextualized AI, your AR glasses would be able to make smart decisions, like removing the distracting background noise — and you’d be no more aware of the assistance than of a prescription improving your vision,” according to Facebook.

The work with beamforming, adaptive noise cancellation, and machine learning is described as “an area we’re just starting to explore” with the goal of enabling “perceptual superpowers” like “enhanced hearing.” One of the possible paths would use the “pattern of your head and eye movements” to “automatically enhance the sounds you want to hear and dim unwanted background noise.”
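Facebook hasn’t detailed its algorithms, but the beamforming it mentions can be illustrated with the textbook delay-and-sum approach: align and sum a small microphone array toward a steering angle (here imagined to come from head and eye tracking) so the attended source adds coherently while off-axis sound partially cancels. The array geometry and parameters below are assumptions for illustration, not Facebook’s design.

```python
# Textbook delay-and-sum beamforming sketch, not Facebook's system: steer a small
# linear microphone array toward an angle (here imagined to come from head/eye
# tracking) so sound from that direction adds coherently and other directions dim.
import numpy as np

FS = 16_000          # sample rate (Hz)
C = 343.0            # speed of sound (m/s)
SPACING = 0.04       # assumed mic spacing on a glasses frame (m)
N_MICS = 4

def delay_and_sum(mics: np.ndarray, steer_deg: float) -> np.ndarray:
    """mics: (n_mics, n_samples) array. Returns the enhanced mono signal."""
    delays_s = np.arange(N_MICS) * SPACING * np.sin(np.radians(steer_deg)) / C
    delays_smp = np.round(delays_s * FS).astype(int)
    out = np.zeros(mics.shape[1])
    for ch, d in zip(mics, delays_smp):
        out += np.roll(ch, -d)          # advance each channel to align the target
    return out / N_MICS

# Example: 4 mics of random noise, steered toward a talker 30 degrees to the right.
signal = np.random.default_rng(1).normal(size=(N_MICS, FS))
enhanced = delay_and_sum(signal, steer_deg=30.0)
print(enhanced.shape)
```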

Safe Use Questions

Michael Abrash, chief scientist at Facebook Reality Labs Research, responded to questions in a briefing call with journalists about how the use of this technology might affect social norms. Could a personal conversation at a restaurant table be heard by a nearby AR-enhanced patron? Would the nearby patrons ever know their conversation was heard by someone else? And would that conversation be less likely to happen in the first place if people knew mediated hearing was more common than it is at the time of this writing?

Abrash and Ravish Mehra, the audio team lead at FRL Research, gave a few examples of some potential mitigation strategies that might be employed to limit the range of this feature in the future. Mehra explained in a prepared statement they intend “to put guardrails around our innovation to do it responsibly, so we’re already thinking about potential safeguards we can put in place…for example, before I can enhance someone’s voice, there could be a protocol in place that my glasses can follow to ask someone else’s glasses for permission.”

Another line of research at Facebook may explore personalizing audio with a head-related transfer function (HRTF), which accounts for the individual shape of your ears, potentially using an “algorithm that can approximate a workable personalized HRTF from something as simple as a photograph of their ears.” A Facebook representative added that “the audio research team is considering several novel approaches to scaling the capture of people’s unique HRTFs.”

A Facebook blog post explains:

“Another issue the team is keenly aware of is the capture of sensitive ear data, both in the research phase and beyond. Today, before any data we collect is made available to researchers, it is encrypted and the research participant’s identity is separated from the data such that it is unknown to the researchers using the data. Once collected, it’s stored on secure internal servers that are accessible only to a small number of researchers with express permission to use it. The team also has regular reviews with privacy, security, and IT experts to make sure they’re following protocol and implementing the appropriate safeguards…”

AR vs. VR And In-Ear vs. Open-Ear

Facebook Reality Labs Research In-Ear Monitors (IEM) audio prototype.

Mehra explained in the call that different audio-based modifications might be accomplished in future AR or VR-based sound delivery systems depending on whether they feature in-ear or open-ear designs.

An open-ear audio design like the Valve Index or Oculus Quest, for instance, wouldn’t necessarily work well for the cancellation of sounds from the physical world, the researchers explained.

In-ear designs like Apple’s AirPods Pro might end up being a better fit for some use cases. They already feature “Active Noise Cancellation” and “Transparency mode” to switch between “depending on how much of your surroundings you want to hear.” Another in-ear design, Google’s Pixel Buds 2, offers an experimental feature called “attention alerts” that would lower volume when they detect a baby crying, a dog barking, or an emergency vehicle siren.

Facebook researchers, meanwhile, are using “prototype in-ear monitors” or “IEMs” which they say “can deliver the full experience of auditory superpowers. This lets us enhance the right sounds for you and dim others, making sure that what you really want to hear is clear even in loud background noise.”

“Our IEMs also feature perceptually transparent hear-through,” Audio Experiences Lead Scott Selfon explained in a prepared statement, “making it sound like I have nothing in my ears, and letting me safely hear the entire world around me.”

When asked if this same research could be applied to dynamically lower the sound from a VR headset to better hear someone talking to you in the same physical room, Abrash said he hadn’t thought of that before but it’s a “great idea.”

Future Aims

Facebook aims to make “stylish AR glasses” and pitches the research as being “focused on transforming communication for everyone, everywhere” while citing Johns Hopkins research suggesting one in five people have hearing loss and many “don’t use hearing aids for a variety of reasons including expense, social stigma, discomfort, and lack of reliability.”

“I’ve been wearing hearing aids since I was a little girl,” Technical Program Manager Amanda Barry said, in a prepared statement. “The ability to help people stay connected with their families as they get older and their hearing fades — that’s really pretty exciting.”

Facebook is also researching what it calls LiveMaps (an updating map of the physical world “with shared and private components”) and Codec Avatars (hyper-realistic personalized representations of human bodies) and either or both might be used in conjunction with the audio research. Overall, Facebook researchers seem set on a path to “defy distance” by providing “true social presence.”

“The only reason we need for virtual sound to be made real is so that I can put a virtual person in front of me and have a social interaction with them that’s as if they were really there,” Facebook Research Lead Philip Robinson explained in a prepared statement. “And remote or in person, if we can improve communication even a little bit, it would really enable deeper and more impactful social connections.”

Facebook recently launched a beta for its Horizon social network and added a new requirement that all future Facebook VR headsets be connected to a Facebook account. Starting in October, Facebook will be implementing new terms of service which state that “when people stand behind their opinions and actions, our community is safer and more accountable. For this reason, you must: Use the same name that you use in everyday life” when you’re using Facebook.

Facebook Wants to Build an AR Headset to Supercharge Your Hearing, Create a Custom HRTF from a Photograph

The Facebook Reality Labs Research team shared some of its latest audio initiatives today. The group aims to build technologies into an AR headset that will supercharge your hearing by making it easy to isolate the sound of your conversation in a noisy environment, and to be able to reproduce virtual sounds that seem like they’re coming from the real world around you. A custom HRTF (Head-related Transfer Function)—a digital version of the unique way each person hears sound based on the shape of their head and ears—is key to delivering such experiences, but the process is time consuming and expensive. The team is investigating a scalable solution which would generate an accurate HRTF from a simple photograph of your ear.

Facebook Reality Labs (FRL) is the newly adopted name of the team at Facebook which is building immersive technologies (including Oculus headsets). Facebook Reality Labs Research (FRLR) is the research & development arm of that team.

Today Facebook Reality Labs Research shared an update on a number of ongoing immersive audio research initiatives, saying that the work is “directly connected to Facebook’s work to deliver AR glasses,” though some of the work is broadly applicable to VR as well.

Spatial Audio

One of the team’s goals is to recreate virtual sounds that are “perceptually indistinguishable” from the sound of a real object or person in the same room with you.

“Imagine if you were on a phone call and you forgot that you were separated by distance,” says Research Lead Philip Robinson. “That’s the promise of the technology we’re developing.”

In order to achieve that goal, the researchers say there are two key challenges: 1) understanding the unique auditory characteristics of the listener’s environment, and 2) understanding the unique way that the listener hears sounds based on their physiology.

Understanding the acoustic properties of the room (how sounds echo throughout) can be done by estimating how the room should sound based on the geometry that’s already mapped from the headset’s tracking sensors. Combined with AI capable of estimating the acoustic properties of specific surfaces in the room, a rough idea of how a real sound would propagate through the space can be used to make virtual sounds seem as if they’re really coming from inside the same room.
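FRL hasn’t said exactly how it models a mapped room, but a classic first-order proxy for “how sounds echo throughout” is Sabine’s reverberation-time formula, which needs only the room volume plus per-surface areas and absorption coefficients—the kind of data a headset’s mapping and a material-estimating AI could plausibly provide. The sketch below uses that textbook formula with assumed values, not FRL’s actual model.

```python
# Illustration of the idea above using a textbook proxy, not FRL's method:
# once a headset has the room geometry and a guess at each surface's absorption,
# Sabine's formula gives a rough reverberation time (RT60) to drive virtual reverb.
def rt60_sabine(volume_m3: float, surfaces: list[tuple[float, float]]) -> float:
    """surfaces: list of (area_m2, absorption_coefficient) pairs."""
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption

# A 5 m x 4 m x 2.5 m living room with assumed absorption coefficients.
room = [
    (5 * 4, 0.30),                            # carpeted floor
    (5 * 4, 0.10),                            # ceiling
    (2 * (5 * 2.5) + 2 * (4 * 2.5), 0.05),    # painted walls
]
print(f"estimated RT60: {rt60_sabine(5 * 4 * 2.5, room):.2f} s")
```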

Facebook researchers also say that this information could be added to LiveMaps—an augmented reality copy of the real world that Facebook is building—and recalled by other devices in the same space in a way that the acoustic estimation could be improved over time through crowd-sourced data.

The second major challenge is understanding the unique way everyone hears the world based on the shape of their head and ears. The shape of your head and ears doesn’t just ‘color’ the way you hear; it’s also critical to your ability to identify where sounds are coming from around you. If you borrowed someone else’s ears for a day, you’d have a harder time pinpointing exactly where sounds were coming from.

The science of how sound interacts with differently shaped ears is understood well enough that it can be represented with a compact numeric function—called a Head-related Transfer Function (HRTF). But accurately measuring an individual’s HRTF requires specialized tools and a lengthy calibration procedure—akin to having a doctor test your eyes for a vision prescription—which makes it impractical to scale to many users.

To that end, Facebook Reality Labs Research says it hopes to “develop an algorithm that can approximate a workable personalized HRTF from something as simple as a photograph of [your] ears.”
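However an individual HRTF ends up being obtained—a full measurement or an estimate from a photo—using it is comparatively straightforward: filter a mono source with the left- and right-ear head-related impulse responses for the source’s direction. The sketch below shows that rendering step with placeholder impulse responses rather than real measurements.

```python
# Minimal sketch of how an HRTF gets used once you have it (however it was
# measured or estimated): filter a mono source with the left/right head-related
# impulse responses for the source direction. The HRIRs below are placeholders,
# not real measurements.
import numpy as np

def spatialize(mono: np.ndarray, hrir_left: np.ndarray, hrir_right: np.ndarray) -> np.ndarray:
    """Return a (n_samples, 2) stereo signal rendered for one source direction."""
    left = np.convolve(mono, hrir_left)[: len(mono)]
    right = np.convolve(mono, hrir_right)[: len(mono)]
    return np.stack([left, right], axis=1)

rng = np.random.default_rng(2)
mono = rng.normal(size=16_000)                    # one second of noise at 16 kHz
hrir_l = rng.normal(size=256) * np.hanning(256)   # stand-ins for measured HRIRs
hrir_r = np.roll(hrir_l, 20) * 0.7                # crude interaural delay + level cue
print(spatialize(mono, hrir_l, hrir_r).shape)     # (16000, 2)
```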

To demonstrate the work the team has done on the spatial audio front, it created a sort of mini-game where participants wearing a tracked pair of headphones stand in a room with several real speakers scattered throughout. The team then plays a sound and asks the participant to choose whether the sound was produced virtually and played through the headphones, or if it was played through the real speaker in the room. The team says that results from many participants show that the virtual sounds are nearly indistinguishable from the real sounds.

Context-aware Noise Cancellation

While “perceptually indistinguishable” virtual sounds could make it sound like your friend is right next to you—even when they’re communicating through a headset on the other side of the country—Facebook Reality Labs Research also wants to use audio to enhance real, face-to-face conversations.

One way they’re doing that is to create contextually aware noise cancellation. While noise cancellation technology today aims to reduce all outside sound, contextually aware noise cancellation tries to isolate the outside sounds that you want to hear while reducing the rest.

To do this, Facebook researchers built prototype earbuds and prototype glasses with several microphones, head tracking, and eye-tracking. The glasses monitor the sounds around the user as well as where they’re looking. An algorithm aims to use the information to figure out the subject the user wants to listen to—be it the person across the table from them, or a TV in the corner of the room. That information is fed to the audio processing portion of the algorithm that tries to sift through the incoming sounds in order to highlight the specific sounds from the subject while reducing the sounds of everything else.
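The selection step described above can be reduced to a toy: given the gaze direction and the bearings of detected sound sources, keep the source the listener is attending to at full level and turn the rest down. The angles, gain values, and source list below are assumptions purely for illustration, not Facebook’s algorithm.

```python
# Illustration only: the target-selection step described above, reduced to a toy.
# Given a gaze direction and the bearings of detected sound sources, boost the
# source the listener is attending to and turn the others down. Angles, gains,
# and the source list are all assumptions.
def mixing_gains(gaze_deg: float, source_bearings_deg: dict, beamwidth_deg: float = 20.0) -> dict:
    """Return a per-source gain: ~1.0 inside the gaze 'beam', heavily reduced outside."""
    gains = {}
    for name, bearing in source_bearings_deg.items():
        off_axis = abs((bearing - gaze_deg + 180) % 360 - 180)
        gains[name] = 1.0 if off_axis <= beamwidth_deg else 0.1
    return gains

sources = {"friend across the table": 5.0, "TV in the corner": 70.0, "kitchen clatter": -120.0}
print(mixing_gains(gaze_deg=0.0, source_bearings_deg=sources))
# -> friend stays at full level, TV and kitchen clatter are dimmed
```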

– – — – –

Facebook is clear that it is working on this technology with the goal of eventually bringing it to AR and VR headsets. And while researchers say they’ve proven out many of these concepts, it isn’t yet clear how long it will be until they can be brought out of the lab and into everyday headsets.
