Meta Reveals VR Headset Prototypes Designed to Make VR ‘Indistinguishable From Reality’

Meta says its ultimate goal with its VR hardware is to make a comfortable, compact headset with visual fidelity that’s ‘indistinguishable from reality’. Today the company revealed its latest VR headset prototypes, which it says represent steps toward that goal.

Meta has made no secret that it’s pouring tens of billions of dollars into its XR efforts, much of which is going to long-term R&D through its Reality Labs Research division. Apparently in an effort to shine a bit of light onto what that money is actually accomplishing, the company invited a group of press to sit down for a look at its latest accomplishments in VR hardware R&D.

Reaching the Bar

To start, Meta CEO Mark Zuckerberg spoke alongside Reality Labs Chief Scientist Michael Abrash to explain that the company’s ultimate goal is to build VR hardware that meets all the visual requirements to be accepted as “real” by your visual system.

VR headsets today are impressively immersive, but there’s still no question that what you’re looking at is, well… virtual.

Inside Meta’s Reality Labs Research division, the company uses the term ‘visual Turing Test’ to represent the bar that needs to be met to convince your visual system that what’s inside the headset is actually real. The concept is borrowed from the original Turing Test, which denotes the point at which a human can no longer tell the difference between another human and an artificial intelligence.

To completely convince your visual system that what’s inside the headset is actually real, Meta says, a headset needs to pass that “visual Turing Test.”

Four Challenges

Zuckerberg and Abrash outlined what they see as four key visual challenges that VR headsets need to solve before the visual Turing Test can be passed: varifocal, distortion, retina resolution, and HDR.

Briefly, here’s what those mean:

  • Varifocal: the ability to focus on arbitrary depths of the virtual scene, supporting both essential focus functions of the eyes (vergence and accommodation)
  • Distortion: lenses inherently distort the light that passes through them, often creating artifacts like color separation and pupil swim that make the existence of the lens obvious
  • Retina resolution: having enough resolution in the display to meet or exceed the resolving power of the human eye, such that there’s no evidence of underlying pixels
  • HDR: also known as high dynamic range, the range of darkness and brightness we experience in the real world, which almost no display today can properly emulate

The Display Systems Research team at Reality Labs has built prototypes that function as proof-of-concepts for potential solutions to these challenges.

Varifocal

Image courtesy Meta

To address varifocal, the team developed a series of prototypes which it called ‘Half Dome’. In that series the company first explored a varifocal design which used a mechanically moving display to change the distance between the display and the lens, thus changing the focal depth of the image. Later the team moved to a solid-state electronic system which resulted in varifocal optics that were significantly more compact, reliable, and silent. We’ve covered the Half Dome prototypes in greater detail here if you want to know more.

Virtual Reality… For Lenses

As for distortion, Abrash explained that experimenting with lens designs and distortion-correction algorithms that are specific to those lens designs is a cumbersome process. Novel lenses can’t be made quickly, he said, and once they are made they still need to be carefully integrated into a headset.

To allow the Display Systems Research team to work more quickly on the issue, the team built a ‘distortion simulator’, which actually emulates a VR headset using a 3DTV, and simulates lenses (and their corresponding distortion-correction algorithms) in-software.

Image courtesy Meta

Doing so has allowed the team to iterate on the problem more quickly; the key challenge is to dynamically correct lens distortions as the eye moves, rather than merely correcting for what is seen when the eye is looking through the center of the lens.
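
To make that idea concrete, here’s a minimal Python sketch of what gaze-dependent distortion correction looks like in principle: a simple radial-polynomial model whose coefficients are interpolated for the current gaze angle. The function names and coefficient values are purely illustrative assumptions, not Meta’s actual correction pipeline.

```python
import numpy as np

def radial_correction(uv, k1, k2):
    """Apply a simple radial-polynomial correction to normalized image
    coordinates uv (N x 2), centered on the lens axis."""
    r2 = np.sum(uv ** 2, axis=1, keepdims=True)
    scale = 1.0 + k1 * r2 + k2 * r2 ** 2
    return uv * scale

def gaze_dependent_correction(uv, gaze_deg, gaze_samples_deg, k1_table, k2_table):
    """Interpolate distortion coefficients for the current gaze angle, then
    correct. A static correction would always use the 0-degree entry; pupil
    swim is what you see when the eye moves but the coefficients don't."""
    k1 = np.interp(gaze_deg, gaze_samples_deg, k1_table)
    k2 = np.interp(gaze_deg, gaze_samples_deg, k2_table)
    return radial_correction(uv, k1, k2)

# Hypothetical coefficient tables measured at several gaze angles.
gaze_samples = np.array([0.0, 10.0, 20.0, 30.0])
k1_table = np.array([0.22, 0.24, 0.27, 0.31])
k2_table = np.array([0.05, 0.06, 0.08, 0.11])

points = np.array([[0.0, 0.0], [0.3, 0.1], [0.5, 0.5]])
print(gaze_dependent_correction(points, 15.0, gaze_samples, k1_table, k2_table))
```

The simulator described above lets researchers swap out the coefficient tables (and entire lens models) in software rather than waiting for new glass to be manufactured.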

Retina Resolution

Image courtesy Meta

On the retina resolution front, Meta revealed a previously unseen headset prototype called Butterscotch, which the company says achieves a retina resolution of 60 pixels per degree, enough for 20/20 visual acuity. To do so, the team used extremely pixel-dense displays and reduced the field-of-view—in order to concentrate the pixels over a smaller area—to about half that of Quest 2. The company says it also developed a “hybrid lens” that would “fully resolve” the increased resolution, and it shared through-the-lens comparisons between the original Rift, Quest 2, and the Butterscotch prototype.

Image courtesy Meta

There are already headsets on the market that offer retina resolution—like Varjo’s VR-3—but only a small area in the middle of the view (27° × 27°) hits the 60 PPD mark; anything outside of that area drops to 30 PPD or lower. Meta’s Butterscotch prototype ostensibly delivers 60 PPD across the entirety of its field-of-view, though the company didn’t explain to what extent resolution is reduced toward the edges of the lens.
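
For a rough sense of what those pixels-per-degree figures mean, here’s a quick Python sketch. The Quest 2 figures used here (~1,832 horizontal pixels per eye over roughly 100 degrees) and the ~50-degree Butterscotch field of view are approximations for illustration; Meta hasn’t published the prototype’s exact panel specs.

```python
def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
    """Average angular pixel density across the field of view."""
    return horizontal_pixels / horizontal_fov_deg

# Quest 2 (approx.): ~1832 horizontal pixels per eye over ~100 degrees -> ~18 PPD.
print(pixels_per_degree(1832, 100))   # ~18.3

# The same panel with the field of view cut in half doubles the density.
print(pixels_per_degree(1832, 50))    # ~36.6

# Hitting 60 PPD over a ~50 degree FOV requires on the order of 3,000
# horizontal pixels, which is why a far denser panel is needed too.
print(60 * 50)                        # 3000
```

This is why Butterscotch pairs a narrower field of view with a much denser display: both levers push the pixels-per-degree figure toward the retinal threshold.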

Facebook’s Display Research Lead: Varifocal Half-Dome 3 ‘Almost Ready For Prime Time’

Facebook’s Director of Display Systems Research Douglas Lanman revealed the Half-Dome 3 prototype shown late last year is further along the development cycle than some other research projects.

“It’s something that is not quite a publication. It’s beyond that. And this thing is more like level five. It’s almost ready for primetime.”

Speaking at an SPIE talk earlier in the year, Lanman was referencing NASA’s Technology Readiness Levels (TRLs), adapted for tech products.

Lanman works at Facebook Reality Labs, the company’s VR/AR research division. He explained that a “real-deal scientist” works at TR1, while his team usually works between TR2 and TR4.

Startups, Lanman elaborated, typically start at TR6 or TR7, whereas shipping a consumer product at scale involves TR7, TR8, and TR9.

“At some point, you walk into an electronics store and you see level nine. And you start to wonder, am I just going to do this cycle of level 2 through 4 forever? How do I actually change the world rather than just have good ideas?”

Lanman explained that what makes him proud of Half-Dome 3 in particular is that it’s at “Level 5 or beyond”. More specifically, it’s “almost ready for prime time”.

https://www.youtube.com/watch?v=YWA4gVibKJE

All current VR headsets, outside of lab prototypes, have fixed focus lenses. Your brain gets a different image for each eye, but the images are permanently focused at the same distance (usually a few meters away).

This makes VR feel less real since you’re missing one of the real world’s depth cues. It also introduces the vergence-accommodation conflict, a leading cause of eye strain in head-mounted displays.
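
To put rough numbers on that conflict, here’s a small Python sketch comparing where the eyes converge with where a fixed-focus headset makes them focus. The 2 m fixed-focus figure matches what researchers typically cite for consumer headsets; the 63 mm interpupillary distance is an assumed average.

```python
import math

def vergence_angle_deg(object_distance_m, ipd_m=0.063):
    """Angle between the two eyes' lines of sight when fixating an object."""
    return 2.0 * math.degrees(math.atan((ipd_m / 2.0) / object_distance_m))

def accommodation_conflict_diopters(object_distance_m, fixed_focus_m=2.0):
    """Difference between where the eyes should focus (the virtual object)
    and where the headset's fixed-focus optics actually focus them."""
    return abs(1.0 / object_distance_m - 1.0 / fixed_focus_m)

for d in (0.3, 0.5, 1.0, 2.0):
    print(f"{d} m: vergence {vergence_angle_deg(d):.1f} deg, "
          f"focus mismatch {accommodation_conflict_diopters(d):.2f} D")
# An object at arm's length (~0.3 m) demands ~2.8 diopters more accommodation
# than a 2 m fixed-focus display provides, which is where eye strain shows up.
```

Varifocal optics close that gap by moving the focal plane to match wherever the eyes are converged.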

The original Half-Dome prototype, shown off at F8 in May 2018, aimed to deliver a near-silent variable focus (varifocal) experience. It also increased the field of view significantly (from around 100 degrees to 140 degrees wide) while maintaining roughly the same form factor.

Half-Dome 1 used physical actuators to move the position of each lens relative to the displays. A product that relies on constant movement like that for hours on end seems like a reliability nightmare waiting to happen, which is where Half-Dome 3 comes in.

Half-Dome 1, Half-Dome 2, and Half-Dome 3

Half-Dome 3 was first shown by Facebook’s Chief Scientist Michael Abrash (Lanman’s boss) in October at Oculus Connect 6. The field of view is “20% wider than Quest”, but no longer 140 degrees.

But Half-Dome 3 brought the technology to a considerably smaller design, with no moving parts. This is achieved by using a stack of liquid crystal lens layers. Applying a voltage to each lens layer changes its focal length, so each unique combination of on and off results in a different plane of focus.

Half Dome 3

At another talk given this year, Lanman expressed just how important he feels variable focus and depth cues are for virtual reality. Given this goal, the number of varifocal prototypes we’ve seen, and Half-Dome 3’s compact form factor and lack of moving parts, could varifocal technology now be out of the research labs and at the beginning of the path to productization?

To be clear, there’s no indication the recent ‘Oculus Quest 2’ leaks have any such technology.

OC6: New ‘Half Dome’ Prototypes Smaller, Lighter & More Reliable, But Lack 140 Degree FoV

At Oculus Connect 6 today, Facebook showed off two successors to the ‘Half Dome’ varifocal headset it first showed at F8 2018.

Half Dome 1, 2, and 3

Half Dome was an advanced prototype headset with moving varifocal displays and a 140 degree field of view. Varifocal means that the headset can adjust the focus distance to be the same as the distance to the virtual object you’re looking at, which makes VR feel more real and reduces eyestrain.

While Half Dome 1 was of course impressive, the space and weight of the varifocal system did not line up with the goal for next generation VR to be light and compact (so that it can be worn for hours). Additionally, Facebook did not comment on the reliability of the mechanical system.

Half Dome 2

Half Dome 2 uses improved optics and actuators to deliver the varifocal experience in a smaller form factor. It’s 200 grams lighter than the original and substantially more compact.

Half Dome 2

Facebook also claims Half Dome 2 is more reliable than the original, thanks to a new varifocal mechanism based on voice coil actuators and flexure hinge arrays, which reduce friction.

However, Half Dome 2 has a narrower field of view than the original. Facebook’s Michael Abrash stated that the field of view was still 20% wider than Quest, which would suggest somewhere around 120 degrees.

Half Dome 3

‘Half Dome 3’ is a more radical improvement. While the first two Half Domes use moving displays, Half Dome 3 uses a liquid crystal lens “made from a thin, alternating stack of two flat optical elements: polarization-dependent lenses (PDLs) and switchable half-wave plates”.
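
As a rough illustration of why a stack of switchable layers is so powerful, here’s a Python sketch that enumerates the focal planes produced by every on/off combination of the stack, using the fact that thin-lens optical powers simply add. The layer powers below are invented for illustration and are not Facebook’s actual values.

```python
from itertools import product

def focal_planes(layer_powers_diopters, base_power_diopters=0.5):
    """Enumerate the focus distance produced by every on/off combination of a
    stack of switchable lens layers. Thin-lens powers (in diopters) add, so
    N binary layers give up to 2**N distinct focal planes."""
    planes = set()
    for states in product((0, 1), repeat=len(layer_powers_diopters)):
        power = base_power_diopters + sum(
            p for p, on in zip(layer_powers_diopters, states) if on)
        planes.add(round(1.0 / power, 3))  # focus distance in meters
    return sorted(planes)

# Hypothetical stack: three layers with binary-weighted optical powers.
print(focal_planes([0.5, 1.0, 2.0]))
# Eight focus distances, from 0.25 m (all layers on) out to 2.0 m (all off).
```

A handful of thin electrically switched layers can therefore cover a whole range of focus distances with no moving parts at all, which is the core appeal of the Half Dome 3 approach.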

Half Dome 3

This means Half Dome 3 has no moving parts, so it isn’t subject to the reliability concerns of the other prototypes, making it a much more realistic prospect for one day becoming a mass-producible headset. However, Facebook was clear that this won’t be happening “any time soon”.

The design also allows the headset to be significantly smaller than even Half Dome 2. This is potentially the most compact full-featured VR headset we’ve seen shown yet. If Facebook’s next generation headsets can one day adopt this varifocal technology and this form factor, it will enable VR to be both visually and ergonomically comfortable for hours on end.

Oculus Chief Scientist Dives Deep Into the Near Future of AR & VR

In his latest presentation at Oculus Connect 5, Oculus Chief Scientist Michael Abrash took a fresh look at the five-year VR technology predictions he made at OC3 in 2016. He believes his often-referenced key predictions are “pretty much on track,” albeit delayed by about a year, and that he “underestimated in some areas.”

Optics & Displays

Image courtesy Oculus

Revisiting each area of technology in turn, Abrash began by discussing optics and displays. His predictions for headset capabilities in the year 2021 were 4K × 4K resolution per-eye, a 140 degree field of view, and variable depth of focus.

“This is an area where I clearly undershot,” he said, noting that Oculus’ own Half Dome prototype shown earlier this year had already met two of these specifications (140 degree FOV and variable focus), and that display panels matching the predicted resolution have already been shown publicly.

Abrash highlighted the rapidly progressing area of research around varifocal displays, saying that they had made “significant progress in solving the problem” with the AI-driven renderer DeepFocus that can achieve “natural, gaze-contingent blur in real time,” and that they would be publishing their findings in the coming months.

Image courtesy Oculus

Beyond Half Dome, Abrash briefly mentioned two potential solutions for future optics: pancake lenses and waveguides. Like Fresnel lenses, the pancake lens isn’t a new innovation, but it is “only now becoming truly practical.” By using polarization-based reflection to fold the optical path into a small space, Abrash says pancake lenses have the potential to reach retinal resolution and a 200 degree field of view, but there would have to be a tradeoff between form-factor and field of view. Because of the way pancake lenses work, “you can get either a very wide field of view or a compact headset […] but not both at the same time,” he said.

Image courtesy Oculus

But waveguides—a technology being accelerated by AR research and development—theoretically have no resolution or field of view limitations and are only a few millimeters thick; they could eventually result in an incredibly lightweight headset at any desired field of view and at retina resolution, though that is still many years away.

Foveated Rendering

Moving on to graphics, Abrash’s key prediction in 2016 was that foveated rendering would be a core technology within five years. He extended his prediction by a year, saying he now expects it to happen within four years from now, and added that the rendering approach will likely be enhanced by deep learning. He showed an image with 95% of the pixels removed, with the density of remaining pixels falling off away from the point of focus. The rest of the image was reconstructed efficiently through a deep learning algorithm, and it was impressively similar to the original full resolution version, ostensibly close enough to fool your peripheral vision.

Foveated rendering ties closely with eye tracking, the technology Abrash thought was the most risky of his predictions in 2016. Today he is much more confident that solid eye tracking will be achieved (it is already part of the way there in Half Dome), but this prediction was also extended by a year.
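
As a rough illustration of the sampling side of the demo Abrash described (and not Facebook’s actual method), here’s a Python sketch that builds a foveated mask in which the probability of keeping a pixel falls off with distance from the gaze point; a learned model would then reconstruct the dropped pixels. The fovea radius and floor probability are made-up parameters.

```python
import numpy as np

def foveated_mask(height, width, gaze_xy, fovea_radius_px=60, floor=0.01, seed=0):
    """Keep every pixel near the gaze point and randomly drop pixels further
    out, with keep probability decaying with distance from the gaze point."""
    rng = np.random.default_rng(seed)
    ys, xs = np.mgrid[0:height, 0:width]
    dist = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1])
    keep_prob = np.clip(np.exp(-(dist / fovea_radius_px) ** 2), floor, 1.0)
    return rng.random((height, width)) < keep_prob

mask = foveated_mask(1080, 1200, gaze_xy=(600, 540))
print(f"pixels kept: {mask.mean():.1%}")  # only a few percent, densest at the gaze point
# A deep-learning reconstruction step, as Abrash described, would then fill in
# the dropped pixels so the periphery still looks plausible.
```

The renderer only has to shade the kept pixels, which is where the large performance savings come from, and is why solid eye tracking is a prerequisite for the technique.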

Spatial Audio

Spatial audio was the next topic, and Abrash conceded that his prediction of personalised Head-Related Transfer Functions (functions capturing how the unique geometry of each person’s ear shapes the soundfield they perceive around them) becoming a standard part of the home VR setup within five years might also need to be extended, but he described how a recent demo experience convinced him that “audio Presence is a real thing.” Clearly the technology already works, but the personalised HRTF used for this demonstration involved a 30-minute ear scan followed by “a lengthy simulation,” so it’s not yet suitable for a consumer-grade product.
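
For context on how a personalized HRTF is actually used at runtime, here’s a minimal Python sketch: a mono source is convolved with left- and right-ear impulse responses measured for the listener and the source direction. The impulse responses below are placeholders; real ones would come from the kind of ear scan and simulation described above.

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal binaurally by convolving it with the listener's
    head-related impulse responses (HRIRs) for a given source direction."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)

# Placeholder HRIRs of equal length: a real pipeline would look these up from
# the listener's personalized measurement for the source's azimuth/elevation.
hrir_l = np.array([0.0, 0.9, 0.3, 0.1, 0.0])   # earlier, louder arrival at the left ear
hrir_r = np.array([0.0, 0.0, 0.5, 0.2, 0.1])   # delayed, attenuated arrival at the right

mono_source = np.sin(2 * np.pi * 440 * np.arange(48000) / 48000)
binaural = spatialize(mono_source, hrir_l, hrir_r)
print(binaural.shape)  # (2, 48004): a stereo pair carrying the spatial cues
```

The convolution itself is cheap; the hard, still-unsolved part Abrash pointed to is obtaining each listener’s personalized impulse responses without a 30-minute scan.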

Controllers & Input

Image courtesy Oculus

Regarding controllers, Abrash stood by his predictions of Touch-like controllers remaining the primary input device in the near future (alongside hand tracking). After running a short clip of one of Oculus’ haptic glove experiments, he adjusted his previous opinion that haptic feedback for hands wasn’t even on the distant horizon, saying that “we’ll have useful haptic hands in some form within ten years.”

Ergonomics & Form Factor

Abrash presented this sleek concept as a plausible form factor for an AR/VR headset once waveguide optics are mastered. | Image courtesy Oculus

On the subject of ergonomics, Abrash referred to the increasingly significant technology overlap between VR and AR research, noting that future VR headsets will not only be wireless, but could be made much lighter by using the two-part architecture already introduced on some AR devices, where heavy components such as the battery and compute hardware could be placed in a puck that goes in your pocket or on your waist. He said this companion device could also link wirelessly to the headset for complete freedom of motion.

Even so, optical limitations are largely the bottleneck keeping VR headsets from approaching a ski-goggle-like design, but advances in pancake and waveguide optics could make for significantly more slender headsets.

OC5: 4K Oculus Half Dome Prototype Would Be ‘Straightforward’

According to Oculus’ Michael Abrash, fitting the company’s new Half Dome prototype with 4K displays would be “straightforward”.

Abrash said as much during his keynote speech at Oculus Connect 5 today. He explained that the current Half Dome prototype, which made its debut at F8 earlier this year, has a resolution “roughly” the same as the Rift. He later added that “4K panels that would provide 30 pixels per degree over a 140 degree field of view have already been shown publicly, and using one in Half Dome would be straightforward.”

4K resolution panels will be essential to giving us clearer VR experiences in the future, further reducing the screen door effect (SDE) seen in current headsets.

As Abrash alludes to in the quote, Half Dome also sports a 140 degree field of view and even varifocal displays that adapt to where the user is looking to accurately produce focal depth in VR. We were hoping we might get a first look at the device at Connect this year, though Oculus is focusing on its new standalone headset, Quest. Earlier in the day, Facebook’s Hugo Barra noted that Quest completed Oculus’ first generation of devices, lending more evidence to the idea that Half Dome will eventually materialize as Rift 2.

When we’ll actually see that happen remains unclear, although according to Abrash it could be a little sooner than we think.

What Magic Leap One And Facebook’s Half Dome Have In Common

iFixit revealed a full teardown of Magic Leap One Creator Edition recently along with some long-awaited clarity about how the system operates.

The teardown from iFixit credits Oculus founder (and former Facebook employee) Palmer Luckey with helping tear down the “Lightwear” glasses, which rely on a “Lightpack” processing unit that’s wired and worn on your side. Head on over to iFixit to see all the teardown photos or to Luckey’s website to read his review of the device. He also, by the way, suggests the headset sold fewer than 3,000 units so far.

The following illustration from iFixit breaks down the basic way the system functions as content flows from Lightpack to Lightwear.

Diving a little deeper, iFixit explains Magic Leap One uses color-specific waveguides to deliver visuals at two distinct focal planes, each composed of red, green, and blue. This makes Magic Leap One a fixed multifocal display.

Not Exactly A Light Field Display

Don’t take it as a definitive guide, but the chart below shows a variety of display types for mixed reality as seen in a presentation by Facebook Reality Labs researcher Douglas Lanman at Display Week earlier this year.

Lanman’s presentation explained Facebook’s steps toward an opaque varifocal headset for VR. That’s quite a bit different from a see-through fixed multifocal headset like Magic Leap is using for AR.

In the chart above, you can see fixed focus headsets at the top. Those are the types of headsets we have today like HoloLens, Rift, Vive, PSVR, Daydream, and Gear VR. Fixed focus headsets usually fix your focus at a distance and render the environment and all digital elements at that distance.

From the chart above, a varifocal headset like Half Dome and a multifocal headset like Magic Leap One have something in common — they both depend on eye tracking to get digital objects to the right focal depth. Magic Leap One apparently decides on which plane to put content by flashing imperceptible lights into the corners of your pupil and measuring changes in the reflections.
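
As a back-of-the-envelope illustration (not Magic Leap’s actual algorithm), here’s a Python sketch of how eye-tracked vergence could drive the choice between two fixed focal planes. The plane distances, switching threshold, and gaze-slope model are all assumptions for illustration.

```python
def vergence_depth_m(left_gaze_x, right_gaze_x, ipd_m=0.063):
    """Very rough depth estimate from the horizontal gaze slopes of the two
    eyes (how far each eye toes in toward the fixation point)."""
    convergence = left_gaze_x - right_gaze_x   # total toe-in of the two eyes
    if convergence <= 0:
        return float("inf")                    # eyes parallel: looking far away
    return ipd_m / convergence

def choose_focal_plane(depth_m, near_plane_m=0.5, far_plane_m=1.5, switch_at_m=1.0):
    """Snap content to whichever of the two fixed planes is closer to where
    the eyes are converged."""
    return near_plane_m if depth_m < switch_at_m else far_plane_m

depth = vergence_depth_m(left_gaze_x=0.05, right_gaze_x=-0.05)
print(depth, choose_focal_plane(depth))  # ~0.63 m of convergence -> near plane
```

A varifocal system like Half Dome uses essentially the same vergence estimate, but drives a continuously adjustable focus instead of snapping between two fixed planes.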

Here are those lights as seen through a high-frame rate camera:

Tracking eye movements is not unique to Magic Leap. Apple bought SMI, one of the leaders in this area, and many other companies are exploring the technology with partners like Tobii. StarVR’s new ultra-high end commercial VR headset relies on eye-tracking to enable foveated rendering over an ultra-wide field of view and VRgineers’ enterprise XTAL headset uses eye-tracking to adjust IPD mechanically. If Facebook’s Half Dome ever makes it to market it likely will need eye-tracking too.

Optics And Eye Strain

Will VR and AR headsets be comfortable enough for most people to wear most of the day? The answer to this question may be related to the type of optical design used.

So far, most headsets from HoloLens to Oculus Rift and Vive Pro have been fixed-focus headsets which struggle with the vergence-accommodation conflict. When you look at digital objects that are rendered as if they were close to you, your eyes naturally want to point inward toward one another. Since the headset’s optics focus your eyes at a different, fixed distance, the mismatch can create eyestrain in current headsets. In Stanford researcher Jeremy Bailenson’s book Experience on Demand he noted “most academics and thought leaders in VR believe this problem will prevent long-term use of headsets.”

I asked Bailenson for clarification to see if a fixed multifocal or varifocal headset would make a difference.

“There is a generally shared viewpoint among engineers and perceptual scientists that solving the vergence-accommodation conflict will reduce eye-strain for longer use cases of AR and VR,” Bailenson wrote in an email. “I have only seen one demo that can linearly shift the focus of a display (as opposed to swap between a handful of focal points). It was incredible from pure ‘amazement’ standpoint. But there is not data that I am aware of that quantifies the degree of improvement there will be regarding simulator sickness, especially in terms of long-term use.”

Here’s how Lanman described the problem to me earlier this year:

“Nearly all consumer HMDs present a single fixed focus. Some have focus knobs, but most just lock the optical focus of the displays to something around two meters. When you look at a near object, vergence (eye rotation) and accommodation (deformation of the eye’s crystalline lens) move together. As your lens deforms to focus on a nearby virtual object, it is focusing away from the fixed focus of the HMD. So, most people report seeing some blur. Sustained vergence-accommodation conflict has been linked, in prior vision science publications, to visual fatigue, including eye strain.”

Magic Leap One appears to be the first standalone AR headset which begins to track eye movements to place digital content at a couple different depths. Is this approach more comfortable for your eyes? The answer isn’t clear yet, but Luckey’s technical breakdown notes getting this right “is even more important for AR than VR, since you have to blend digital elements with real-world elements that are consistently correct.”

While a modern VR headset already delivers digital content over a far wider field of view than current AR headsets, the varifocal Half Dome architecture Facebook debuted earlier this year would also track eye movements to get the focal distance right all the time, at least theoretically.

Half Dome physically moves elements of the display in tandem with pupil movement:

So with a headset based on Half Dome you’d see more of a virtual world in your periphery than current VR headsets and your eyes might be more relaxed and able to see clearer objects up close. Compare this possible roadmap for VR headsets to what we’ve seen from AR.

The field of view on the original HoloLens AR headset is so limited that certain applications, like gaming, generally aren’t fun when you constantly see characters or objects getting cut off.

Teardown showing the components of the original HoloLens

Magic Leap One improves the field of view so wearers notice the limitation less frequently. Microsoft’s next generation HoloLens, due in 2019, could improve the field of view further for AR. Will the next HoloLens also ditch the fixed-focus display?

Indoor vs. Outdoor

There is a long way to go before AR and VR headsets can meet in a middle ground where one headset could be a dual-mode “mixed reality” device great for both use cases.

Something that often gets muddled in discussion of VR and AR is outdoor use. Current headsets don’t have tracking systems in place to make outdoor operation safe and reliable yet. I’m also not sure people will want to wear fanny packs filled with batteries in order to enjoy AR throughout their day. Plus, see-through AR displays are usually most effective indoors in relatively dimly lit locations. For all these reasons, using any headset outdoors is mostly an academic concept in 2018. The dream of “mixed reality” is one day for a single headset to turn from perfectly transparent (AR) to completely opaque (VR) with digital objects shown throughout its wearer’s entire field of view in either mode. But there are hard engineering problems to overcome.

One simple approach is to slap a sunglasses-like filter onto a transparent AR display, effectively turning it into a VR display at any moment. This quick fix won’t change the underlying optics and field of view, though, or reduce the needs of power-hungry tracking systems which are necessary to establish a sense of safety and confidence for the person wearing a headset.

Fixed focus VR displays, though, like the kind popularized in part by Luckey and Oculus, are very cheap and work great indoors. You could turn the lights out and dance in the dark with an Oculus Rift on and you could still feel like you’re standing on the beach on a sunny day. Facebook’s Half Dome might be years away from transformation into something consumer ready, but by increasing field of view, and adding both eye-tracking and variable focus to a VR headset, Facebook is showing that it is effectively trying to engineer the perfect display technology specifically for VR and indoor use.

Facebook’s head VR researcher Michael Abrash making predictions in 2016 about VR headset capabilities in 2021.

Magic Leap raised more than $2 billion over the last six years or so, to some degree, by promising a revolution to AR optics by way of a “photonic light field chip” and a whale that could fill a gymnasium.

But that’s not Magic Leap One, and it is not clear how Magic Leap Two or Three would improve things. Instead of a whale, you can enjoy a small car darting around your room with Magic Leap One. The car is likely to keep you so interested in its movements that you’ll never notice the edge of the display as it darts around directly in front of you. That’s about the best an AR headset can do in 2018.

Don’t Use Mixed Reality When You Mean AR or VR

Some investors and entrepreneurs remain entranced by the potential of AR headsets to change the way we carry out our everyday lives both inside and outside our homes. Luckey seems to think it is these people — not developers — who represent the bulk of the folks buying Magic Leap One. That partially explains why Luckey refers to Magic Leap One as a “Tragic Heap.” For years, Magic Leap’s CEO Rony Abovitz raised money to build this headset while complaining publicly about the optics of current VR systems and promising a “photonic lightfield chip” to leap ahead of the market. Even now, Abovitz claims their headset “utilizes our unique optics to correct issues found in classic stereoscopic systems.”

Well, the time for Magic Leap hype is over. Language like Abovitz uses might be good at raising money for a startup, but it does little to lure true developers into making a $2,300 purchase. It looks like Magic Leap One is a solid AR developer kit, but there’s nothing magical about these optics, and enthusiasm for AR shouldn’t come at the expense of a clear picture of the overall market. The perfect AR display remains elusive just as the perfect VR display is still only a concept. In the meantime, lots of people using the terms “XR” or “Mixed Reality” to describe the market as one single thing may be doing themselves and others a disservice by oversimplifying things too early.

What’s important to remember is that Mark Zuckerberg’s 2014 acquisition of Oculus has already seen the release of three completely different consumer VR headsets in an attempt to zero in on a gadget that is compelling to both developers and buyers. We expect that gadget to arrive in Q1 2019, precisely when Magic Leap should be thinking about how it can raise more money to build successive generations with a better field of view and a better outdoor tracking system. Facebook has the personal information of billions of people to target the sale of ads against, helping to fund the development of next generation technology. Even with its billions raised so far, though, I’m doubtful Magic Leap has the cash necessary to develop a headset delivering on the true promise of “mixed reality”.

While Magic Leap is delivering one of the first headsets on the market with eye-tracking built into every device, it won’t be the last. Eye-tracking is looking like it will be critical to major advances in AR and VR optics, cost and usability, but it also comes with enormous privacy risks.

Watch Oculus Detail Its Varifocal Half-Dome VR Prototype Here

Last month, we reported on a talk by Douglas Lanman of Oculus Research (now Facebook’s Reality Labs) from the 2018 SID Display Week event. The fascinating 40-minute session showcased some of the team’s latest work in VR hardware. Now you can watch that talk for yourself.

Below is a direct feed video of Lanman’s talk, released this week. Over the course of the session, he talks about Oculus’ work with Varifocal Displays and eye-tracking to create even more immersive VR experiences than what can be seen now with the Oculus Rift. Around midway through, Lanman starts sharing videos of prototypes created over the past few years in the Research Labs.

It makes for a pretty fascinating watch and, eventually, leads to the Half-Dome prototype that Facebook first teased at its F8 conference last month. This new device has a massive 140 degree field of view with varifocal optics and more. We don’t know when/if we’ll see the Half-Dome released as a true successor to the Oculus Rift but hopefully we’ll have more to talk about at this year’s Oculus Connect developer conference. That takes place this September.

Facebook Explains Why It Engineered The Half Dome Varifocal VR Headset

At Display Week in Los Angeles Facebook revealed why its researchers and engineers built a varifocal VR headset.

Over the last few years researchers at Facebook’s Reality Labs (formerly known as Oculus Research) developed a series of prototypes designed to solve a fundamental problem facing current VR headset design. The event opened Tuesday with a keynote by Douglas Lanman, who leads the Computational Imaging Team at FRL, which developed the prototypes alongside eye-tracking systems developed by Rob Cavin and Alex Fix and wide-field-of-view optics developed by a team led by Jacques Gollier.

The work was first revealed as the Half Dome prototype at Facebook’s recent developer conference, but the presentation during Display Week went much deeper as part of a symposium put together by the world’s preeminent researchers and engineers in display technology. Lanman used the event to explain how and why Facebook engineered this system with moving displays over multiple generations. It started with a loud monstrosity but was eventually engineered into what we see in Half Dome.

The headset actually moves the displays to match the positioning of your eyeballs, and could help with the vergence-accommodation conflict plaguing VR headsets today. In virtually all consumer VR headsets, the lenses make your eyes focus far away. When objects appear near, there’s a conflict between where the lenses of the headset are focusing your eyes and where they naturally want to focus. This can cause eyestrain and limits how long some people want to wear a headset.

Here’s how Lanman described the issue in an interview with UploadVR:

“Nearly all consumer HMDs present a single fixed focus. Some have focus knobs, but most just lock the optical focus of the displays to something around two meters. When you look at a near object, vergence (eye rotation) and accommodation (deformation of the eye’s crystalline lens) move together. As your lens deforms to focus on a nearby virtual object, it is focusing away from the fixed focus of the HMD. So, most people report seeing some blur. Sustained vergence-accommodation conflict has been linked, in prior vision science publications, to visual fatigue, including eye strain.”

Exploring the kinds of displays to solve this problem is a “daunting engineering challenge,” according to Lanman, so the “science community is only beginning to investigate.”

“In terms of visual clarity of near objects, varifocal displays have proven beneficial in our experience, as well as according to prior publications,” Lanman said.

In our interview, Lanman offered the first hint of what it feels like to try the Half Dome system.

“In a quiet room I don’t hear the screens or feel them moving,” Lanman said. “These are still feature prototypes, so the engineering isn’t completely perfected.”

An Oculus spokesperson declined to say when developers or journalists might be able to test the system. (Oculus has in the past presented new dev kits for testing at its developer conference late in the year.)

“We may never see these specific technologies in a product,” spokesman Brandon Boone wrote in an email. “Not ruling it out forever, but for now, we aren’t showcasing it any further than we currently have.”

Why Varifocal VR Matters

This chart from Lanman’s presentation, sourced to Matsuda et al., shows different types of VR headsets with focus support.

Virtual worlds that keep everything at a distance already look great, and Half Dome’s moving displays wouldn’t do much to improve that, as Lanman explains. But experiences that have lots of interactions within arm’s reach (like many of the best VR experiences) could be made to appear far more detailed (less blurry) with a varifocal display helping to naturally focus your eyes in the near field. Lanman notes, however, that some readers over the age of about 45 may not have “noticed an issue with clarity of near objects.”

“This may be due to the viewer beginning to experience presbyopia,” Lanman said. “Wherein they can no longer focus on close objects without the aid of bifocals, trifocals, progressives, or some other augmentation.”

The work presented this week by Facebook Reality Labs joins other major advancements, like an 18.1 megapixel display developed by LG and Google that could one day make it harder to distinguish real from virtual in a headset. Japan Display Inc. and Samsung also presented super high resolution displays tuned for VR. Taken altogether, the work makes clear that some of the world’s largest technology companies remain committed to developing much better VR headsets.

Lanman said his career is “driven by the goal of making displays indistinguishable from reality.”

“With AR/VR, I believe there is a unique opportunity to make a beyond-theatrical experience, where expansive views, close objects, and interaction are all possible,” Lanman said. “With this, I expect AR/VR will evolve to realize an infinite canvas for unlimited stories.”

Also be sure to read the Oculus blog post about their Half Dome efforts.

Oculus on Half Dome Prototype: ‘don’t expect to see everything in a product anytime soon’

At Facebook’s F8 developer conference Oculus revealed a glimpse at an intriguing new headset prototype dubbed ‘Half Dome’. Including a 140 degree field of view, varifocal displays, and what appears to be eye-tracking, the prototype is a tantalizing peek at the company’s research and what may lie ahead—just don’t expect everything we saw “anytime soon,” says Oculus co-founder and Head of Rift Nate Mitchell.

Besides the fact that Oculus is undoubtedly working on a second flagship PC VR headset, nothing is known about it thus far. And derailing the hypetrain somewhat, Mitchell took to Reddit to address comments reeling from the prospect of Half Dome’s technology making its way into a potential Rift 2.

Image courtesy Facebook

“Seriously, a varifocal display?” writes Reddit user ‘DarthBuzzard’. “I honestly expected that to be CV3 and CV2 would have simulated depth of focus rather than full depth of focus. Looks like things really are moving faster than expected!”

In response, Mitchell had this to say:

“[Maria Fernandez Guajardo, Head of Product Management, Core Tech at Oculus] covered a bunch of areas of long term research for us. This is just a peek into some feature prototypes we’ve been working on. However, don’t expect to see all of these technologies in a product anytime soon.”

While this doesn’t entirely negate a prospective Rift 2 with varifocal displays, a 140 degree field of view, and eye-tracking (or any combination of the three), being able to productize all of these things into a single headset will likely take time to get right.

VR headsets are ideally robust devices built to withstand daily abuse from their owners, and varifocal displays, which physically shift to accommodate a wider range of focus, introduce a number of parts that are constantly moving in tandem with the user’s gaze. These parts undoubtedly also complicate manufacturing and increase the overall cost of the device.

 

Eye-tracking, however, is something that is both physically robust and probably much cheaper for Oculus to implement, considering last year’s acquisition of Eye Tribe, a Denmark-based eye-tracking startup which advertised “the world’s most affordable eye tracker” as far back as 2013.

As for the wider field of view: it’s still uncertain if the varifocal displays were a key technology in obtaining the 140 degree FOV, although Fernandez Guajardo stated at F8 that the company’s “continued innovation in lenses has allowed [Oculus] to pack all of this technology and still keep the Rift form-factor.” One of the images shown at F8 does show a much larger pair of supposed Fresnel lenses—so not a stark impossibility either.

Image courtesy Facebook

At GDC last year, Head of Oculus PC VR Brendan Iribe stated that Rift will remain the company’s flagship VR headset for “at least the next two years.” Parsing Iribe’s statement somewhat, that puts a potential Rift 2 launching sometime in 2019 at its earliest.

We hope to see more at Oculus Connect 5 which should be sometime in Fall 2018.

Oculus Demos ‘Half-Dome’ Rift Prototype With 140-degree FOV And Moving Varifocal Screens

During the second day of Facebook’s F8 developer conference, Oculus demonstrated what appears to be a future version of its Rift VR headset, internally called “Half-Dome.” While preserving the form factor and weight of the current Rift, the fully functional Half-Dome prototype includes several major hardware innovations designed to increase visual immersion and comfort.

Above: Oculus Half-Dome.

The first and most amazing technology is Varifocal, a mechanical system that actually moves the screens within the headset depending on what you’re looking at, mimicking your eyes’ ability to focus on nearby objects. Oculus’ Maria Fernandez Guajardo noted that until now, the VR industry has had to place objects at a minimum two-meter distance to prevent users from having eye-focusing issues; Varifocal solves this, enabling you to read a note or examine an object in your hands. The feature uses an optimized mechanical design with no noticeable noise or vibrations.
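
For a sense of the optics those moving screens have to satisfy, here’s a small Python sketch using the thin-lens relation to compute how far the display must sit from the lens to place the virtual image at a given focus depth. The 40 mm focal length is an assumption for illustration, not the Rift’s actual optics.

```python
def display_distance_mm(virtual_image_depth_m, lens_focal_length_mm=40.0):
    """Thin-lens relation for a magnifier: with the virtual image at depth D on
    the viewer's side of the lens, the display must sit at d = 1 / (1/f + 1/D)
    from the lens, always just inside the focal length."""
    f = lens_focal_length_mm
    D = virtual_image_depth_m * 1000.0   # virtual image distance in mm
    return 1.0 / (1.0 / f + 1.0 / D)

for depth in (0.3, 0.5, 1.0, 2.0):
    print(f"focus at {depth} m -> display {display_distance_mm(depth):.1f} mm from lens")
# Sweeping focus from 0.3 m out to 2 m corresponds to only a few millimeters of
# display travel, which is the range the varifocal actuators have to cover,
# silently and in step with the user's gaze.
```

The small travel range is what makes a quiet, vibration-free mechanical design plausible in the first place.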

Above: Varifocal moving screens inside of Half-Dome.

Another tweak is an enhancement to the standard field of view offered by traditional VR headsets, which is around 100 degrees compared with the average person’s 210-220-degree real-world field of view. Half-Dome has a 140-degree field of view, a nice step up that increases your peripheral vision, enabling you to see — though not focus on — objects at the edges of your vision. Oculus also suggested that improvements in display resolution would naturally follow due to the evolution of panel technologies.

Alongside the new hardware, Oculus also showed off software realism improvements for everything from hands to avatar face and body rendering, as well as tricks to reproduce real-world 3D environments in VR. Since the first thing most players see in VR is their hands, Oculus is using a new system of “deep marker labeling” to pinpoint joint positions on hands, track them accurately using 2D cameras, then represent them convincingly in 3D using AI solutions.

“The aim is to turn these marker positions into labeled hand poses that we will use to train our models,” said Guajardo, “but labeling hands is particularly hard for complex interactions.” Regardless, demonstrations showed individual fingers moving smoothly and hands manipulating a music box, all convincingly. Other demos showed head and mouth tracking that enabled photorealistic avatar head motions for person-to-person communications, including reasonably lip-synched speech, and the evolution of avatar bodies into increasingly complex forms. Facebook CTO Mike Schroepfer said that the company is working on projects “to bring all of the body into VR.”

Above: Facebook’s Point Cloud technology for vaguely recreating rooms in 3D.

Environmental reconstruction is another big area of Facebook and Oculus development. On day one of F8, the companies showed off software that stitches photos and videos together into “point cloud” reconstructions of rooms. This software works with whatever existing reference images or videos you have of a space, but the VR-explorable product is deliberately hazy, like a moving 3D pointillist painting.

Oculus also showed a more sophisticated second system that uses a burst of stereoscopic images to create a photorealistic panorama in 3D — one that can be enjoyed in VR. This system uses image pairs to gather depth information, blending them together to create super-detailed 3D locations. Guajardo suggested that users’ “most evocative places” are their homes, parents’ homes, and favorite vacation spots, all of which can be recreated with this software without the need for professional equipment or artistic talent.
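
The depth-from-image-pairs step rests on the classic stereo relation, sketched below in Python. The focal length and baseline are placeholder values, not those of Facebook’s capture pipeline.

```python
def depth_from_disparity_m(disparity_px, focal_length_px=1400.0, baseline_m=0.065):
    """Classic stereo relation: depth = f * B / disparity. The larger the
    disparity between the image pair, the closer the point is to the camera."""
    if disparity_px <= 0:
        return float("inf")
    return focal_length_px * baseline_m / disparity_px

for d in (10, 30, 90):
    print(f"{d} px disparity -> {depth_from_disparity_m(d):.2f} m")
# 10 px -> ~9.1 m, 90 px -> ~1.0 m: nearby furniture produces large disparities,
# which is what lets a burst of image pairs recover fine 3D detail up close.
```

Blending many such depth estimates from a burst of stereo pairs is what lets the system reproduce a room with photorealistic detail rather than a hazy point cloud.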

Above: Maria Fernandez Guajardo discusses Oculus’ photorealistic 3D VR room recreation technologies.

A demo of the second system showed a real room in a home next to a virtual 3D reconstruction. The real and VR spaces looked almost identical — so similar that you could only tell the difference in fine texture details, and the subtle presence of the videographer’s shoe in the original. Mirror reflections and other elements of the original scene were preserved.

Above: Maria Fernandez Guajardo discusses Oculus’ photographic burst technology for creating photorealistic 3D panoramas.

Schroepfer said that putting all of the innovations together demonstrates why Facebook is so excited about VR, and why it will become the only “way people will want to connect over long distances.” No dates were given for the release of the new software innovations or the Half-Dome successor to the Rift.

This post by Jeremy Horwitz originally appeared on VentureBeat.
