Hands-On With Kura’s Breakthrough Wide Field Of View AR Technology

San Francisco-based startup Kura Technologies claims it will launch compact AR glasses with a wide field of view, high resolution, high opacity, high brightness, and variable focus in mid-2020. We got the chance to try four different prototypes from Kura, each demonstrating a portion of these promises. We came away very impressed.

The nature of the demos we tried makes it hard to say what the actual finished device will be like, but we’re optimistic. None of the demos ran on a product resembling the mock-up images on Kura’s website, and all four prototypes were described as being 8-12 months behind where the technology is now. You can read more about those demos deeper into this story.

Photographs and videos of the hardware were not allowed during my meeting — only workspace photos like the ones shown below.

Photos of Kura’s workspace and office in San Francisco.

We’re told Kura intends to bring to CES in January a prototype that combines all of this functionality in a single device. However, it’s worth noting that the device they plan to ship in mid-2020 is specifically targeting enterprise customers first.

If they can pull it off — and that remains a big if — this startup will have created a product with specifications years ahead of all known public competitors, including Microsoft, Magic Leap, and Nreal.


The Kura Gallium

To understand why what they’re doing seems significant, let’s take a step back. Kura’s product is called Gallium. Kura describes Gallium as having “eyeglass form factor”, yet the claimed specifications are far beyond those of even the largest, bulkiest known AR headsets.

According to marketing materials, the glasses are powered by a hip-mounted compute pack built around a Snapdragon 855, similar to how Magic Leap and Nreal work. But when we spoke with CEO Kelly Peng, she told us the initial version will likely be tethered to a PC, with the compute pack offered as a secondary option later. Further in the future, an adapter could allow wireless communication with a PC within range.

Kura’s website lists the glasses with the compute pack at $1,199, a Lite version without the compute pack at $899, and the compute pack on its own at $399.


Claimed Specifications

  • Field of View: 150° (Binocular, Diagonal)
  • Focus: 10 cm to infinity
  • Brightness: 4000 nits (outdoor viewable)
  • Max Transparency: 95%
  • Resolution: 8K 75Hz / 6K 100Hz / 4K 144Hz
  • Image Quality: 100% DCI-P3, HDR, True Black Capable
  • IPD: Automatic accommodation of 55-68mm
  • Weight: 80 grams


These specifications would put Gallium significantly ahead of any known AR headset. HoloLens 2 and Magic Leap One top out at a diagonal field of view of roughly 50 degrees. Magic Leap One is not bright enough to be used outdoors, and even HoloLens 2 has just a quarter of Gallium’s claimed brightness.

Crucially, Magic Leap One supports just two focal planes, and HoloLens 2 is fixed focus. With automatic IPD accommodation and varifocal optics spanning 10cm to infinity, Gallium would be visually comfortable to wear all day.
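To put those numbers in perspective, here’s a quick back-of-envelope calculation of our own (not Kura’s) converting the claimed 10cm-to-infinity range into diopters, the standard unit of focal power:

```python
# Back-of-envelope: the claimed varifocal range expressed in diopters.
# Focal power in diopters = 1 / focal distance in meters; infinity is 0 D.

def diopters(distance_m: float) -> float:
    """Focal power needed to focus at a given distance."""
    return 1.0 / distance_m

near = diopters(0.10)   # 10 cm near limit -> 10.0 D
far = 0.0               # infinity -> 0.0 D

print(f"Claimed varifocal range: {far:.1f} D to {near:.1f} D")
# A fixed-focus headset sits at a single point in this range, so
# anything much nearer or farther than that point appears defocused.
```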

If the company truly is achieving all this in an 80g pair of glasses, it would likely accelerate the arrival of consumer AR by years. But the magnitude of these claims should be met with skepticism — even after we tried many of these features in person.


The Prototype Demos

During our visit to Kura we tried out four different prototype demos and spoke with CEO Kelly Peng for nearly an hour. Our tour of the Kura workplace, which doubles as a home for several of the core team members, was eye-opening (pun intended) to say the least.

Three of the four demos did not have head tracking and were not on wearable devices. Instead, they were mounted on tables, completely stationary, to show off the display, optics, field of view, brightness, etc. in controlled environments. This is common for early head-mounted technology prototypes.

The first demo, which was stationary on a mount, was a great example of their optics technology: 90%+ transparent lenses combined with high-brightness images made objects genuinely look like they were in the real environment, rather than suffering from the faded, blurry effect you get in a lot of current AR devices. The models shown were far brighter than anything I’d seen in an AR device before, with great focal clarity. As a glasses wearer, the quality of the image and the field of view was really encouraging to see at such an early stage of development.

The second demo was also stationary, but this one had an even larger field of view and showed a larger range of animations and types of content. Most of the animations shown in the demo video embedded above were shown during this demo, and they looked about as crisp as you could hope for in an AR device. Again, I was pretty impressed. This demo also included hand tracking so I could reach out and see my hand moving around. There wasn’t any interaction here but it did show a wide range of colors.

Next up, in demo number three, I tried a fully wearable early prototype, similar to the one pictured below, with head and hand tracking operational. Again, this was said to be at least eight months old. It was another good demonstration of the field of view: when my hand reached into the view of the cameras and lenses, an augmented overlay was applied to my skin and I could interact with all of the floating multi-colored particles. I could reach my right arm across my body and see the AR overlay from my fingertips all the way down to my elbow, rather than through the postage stamp-sized view box of other AR devices. The overlay trailed behind and re-aligned itself when I moved my arm — but again, early prototypes and all that.

Finally, the last demo was the roughest and most experimental of the bunch. One of the Kura Gallium’s touted features is a focal distance that can adapt from 10cm all the way to infinity. In this demo I saw a green, matrix-style animation of a cat floating in front of my eyes, almost large enough to look life-sized, and then it slowly shrank and faded into the distance like it was being shot into space, Bag Raiders style. The trick, though, is that I could still see it clearly even as it drifted away. It never lost focus.

An older Kura prototype using Occipital tracking hardware.

Kura appears to be using Occipital’s technology for positional tracking and scene reconstruction. This lets Kura focus their resources on the display technology. An Occipital video from November 2018 appears to show footage of an old Kura prototype. This prototype has a form factor significantly bulkier than the images shown today on Kura’s website and it’s very similar to one of the prototypes we tried — specifically the third one mentioned above.

Prior to our demo we reached out to Peng about this prototype, and she confirmed on Twitter: “This is purely software or integration test demo we built early on, not the optics we are going for product. We used to make some giant reflectors optics 3-4 years ago, but since then totally moved away from that because of the size, contrast ratio, brightness issues.”

“Structured Geometric Waveguide”

How exactly is Kura achieving this?

Before this announcement, no credible company had claimed specifications anywhere near these. As recently as October 2018, Facebook’s chief AR and VR researcher Michael Abrash stated that the technology to enable compact wide FoV AR glasses “doesn’t yet exist”.

Almost all AR headsets today, including HoloLens 2 and Magic Leap One, use a diffractive waveguide. This technology has a fundamental limitation on field of view, and can make the real world appear dull due to the semitransparent nature of the see-through optics. This is all despite both products having a larger form factor and higher price than Gallium.


Kura claims their breakthrough is to use a microLED strip with a “structured geometric waveguide” as the combiner. While microLED displays are normally expensive and efforts to affordably mass-produce them are ongoing, Kura’s design would need only a single row of pixels, which would allow for low-cost mass production.

The company describes this as follows:

“Like in a diffractive waveguide, light is coupled down the eyepiece via total internal reflection, but unlike a diffractive system, the structures in the eyepiece are explicitly much larger than a wavelength, which prevents colored ghosts in ambient light. Furthermore, the out-coupling elements are ordinary geometric optics, not holograms, which mitigates the narrow angle of acceptance from which diffractive elements suffer from. In addition, a careful multi-layer design allows the out-coupling elements to cover about 5% of the eyepiece’s area, allowing us to maintain very high transparency.”
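As a rough sanity check, our own arithmetic (not Kura’s) shows the quoted 5% out-coupler coverage lines up with the claimed 95% maximum transparency, and total internal reflection in an ordinary glass or plastic eyepiece holds at fairly shallow angles:

```python
import math

# Our own sanity check on the quoted numbers (not Kura's math).

# If out-coupling elements cover ~5% of the eyepiece and the rest passes
# ambient light, transparency is roughly the uncovered fraction.
outcoupler_coverage = 0.05
print(f"Approximate transparency: {1 - outcoupler_coverage:.0%}")  # 95%

# Light stays coupled in the eyepiece by total internal reflection as
# long as it strikes the surface beyond the critical angle.
n = 1.5  # typical refractive index for optical glass or plastic
critical_angle = math.degrees(math.asin(1.0 / n))
print(f"Critical angle for TIR: {critical_angle:.1f} degrees")  # ~41.8
```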

It is possible Kura Technologies has invented the missing display technology needed to make mass-market AR glasses achievable. Again, though, it is hard to confirm exactly how the Gallium works without seeing all the pieces put together into a finalized product design. It is not uncommon for technologies to be possible and impressive in the prototype phase but never work out as a true product, due to issues such as manufacturing difficulty or the expense of producing hardware at scale.

But since we’ve seen the pieces all functioning, albeit mostly separately at this stage, we’re optimistic enough to say Kura seems to be one of the key companies to keep an eye on in the AR space.


Kura’s Team

Kura’s CEO Kelly Peng is on the Forbes 30 Under 30 list for Manufacturing & Industry. At UC Berkeley, Peng says she worked on custom LiDAR designs for self-driving vehicles. CTO Bayley Wang researched high-performance optical simulation algorithms at MIT and won a major North American undergraduate mathematics competition. COO Garrow Geer is said to have been a particle accelerator operator and research engineer at Jefferson Lab and CERN.

Kura also tells us it has hired GoPro’s former lead electrical engineer; the designer of the electronics in the Xbox controller; “one of the most reputable optical experts in the world”; experts with over 100 patents in optics, displays, and materials and several decades of combined experience in optical design; and industry leaders with over 20 years of experience in AR/VR manufacturing and sales.

The company describes its team as “industry leaders, brilliant technologists and experienced subject-specific experts, with MIT, UC Berkeley, Stanford, EPFL & UBC alumni.”


Is Facebook Doing This Too?

Facebook, the company behind Oculus, is also developing AR glasses. While the company has not revealed any specifics on what display technology it is using, it did give several hints at Oculus Connect 5 in 2018.

When talking about display technologies, the company’s chief researcher Michael Abrash stated that waveguides “could potentially extend to any desired field of view in a slim form factor”. On screen, a graphic showed waveguides allowing for up to a 200-degree field of view.

He also noted that since no suitable display technology existed yet for AR, “we had no choice but to develop a new display system”.

At the time, this confused some optics experts, as the well-known limitations of diffractive waveguides cap their practical field of view at around 50 degrees. Abrash’s description of waveguides did not match any known designs.

Given Abrash’s comments, Facebook’s large investment in AR research, and the company’s hiring of renowned display technology experts like Douglas Lanman, it is possible Abrash was referring to a system similar to what Kura is working on — a non-diffractive waveguide using geometric optics.

We’ll keep a close eye on Oculus Connect 6 for any details on Facebook’s approach to AR optics and how it compares to what we’ve seen of Kura. 


Stay tuned to UploadVR for more details on Kura, including our in-depth interview with CEO Kelly Peng next week.


This article about Kura is co-authored by Staff Writer David Heaney, who did the background research and wrote the first draft, Senior Editor David Jagneaux, who provided the hands-on impressions and additional details, and Managing Editor Ian Hamilton, who provided editing.

Editor’s Note: We added clarification that Kura is targeting enterprise customers first.


Oculus Insight: Facebook Details Quest’s Inside Out Tracking System

Facebook provided background on the development of the inside out positional tracking technology which enables both Oculus Quest and Rift S to operate without any external cameras.

A pair of blog posts published today by Facebook explain how a team spread across the company’s VR development labs in Zurich, Menlo Park, and Seattle built the technology.

Facebook employees installed OptiTrack cameras in their own homes to test VR tracking in a variety of conditions that could be used as the basis to improve Oculus Quest’s tracking system.

According to Facebook, the company used OptiTrack cameras and “by comparing the measurements recorded with the OptiTrack cameras with the data from Oculus Insight, the engineers were able to fine-tune the system’s computer vision algorithms so they would be accurate within a millimeter.” Employees tested out the cameras in their own work spaces and homes to recreate a variety of conditions in which a headset like Quest might be used.

Here’s how Facebook described the process:

“The OptiTrack systems would track the illuminators placed on participants’ HMDs and controllers and throughout each testing environment. This allowed us to compute the exact ground-truth 3D position of the Quest and Rift S users and then compare those measurements to where Oculus Insight’s positional tracking algorithm thought they were. We then tuned that algorithm based on the potential discrepancies in motion-capture and positional data, improving the system by testing in hundreds of environments that featured different lighting, decorations, and room sizes, all of which can impact the accuracy of Oculus Insight.”

“In addition to using these physical testing environments, we also developed automated systems that replayed thousands of hours of recorded video data and flagged any changes in the system performance while viewing a given video sequence. And because Quest uses a mobile chipset, we built a model that simulates the performance of mobile devices while running on a general server computer, such as the machines in Facebook data centers. This enabled us to conduct large-scale replays with results that were representative of Quest’s actual performance, improving Insight’s algorithms within the constraints of the HMD it would have to operate on.”
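As a concrete illustration, here is a minimal sketch of the kind of ground-truth comparison Facebook describes, assuming two time-synchronized position streams; a real pipeline would also align coordinate frames and evaluate orientation error:

```python
import numpy as np

# Minimal sketch of comparing SLAM output against motion-capture ground
# truth. Assumes both streams are time-synchronized Nx3 position arrays;
# synthetic data stands in for OptiTrack and Insight trajectories.

def positional_error_mm(ground_truth: np.ndarray, estimate: np.ndarray):
    """Per-frame Euclidean error in millimeters, plus the RMSE."""
    errors = np.linalg.norm(ground_truth - estimate, axis=1) * 1000.0
    rmse = float(np.sqrt(np.mean(errors ** 2)))
    return errors, rmse

rng = np.random.default_rng(0)
truth = rng.uniform(-1.0, 1.0, size=(1000, 3))                 # meters
estimate = truth + rng.normal(scale=0.0005, size=truth.shape)  # ~0.5 mm noise

errors, rmse = positional_error_mm(truth, estimate)
print(f"RMSE: {rmse:.2f} mm, worst frame: {errors.max():.2f} mm")
```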

Facebook says that to get this same Simultaneous Localization And Mapping (SLAM) technology to work inside slimmer AR glasses, they’ll have to figure out how to reduce latency further while “cutting power consumption down to as little as 2 percent of what’s needed for SLAM on an HMD.”

Here’s a video detailing how the technology works:


Facebook Patents Reveal Deep Research On True Haptic VR Gloves

Multiple patents awarded to Facebook this year suggest the company is researching a range of technologies which could enable force feedback gloves for VR.

Gloves may be the ultimate goal for VR input. The term “haptic gloves” usually refers to finger tracking gloves with vibration motors on each finger. Force feedback gloves, though, go further by restricting the movement of fingers in response to a simulated object or surface.

Facebook previously showed research involving a haptic glove, but not force feedback gloves. These three patents, however, provide insight into some of the company’s research.

Note these are not applications, but actual awarded patents.

Microfluidics

In March of this year, Facebook was awarded a patent titled Switchable fluidic device.


The patent describes a glove with “soft materials that use millimeter or smaller channels filled with fluid“. By controlling the flow of fluid through these tiny channels, the system adapts the pressure it applies to the finger joints.

When the pressure is high, it “prevents or enables a physical movement of a portion of a user in contact with the virtual object in the virtual space”, according to the patent description. “For example, if a user’s finger is in contact with a virtual object (e.g., a virtual wall) in a virtual space, the haptic assembly prevents a physical movement of the user finger to move in a direction through the virtual object in the virtual space. Accordingly, the user can receive a perception of contacting the virtual object.”

One “embodiment” of the gloves is described as covered with infrared LEDs for positional tracking from cameras, just like an Oculus Touch controller.
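Here is a toy sketch of the control logic the patent implies, with every name and number assumed by us rather than taken from the filing: when a tracked fingertip penetrates a virtual surface, the corresponding channel’s pressure is raised so the joint resists bending further.

```python
from dataclasses import dataclass

# Toy sketch of the microfluidic control idea; all names and pressure
# values here are our own assumptions, not figures from the patent.

@dataclass
class FingerChannel:
    name: str
    pressure_kpa: float = 0.0

FREE_PRESSURE = 0.0      # finger moves freely
LOCKED_PRESSURE = 80.0   # channel stiffens, resisting joint movement

def update_channel(channel: FingerChannel, penetration_mm: float) -> None:
    """Stiffen the channel while the fingertip is inside a virtual object."""
    channel.pressure_kpa = (
        LOCKED_PRESSURE if penetration_mm > 0.0 else FREE_PRESSURE
    )

index = FingerChannel("index")
update_channel(index, penetration_mm=2.0)  # pressing into a virtual wall
print(index)  # FingerChannel(name='index', pressure_kpa=80.0)
```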

Hard/Soft Touch Simulation

In June, Facebook was awarded another patent related to haptic gloves, titled Haptic devices that simulate rigidity of virtual objects.


Almost all gloves to date use vibration motors to mimic the feeling of touching virtual objects; however, this is not particularly realistic. The glove in the patent instead uses an array of plates which dynamically actuate to touch the user’s finger with a force simulating the object the user’s finger is touching.

When pressing against a hard virtual object such as a button, the plates would resist the pressure of the finger. When pressing a less rigid virtual object the plates can “give” much more easily to pressure, providing the feeling of softness.
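One way to picture this is a simple spring model, our illustration rather than anything from the patent: the plate’s resisting force grows with how far the finger presses in, scaled by a stiffness constant chosen per virtual material.

```python
# Spring-model sketch of adaptive rigidity; the materials and stiffness
# constants are our own illustrative assumptions, not the patent's.

STIFFNESS_N_PER_MM = {
    "steel button": 50.0,  # barely gives
    "rubber pad": 5.0,     # gives noticeably
    "foam ball": 0.5,      # gives easily
}

def plate_force(material: str, penetration_mm: float) -> float:
    """Hooke's-law force (newtons) the plate pushes back with."""
    return STIFFNESS_N_PER_MM[material] * penetration_mm

for material in STIFFNESS_N_PER_MM:
    print(f"{material}: {plate_force(material, 2.0):.1f} N at 2 mm")
```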

Pneumatic Bladder

Last week, Facebook was awarded yet another patent. This time the idea covers an alternative to microfluidics for force feedback: Pneumatically controlled haptic mechanisms with nested internal structures for haptic feedback.


The glove is covered in an array of stacked “pods” which each contain a pneumatic “bladder” made of “a durable, puncture resistance material, such as thermoplastic polyurethane (TPU)”. By adding and removing air from each bladder, the pressure against the user’s hand at that position can be altered: “Depending on a posture of the user’s finger when the pressure inside the bladder is increased, the user may experience his or her finger becoming stiff and rigid, bending downwards, or bending upwards (e.g., pushing and pulling sensations).”

This pneumatic technology, because of the higher pressure it can generate, could simulate more intense events, even pushing back on the fingers rather than simply restricting them. The patent describes how this could be used in VR games:

just prior to releasing an arrow from a bow in real life, a tremendous force is applied to the pads of the fingers drawing the bow string. Therefore, in virtual reality, the haptic stimulation created by the wearable device would need to be intense to provide some realism to the virtual reality experience (e.g., one or more pods on each string-contacting finger push against the string-contacting fingers and attempt to straighten these fingers, as would the bow string in real life)”


While such a system may not be reliable for hours of usage per week, the patent describes designing with this in mind: “Due to the ever-changing nature of virtual and augmented reality, the pods may be required to transition between the two states hundreds, or perhaps thousands of times, during a single use. Thus, the pods described herein are durable and designed to quickly transition from state to state.”
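The force a pod can exert is plain pressure-times-area physics. Here is a quick illustrative calculation with numbers of our own choosing (the patent doesn’t give figures):

```python
# Pressure-times-area arithmetic; both numbers are our own assumptions.

pod_area_m2 = 1.0e-4   # assume ~1 cm^2 of bladder contacting the finger
pressure_pa = 50_000   # assume 50 kPa of gauge pressure in the bladder

force_n = pressure_pa * pod_area_m2
print(f"Force on finger: {force_n:.1f} N")  # 5.0 N -- a firm push

# Doubling either the pressure or the contact area doubles the force,
# which is why pneumatics can simulate intense events like a drawn bow.
```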

How Far Away Is All This?

At Oculus Connect 3 in 2016, Oculus Chief Scientist Michael Abrash made a series of predictions about the future of VR technology. Among these was the claim that Touch-like VR controllers would be the state of the art “for at least 5 years”, and “maybe” much longer. Abrash postulated that controllers could potentially be “the mouse of VR”, and remain so even “40 years from now”. He followed this by saying:

The only thing I can see replacing Touch-like controllers is the ability to use your hands as direct physical manipulators as you do in the real world, and I don’t see that happening in the next 5 years because it requires haptic and kinematic technology that isn’t even on the distant horizon.

At Oculus Connect 5 in 2018, however, when revisiting these predictions, Abrash changed his outlook:

“I still don’t think it’ll happen in the next 4 years, but something interesting may in fact be on the distant horizon. […] I believe we’ll have useful haptic hands in some form within 10 years”

It is possible research related to these patented ideas — or perhaps some similar research underway at Facebook — may have informed Abrash’s updated guidance. Of course, organizations of Facebook’s size file patents all the time on lots of ideas that never see actual use.

Facebook Reality Lab, the company’s VR/AR research division led by Abrash, has been on a hiring spree for years now. In Facebook earnings calls, the company has mentioned increasing investment in VR/AR research.


SIGGRAPH 2019: Foveated AR With Prescriptions And A Physical Tail

Computer graphics conference SIGGRAPH in Los Angeles featured a series of VR and AR-related projects premiering at the event.

New optical designs for AR headsets from NVIDIA at SIGGRAPH 2019.

The long-running conference serves as a yearly showcase of research, tools and art focused around the field of computer graphics. SIGGRAPH was held from July 28 to Aug. 1 at the Los Angeles Convention Center with an immersive pavilion featuring an arcade, museum and a village of mixed reality installations. There’s also exhibition space and presentation areas where emerging technologies are discussed and shared, as well as a VR Theater hosted as part of the event.

We saw A Kite’s Tale from long-time Disney visual effects artist and VR enthusiast Bruce Wright premiering in the theater alongside other new productions like Doctor Who: The Runaway. There’s more exploratory research at SIGGRAPH too and art-focused works from students and universities.

You can check out our 39-minute walk through the conference here:

Here are a few projects that caught our eye at SIGGRAPH 2019:

Prescription and Foveated AR

NVIDIA researchers presented two kinds of AR displays at SIGGRAPH 2019. One display accounts for the wearer’s prescription with its optics and the other moves elements in connection with gaze tracking.

The Prescription AR system is “a 5mm-thick prescription-embedded AR display based on a free-form image combiner,” according to the abstract. “A plastic prescription lens corrects viewer’s vision while a half-mirror-coated free-form image combiner delivers an augmented image located at the fixed focal depth (1 m).”

The Foveated AR system is “a near-eye AR display with resolution and focal depth dynamically driven by gaze tracking. The display combines a traveling microdisplay relayed off a concave half-mirror magnifier for the high-resolution foveal region, with a wide FOV peripheral display using a projector-based Maxwellian-view display whose nodal point is translated to follow the viewer’s pupil during eye movements using a traveling holographic optical element (HOE).”


The foveated system uses an “infrared camera” to track eye movement and drive the optics directly in front of the eyeball. “Our display supports accommodation cues by varying the focal depth of the microdisplay in the foveal region, and by rendering simulated defocus on the ‘always in focus’ scanning laser projector used for peripheral display.”
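Here is a simplified sketch of the gaze-driven logic the abstract describes, with all names and sizes assumed by us: the tracked gaze point steers where the high-resolution foveal inset sits and which focal depth the microdisplay adopts.

```python
# Simplified gaze-driven foveation sketch; the inset size and function
# names are our assumptions, not NVIDIA's implementation.

FOVEA_RADIUS_DEG = 5.0  # assume a ~10-degree-wide high-res inset

def steer_inset(gaze_deg: tuple, fixation_depth_m: float):
    """Return the inset's angular window and the focal depth to drive."""
    gx, gy = gaze_deg
    window = (gx - FOVEA_RADIUS_DEG, gx + FOVEA_RADIUS_DEG,
              gy - FOVEA_RADIUS_DEG, gy + FOVEA_RADIUS_DEG)
    # The peripheral projector stays "always in focus"; defocus there is
    # simulated in rendering, per the abstract.
    return window, fixation_depth_m

window, depth = steer_inset(gaze_deg=(12.0, -3.0), fixation_depth_m=0.75)
print(f"Foveal window (deg): {window}, microdisplay focus: {depth} m")
```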

Anthropomorphic Tail

Arque is eye-catching work from the Embodied Media Project at Keio University’s Graduate School of Media Design in Japan which proposes “an artificial biomimicry-inspired anthropomorphic tail to allow us to alter our body momentum for assistive, and haptic feedback applications.”

The tail’s structure is “driven by four pneumatic artificial muscles providing the actuation mechanism for the tail tip” and, according to the abstract for the project submitted to SIGGRAPH’s emerging technologies, it highlights what such a prosthetic tail could do “as an extension of human body to provide active momentum alteration in balancing situations, or as a device to alter body momentum for full-body haptic feedback scenarios.”

Ollie VR Animation Tool

Ollie is a new animation tool designed for intuitive creation in VR that was shown at SIGGRAPH.

We’ve seen tools like Quill, Tvori and Mindshow used for VR-based animations, but Ollie focuses on a notebook-like interface with “motion paths and keyframes visualized spatially, automatic easing, and automatic squash and stretch” meant to make it easier for first time animators to create something. The app’s creators tell me they should have the entire app running on Oculus Quest to show at SIGGRAPH.

LiquidMask

This project from Taipei Tech’s Department of Interaction Design pumps liquid into a VR headset to provide tactile sensations. “Filling liquid in the water pipe as a transmission interface, this system can simultaneously produce thermal changes and vibration responses on the face skin of the users,” according to the project’s description.

This post was originally published on July 16 and updated with videos from the conference on August 8.


Facebook Bets On Realistic Face Tracked Avatars As Key To VR’s Future


Dozens of people are working at Facebook Reality Lab in Pittsburgh, Pennsylvania on research into ultra-realistic avatars that could redefine communication in the 21st century.

I spoke by phone with Yaser Sheikh, a Carnegie Mellon University associate professor running VR research in the city for Facebook since 2015. Sheikh offered context on a technical blog post released this week which details the company’s research toward ultra-realistic avatars. I followed up with Facebook over email for clarification after the call.

“Facebook Reality Lab Pittsburgh can accommodate close to 100 people,” the email explains. “To accommodate the team’s growth, FRL Pittsburgh will be moving into a larger building in Pittsburgh later this year.”

That suggests a big investment toward the future of the team. The group is also not the only one at Facebook researching more realistic and personalized avatars. To be clear, Facebook says this technology is “years away” from being realized in consumer headsets.

Valve’s Mike Ambinder To Talk Brain-Computer Interfaces At GDC


A talk by Valve’s Mike Ambinder at GDC on March 22 will offer an overview of brain-computer interfaces.

If you are attending GDC and staying through Friday, the talk looks like it’ll be a worthwhile one. The talk is titled “Brain-Computer Interfaces: One Possible Future for How We Play” and happens at 10 am Pacific in Room 2010, West Hall. While the talk isn’t specifically about VR, we’ve also got a list of six GDC talks focused on VR/AR you should check out.

We’ll be curious to see what sort of research Ambinder, an experimental psychologist, will discuss. The featured image above is a Vive headset modified by Neurable that was said to read additional signals from the brain. Combined with eye tracking and other indicators, research into brain-computer interfaces could improve future VR headsets.

Videos from Game Developers Conference talks are usually available some time after the event. Organizers don’t allow audience livestreams but we’ll plan to attend and live tweet interesting pieces of information.

Here’s the official description of Ambinder’s talk:

While a speculative technology at the present time, advances in Brain-Computer Interface (BCI) research are beginning to shed light on how players may interact with games in the future. While current interaction patterns are restricted to interpretations of mouse, keyboard, gamepad, and gestural controls, future generations of interfaces may include the ability to interpret neurological signals in ways that promise quicker and more sensitive actions, much wider arrays of possible inputs, real-time adaptation of game state to a player’s internal state, and qualitatively different kinds of gameplay experiences. This talk covers both the near-term and long-term outlook of BCI research for the game industry but with an emphasis on how technologies stemming from this research can benefit developers in the present day.

Takeaway

Attendees should leave the talk with an understanding of the pros and cons of various lines of BCI research as well as an appreciation of the potential ways this work could change the way players interact with games in the future.

Intended Audience

This talk is geared towards anyone with an interest in interface or interaction design or who is curious about how game design may evolve as a consequence of access to physiological signals.



Facebook Files Patent For VR Finger Tracking AI Armband


Facebook filed a patent application for an armband which performs finger tracking by reading electrical signals inside the user’s wrist. Machine learning is used to convert these signals into finger positions.

Two variations of the device are described. In one, an active signal is sent through the wrist. Based on how the signal changes as it passes through the tendons and muscles of the arm, their positions can be determined.

In another variation, no active signal is sent; instead, the device directly reads the electrical impedance of the arm.
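To make the machine learning step concrete, here is a minimal sketch with every shape and name assumed by us: a least-squares linear map from multichannel wrist signals to finger joint angles. A real system would use a far richer model trained on real sensor data.

```python
import numpy as np

# Minimal sketch of learning a signal -> joint-angle mapping; channel
# and joint counts are assumptions, and synthetic data stands in for
# real wrist measurements.

N_CHANNELS = 16  # assumed electrode count around the wrist
N_JOINTS = 15    # assumed tracked joint angles (3 per finger)

rng = np.random.default_rng(42)
signals = rng.normal(size=(5000, N_CHANNELS))
true_map = rng.normal(size=(N_CHANNELS, N_JOINTS))
angles = signals @ true_map + rng.normal(scale=0.1, size=(5000, N_JOINTS))

# Least-squares fit of the mapping from signals to joint angles.
learned_map, *_ = np.linalg.lstsq(signals, angles, rcond=None)

new_reading = rng.normal(size=(1, N_CHANNELS))
predicted_angles = new_reading @ learned_map
print(predicted_angles.shape)  # (1, 15) joint angles for one wrist sample
```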

Facebook doesn’t seem to be the only company working on this interesting concept, however. New York-based CTRL-Labs posted a video on YouTube of a seemingly similar armband:

The ability to directly use each of your fingers adds an entirely new level of interactivity to VR. However, that ability is absent from almost all consumer VR headsets today. Leap Motion shipped a finger tracking kit for the Oculus DK2 all the way back in 2014, but the tracking quality left a lot to be desired.

Facebook is already heavily researching optical finger tracking. HTC announced finger tracking for the Vive Pro late last year, but that hasn’t shipped yet and the tracking quality is currently unknown.

If the device described in the patent truly works, it could bring finger tracking to VR without having to do power hungry processing on multiple cameras pointed at your fingers. Furthermore, since it doesn’t rely on cameras the tracking would work at all angles regardless of the headset’s orientation. We’re excited to see what finger tracking solutions VR companies will deliver in the coming years.



Samsung Files Patent For 180 Degree VR Headset With Curved OLED Displays


Samsung filed a patent application for a VR headset with a field of view of at least 180 degrees. The headset is described as using a curved OLED display.

The patent describes attaining the wide field of view while maintaining reasonable size and weight as a crucial design consideration. To achieve this, the design uses two lenses per eye: a pair of standard Fresnel lenses with a 120-degree field of view, combined with a second set of wide-angle strip lenses positioned at an angle.

This would provide a full vertical field of view in the central region and a partial one in the periphery. The curved screen would allow the overall design to remain relatively compact compared to other wide field of view headsets.
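The arithmetic behind the two-lens split is simple; by our reading (these are not Samsung’s published figures), the strip lenses have to cover whatever the central lenses don’t:

```python
# Field-of-view arithmetic, per our reading of the patent rather than
# Samsung's published figures.

central_fov_deg = 120  # standard Fresnel pair described in the patent
target_fov_deg = 180   # the headset's minimum total field of view

per_side_deg = (target_fov_deg - central_fov_deg) / 2
print(f"Each angled strip lens must add >= {per_side_deg:.0f} degrees")
```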

Companies frequently patent technologies which never come to market. But if Samsung did decide to go forward with this design, it could leverage its competitive advantage as the world’s largest manufacturer of small OLED panels. Samsung Galaxy smartphones already incorporate curved OLED technology.

The company could even keep the technology exclusive to such a headset, as they did with the “anti screen door effect” OLED technology in the HMD Odyssey+.

In an October interview with Lowyat.NET, the CEO of Samsung Electronics confirmed that the company was looking heavily into both VR and AR. The Samsung Odyssey series has been well received by VR buyers, offering Vive Pro resolution at a significantly more affordable price.

Samsung’s future in this industry seems promising; we’ll keep you updated on any further hints of the company’s VR plans.



Researchers Built A 40-Plane Multifocal Display With Just One Screen


Researchers from Carnegie Mellon University built a multifocal display with 40 unique planes. The system involves a 1600Hz screen and a focus-tunable lens.

All VR headsets on the market today are fixed focus. Each eye is given a separate image, but the screen is focused at a fixed distance from the lenses. This means your eyes point (verge) toward the virtual distance of whatever you’re looking at, but focus (accommodate) to the fixed focal length of the display. This is called the vergence-accommodation conflict. It causes eye strain and headaches, and also makes near objects look blurry.
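To make the conflict concrete, here is a small calculation using example numbers of our own: the eyes’ vergence angle tracks the virtual object, while accommodation stays pinned to the display’s fixed focal distance.

```python
import math

# Vergence-accommodation mismatch with example numbers of our own.

IPD_M = 0.063        # typical interpupillary distance
FIXED_FOCUS_M = 2.0  # assumed fixed focal distance of the headset optics

def vergence_deg(distance_m: float) -> float:
    """Angle the eyes rotate inward to converge at a given distance."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

for obj_distance in (0.3, 1.0, 2.0):
    print(f"object at {obj_distance} m: eyes verge "
          f"{vergence_deg(obj_distance):.1f} deg, focus stuck at {FIXED_FOCUS_M} m")
# The nearer the object, the larger the mismatch -- hence the eye strain
# and blurry close-ups on fixed-focus headsets.
```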


Image from Oculus Research

One approach to solving this is to build a headset with multiple screens layered on top of each other, each focused at a different distance. This is called a multifocal display. The problem, however, is that the researchers determined that 41 focal planes would be needed to truly solve the vergence-accommodation conflict. Doing this with 41 physical screens would massively increase cost and weight to the point of impracticality.

Their new multifocal display instead uses a lens which adapts its focus based on the voltage it receives, known as a focus-tunable or “liquid” lens. A single display panel runs at 1600Hz while the lens cycles through its full range of focus at 40Hz. As the focus changes, the image on the panel is swapped to what the new focal distance should show. Thus the virtual world runs at 40FPS, and for each frame 40 different focal lengths are displayed.
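Here is a sketch of the timing this implies, built only from the numbers above (our illustration, not the paper’s code): each 40Hz frame is sliced into 40 subframes, each pairing one focal plane’s image with the matching lens focus.

```python
# Timing sketch from the numbers above; our illustration, not the
# paper's code. A 1600 Hz panel yields 40 subframes per 40 Hz frame.

PANEL_HZ = 1600
FRAME_HZ = 40
PLANES_PER_FRAME = PANEL_HZ // FRAME_HZ  # 40 focal planes
subframe_us = 1_000_000 / PANEL_HZ       # 625 microseconds each

for plane in range(PLANES_PER_FRAME):
    # 1. Drive the focus-tunable lens to this plane's focal depth.
    # 2. Show the image rendered for that focal depth on the panel.
    pass

print(f"{PLANES_PER_FRAME} planes per frame, {subframe_us:.0f} us per subframe")
```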


Diagram from Carnegie Mellon paper

Prospects And Limitations

Unlike varifocal displays, this 40-plane multifocal display doesn’t use or require eye tracking. Additionally, since each focal plane is rendered independently, it doesn’t require the resource-intensive approximation of natural blur to look realistic.

The requirement to render 1600 frames per second is the main limitation of this approach. Each group of 40 consists of different focal planes of the same frame, of course, so it’s not quite as demanding as 1600 true frames. Facebook is hard at work reducing the GPU requirements of its varifocal blur, so it will be interesting to see which approach becomes practical for the consumer market first.

