NVIDIA & Stanford Researchers Use AI to Develop 2.5mm Thick Wearable Display

The never-ending quest for thinner, lighter and more compact virtual reality (VR) and augmented reality (AR) devices primarily starts with display technology. From packing more pixels per inch (PPI) into displays to simplifying and streamlining the optics, it isn’t an easy process. But Stanford University researchers, in conjunction with NVIDIA, have recently showcased their latest project, dubbed Holographic Glasses.

Holographic Glasses

Most of a VR headset’s bulk comes from the distance required between its magnifying eyepiece and the display panel; the challenge is folding the light into as short a space as possible whilst maintaining quality and avoiding distortion. Hence most VR devices use Fresnel lenses, as they offer a good trade-off between optic weight and light refraction.

As part of SIGGRAPH 2022, which takes place this summer, NVIDIA and Stanford University have unveiled their latest research: glasses that create 3D holographic images from a display just 2.5 millimetres thick, even thinner than pancake lenses. To make this possible, the “Holographic Glasses are composed of a pupil-replicating waveguide, a spatial light modulator (SLM), and a geometric phase lens” to create the holographic images.

The SLM is able to create holograms right in front of the user’s eyes, removing the need for the gap that more traditional VR optics require to produce a suitable image, while the pupil-replicating waveguide and the geometric phase lens help further reduce the setup depth. To find an outcome that suitably balanced display quality and display size, the researchers employed an AI-powered algorithm to co-design the optics.


All of this inside a form factor that only weighs 60g.

As these are still early research prototypes, this type of technology remains years away from deployment (or may never ship), with pancake lenses the next major step for most VR headsets. It has been rumoured that Meta’s upcoming Project Cambria will utilise pancake optics to give it a slimmer profile.

This isn’t the only VR collaboration between Stanford and NVIDIA for SIGGRAPH 2022; they’re also working on a paper looking at a “computer-generated holography framework that improves image quality while optimizing bandwidth usage.” For continued updates on the latest VR developments, keep reading gmw3.

Stanford Research Shows VR Users Can Be Identified Using Only 5 Minutes of Motion Data

Privacy in VR is an ever-growing issue, especially now that all new Oculus accounts must log in to Facebook with their real identity, which includes anyone who wants to use a Quest 2. Now researchers at Stanford University have shown they’re able to reliably identify individuals after only a five-minute session in a standard consumer VR headset.

As reported by MIXED (German), researchers at Stanford devised a system that identifies users under “typical VR viewing circumstances, with no specially designed identifying task,” the team says in the research paper.

Using a pool of 511 participants, their system is said to be capable of identifying 95% of users correctly “when trained on less than 5 min of tracking data per person.”

Wearing an HTC Vive headset and given two Vive wand controllers, participants watched five 20-second clips from a randomized set of 360-degree videos, and then answered questionnaires in VR.

Image courtesy Stanford University

Notably, the answers to the questionnaires weren’t factored into the researchers’ dataset, but rather investigated in a separate paper examining head movements, arousal, presence, simulator sickness, and individual preferences.

Instead, the VR videos were designed to see how users would react and move, some including strong focal points such as animals, and others with no discernible focal point at all, like the middle of a forest.

All of this nonverbal tracking data (both head and hands) was then plugged into three machine learning algorithms, which created a profile of a participant’s height, posture, head rotation speed, distance from VR content, position of controllers at rest, and how they move—a treasure trove of data points from just wearing a standard consumer VR headset.
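The paper’s exact pipeline isn’t reproduced here, but the general idea can be sketched with a toy example (the user names, feature choices and the nearest-centroid matcher below are all illustrative assumptions, not the researchers’ actual method): summarise each session’s head and hand tracking into a feature vector, learn a per-user profile from earlier sessions, then match a new, unlabelled session to the closest profile.

```python
import random
import statistics

# Hypothetical illustration: each user has characteristic motion statistics,
# e.g. (height in metres, head rotation speed, controller rest height).
random.seed(0)

users = {
    "alice": [1.62, 0.8, 0.95],
    "bob":   [1.80, 1.3, 1.10],
    "carol": [1.70, 0.5, 1.00],
}

def session_features(profile, noise=0.05):
    """Simulate one session's summary features for a user."""
    return [v + random.gauss(0, noise) for v in profile]

# "Training": average several sessions per user into a centroid
centroids = {
    name: [statistics.mean(col)
           for col in zip(*[session_features(p) for _ in range(5)])]
    for name, p in users.items()
}

def identify(features):
    """Return the user whose learned centroid is closest to the session."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda name: sq_dist(centroids[name], features))

# A new, unlabelled session simulated from "bob" is matched back to him
print(identify(session_features(users["bob"])))
```

Even this crude sketch shows why “de-identified” tracking data is a weak protection: the motion statistics themselves act as the identifier.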

“In both the privacy policy of Oculus and HTC, makers of two of the most popular VR headsets in 2020, the companies are permitted to share any de-identified data,” the paper notes. “If the tracking data is shared according to rules for de-identified data, then regardless of what is promised in principle, in practice taking one’s name off a dataset accomplishes very little.”

So whether you log in to a platform holder’s account or not may already be a fairly minor issue in contrast to the wealth of information companies could harvest. That de-identified biometric data could be used not only to figure out who you are, but to predict your habits, understand your vulnerabilities, and create marketing profiles intent on grabbing your attention with a new level of granularity. We’re still not there yet, but as the number of VR consumers grows, so do the rewards for companies looking to buy data they simply never had access to before.

“With the rise of virtual reality, body tracking data has never been more accurate and more plentiful. There are many good uses of this tracking data, but it can also be abused,” the research paper concludes. “This work suggests that tracking data during an everyday VR experience is an effective identifier even in large samples. We encourage the research community to explore methods to protect VR tracking data.”

Granted, 500 users is a relatively small dataset in the face of what may soon be multiple millions of VR users, and as that number grows, identification will undoubtedly become more difficult based on these data points alone. The study, however, didn’t include a host of other burgeoning VR technologies that could be used to fill out personal profiles in the near future. Eye-tracking, optical mouth tracking, and integrated wearables such as fitness bands and smartwatches may be part of the next step to filling out that remaining 5 percent, and all of those technologies are on the horizon for the next generation of consumer VR headsets.

The post Stanford Research Shows VR Users Can Be Identified Using Only 5 Minutes of Motion Data appeared first on Road to VR.

Samsung & Stanford University are Developing a 10,000 PPI OLED Display

Samsung & Stanford OLED

When it comes to display technology, the more pixels the better, even more so when using them for virtual reality (VR) purposes, as no one likes the ‘screen door’ effect. It’s been revealed that researchers at Stanford University and Samsung are working on ultrahigh-resolution OLED displays capable of 10,000 PPI (pixels per inch), perfect for XR.

The teams have adapted existing designs for thin solar panels to develop this new OLED technology, which combines a base nanopatterned metasurface with RGB OLED films. This has allowed them to build a proof-of-concept design which achieves the high pixel density whilst offering double the luminescence efficiency and higher colour purity, in comparison to the traditional colour-filtered white-OLEDs found in TVs.

“We’ve taken advantage of the fact that, on the nanoscale, light can flow around objects like water,” said Stanford University materials scientist Mark Brongersma. “The field of nanoscale photonics keeps bringing new surprises and now we’re starting to impact real technologies. Our designs worked really well for solar cells and now we have a chance to impact next generation displays.”

These micro displays could be used not only in future VR headsets but also in glasses for AR capabilities, and even in contact lenses. Samsung will be using the research to develop a full-sized display which reportedly should be much easier and more cost-effective to produce.


OLED displays have been favoured for TVs and VR devices due to their improved black levels, but headsets such as the Valve Index and Oculus Quest 2 have utilised fast-switching LCD panels to improve refresh rates.

At past conventions like CES 2019, companies such as Taiwan’s INT Tech have showcased impressive displays of 2,228 ppi and higher, while Varjo’s enterprise-class headsets use a 3,000 ppi ‘Bionic Display’. A display with 10,000 ppi could provide flawless image quality, and the researchers say the tech could theoretically be scaled to 20,000 ppi.

It’s going to be a few years before ultrahigh-resolution OLED displays make it into VR; as and when further advancements are revealed, VRFocus will let you know.

Stanford & Samsung Develop Ultra-dense OLED Display Capable of 20,000 PPI

Researchers at Stanford and Samsung Electronics have developed a display capable of packing in greater than 10,000 pixels per inch (ppi), something that’s slated to be used in VR/AR headsets and contact lenses of the future.

Over the years, research and design firms like JDI and INT have been racing to pave the way for ever higher pixel densities for VR/AR displays, astounding convention-goers with prototypes boasting pixel densities in the low thousands. The main idea is to reduce the perception of the dreaded “Screen Door Effect”, which feels like viewing an image in VR through a fine grid.

Last week, however, researchers at Stanford University and Samsung’s South Korea-based R&D wing, the Samsung Advanced Institute of Technology (SAIT), said they had developed an organic light-emitting diode (OLED) display capable of delivering greater than 10,000 ppi.

In the paper (via Stanford News), the researchers outline an RGB OLED design that is “completely reenvisioned through the introduction of nanopatterned metasurface mirrors,” taking cues from previous research done to develop an ultra-thin solar panel.

Image courtesy Stanford University, Samsung Electronics

By integrating in the OLED a base layer of reflective metal with nanoscale corrugations, called an optical metasurface, the team was able to produce miniature proof-of-concept pixels with “a higher color purity and a twofold increase in luminescence efficiency,” making it ideal for head-worn displays.

Furthermore, the team estimates that their design could even be used to create displays upwards of 20,000 pixels per inch, although they note that there’s a trade-off in brightness when a single pixel goes below one micrometer in size.
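As a quick sanity check on what these densities imply (my own back-of-the-envelope arithmetic, not a figure from the paper), the pixel pitch is simply one inch divided by the pixel density:

```python
def pitch_um(ppi):
    """Pixel pitch (centre-to-centre spacing) in micrometres: 25.4 mm per inch / ppi."""
    return 25.4 * 1000 / ppi

print(pitch_um(10_000))  # 2.54 µm at 10,000 ppi
print(pitch_um(20_000))  # 1.27 µm at 20,000 ppi, nearing the ~1 µm point
                         # where the researchers note brightness suffers
```

At 20,000 ppi each pixel is only a couple of wavelengths of visible light across, which is why the sub-micrometre brightness trade-off the team mentions becomes the limiting factor.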

Stanford materials scientist and senior author of the paper Mark Brongersma says the next steps will include integrating the tech into a full-size display, which will fall on the shoulders of Samsung to realize.

It’s doubtful we’ll see any such ultra-high resolution displays in VR/AR headsets in the near term—even with the world’s leading display manufacturer on the job. Samsung is excellent at producing displays thanks to its wide reach (and economies of scale), but there’s still no immediate need to tool mass manufacturing lines for consumer products.

That said, the next generation of VR/AR devices will need a host of other complementary technologies to make good use of such an ultra-high resolution display, including reliable eye-tracking for foveated rendering as well as greater compute power to render ever more complex and photorealistic scenes — things that are certainly coming, although they aren’t here yet.

The post Stanford & Samsung Develop Ultra-dense OLED Display Capable of 20,000 PPI appeared first on Road to VR.

Sandbox VR Raises $11M Investment from Will Smith, Justin Timberlake, Katy Perry & More

Sandbox VR, the location-based VR attraction company, has announced an additional $11 million in funding from some of the biggest household names in entertainment and investment.

Sandbox VR says in a press statement that their latest investment brings their total funds to $83 million, including the Series A round earlier this year that netted the company $68 million from Andreessen Horowitz, Alibaba, Floodgate Ventures, Stanford University, Triplepoint Capital, and CRCM.

The most recent funding round was led by David Sacks of Craft Ventures and the Andreessen Horowitz Cultural Leadership Fund, with additional investors including Justin Timberlake, Katy Perry, Orlando Bloom, Will Smith, Honda Keisuke, Dreamers Fund, Michael Ovitz, and Kevin Durant & Rich Kleiman of Thirty Five Ventures.

“We believe that VR is finally ready to take off as a mass-market phenomenon in malls, where it can be optimized for a social experience,” said David Sacks, co-founder and general partner at Craft Ventures. “We chose the Sandbox team because of their background in game design; their VR experiences have a level of interactivity — with both the VR world and other players — that we couldn’t find elsewhere. We believe that Sandbox VR is poised to become the first VR experience for millions of consumers around the world.”


The company currently runs centers in Los Angeles, San Francisco, Vancouver, Hong Kong, Jakarta, Macau, and Singapore. Locations in Austin, Chicago, Dallas, New York, and San Diego are marked as “coming soon” on the company’s website.

The company says a total of 16 locations are planned to open by the end of 2020, however.

The post Sandbox VR Raises $11M Investment from Will Smith, Justin Timberlake, Katy Perry & More appeared first on Road to VR.

Out-of-home VR Destination Sandbox VR Closes $68M Series A Financing

The big investment deals that brought so many VR companies into the limelight have cooled down somewhat over the past two years, although that hasn’t stopped the Hong Kong-born VR destination company Sandbox VR from landing a $68 million Series A funding round.

The financing round was led by Andreessen Horowitz, the Silicon Valley-based VC firm, and includes participation by Alibaba, Floodgate Ventures, Stanford University, Triplepoint Capital, and CRCM, Business Insider reports.

Unlike The Void, Sandbox VR hasn’t publicly announced any deals for branded VR experiences, instead making its own VR games including a futuristic shooter, a haunted house, and an underwater treasure hunting adventure. All of these support between two and six players at once, with the company charging around $40 per person for a 30-minute play session.

 

The company’s approach is decidedly different from The Void, which features large-scale tracking volumes, interactive sets, and real-time effects. Instead, Sandbox VR is focusing on greater scalability thanks to reduced complexity and a smaller physical footprint.

“We tried everything; what we really liked about [Sandbox] was that [they] really thought about archetyping this as modest-sized rooms that you could really put anywhere,” Andreessen Horowitz’s Andrew Chen tells TechCrunch. “So it’s this really scalable thing that you could imagine putting inside of a mall or a boutique retail location. You could scale a single location to having 10 or 20 rooms the way a movie theater might have 12 screens.”

Sandbox VR is currently operating seven locations across North America and Asia. More locations are planned for Los Angeles, Austin, New York, and Chicago. Sandbox VR founder Steve Zhao says the company has “inked multiple deals with Westfield malls across the country.”

The post Out-of-home VR Destination Sandbox VR Closes $68M Series A Financing appeared first on Road to VR.

Stanford Study Finds VR Helps People Be More Compassionate

Virtual reality (VR) has often been referred to as ‘the empathy machine’ for its ability to put people in the shoes of others. However, until now, that moniker was the result of anecdotal evidence. A study by researchers at Stanford has backed up claims that VR can make people more empathetic.

Stanford researchers developed a VR application called Becoming Homeless as part of a study to test how media and technology can affect empathy. Based on the evidence gathered, the published study showed that VR can help people be more compassionate.

Stanford University

Becoming Homeless features interactive VR scenarios that involve losing your home. One scene involves the user searching their apartment trying to find items that can be sold in order to make enough money for rent, while another sees the user on a bus, trying to protect their belongings from being stolen.

“About 10 million headsets have been sold in the U.S. over the past two years. So, many people now have access to VR experiences. But we still don’t know much about how VR affects people,” said graduate student and lead author of the research paper Fernanda Herrera. “This research is an important step in figuring out how much of an effect this technology can have on people’s level of empathy in the long term.”

The study discovered that those who tried Becoming Homeless expressed enduring positive attitudes towards the homeless, when compared to those who read a narrative or interacted with a 2D version of the same scenario.

“Experiences are what define us as humans, so it’s not surprising that an intense experience in VR is more impactful than imagining something,” said Jeremy Bailenson, a professor of communication and a co-author of the paper.

A trailer for Becoming Homeless is available to view below and further information on the study can be found on the Stanford University website. For future coverage on developments in VR, keep checking back with VRFocus.

Cardiologist Uses VR To Visualise Heart Defects

There are a number of applications for virtual reality (VR) in the medical field. Many are tied to the training of medical students, or involve new ways to map and visualise medical conditions, which can often be complex. This application combines both.

At Lucile Packard Children’s Hospital, Stanford paediatric cardiologist David Axelrod is able to use a VR simulation to model heart defects in patients. These VR models can be used as learning tools for trainee surgeons, or to show patients and families the ins and outs of the medical condition.

The VR application currently has over fifteen heart models, which are used for instruction on heart anatomy and congenital heart defects. These models are used in 22 medical centres all over the world, including the University of Michigan.

“We built the hearts as prototypical congenital heart models based on a number of different patients, but each one is like a classical model of whatever heart defect it is depicting,” Axelrod said.

Axelrod hopes to expand the use of these VR models into new areas and integrate them into the curriculum at Stanford, where they are already in limited use. He also said there are plans to upgrade the application from a model into a full educational experience.

Recently, the team behind the application implemented a camera feature that lets users take, save and export photos showing a heart both before and after surgery. Axelrod envisions more educational features being added to the app over the next few months.

“If you’re able to train medical trainees more effectively on procedures that involve risk without actually having any real risk to a patient, then you’re going to get better outcomes in all areas in healthcare,” said David Sarno, founder of VR firm Lighthaus Inc. Axelrod partnered with Lighthaus to develop the models.

For future coverage of new and innovative uses for VR technology, keep checking back with VRFocus.

Anxiety-Tackling Soteria VR Receives Audience Choice Award from Stanford University

Virtual reality (VR) is being used in a number of ways to help people with phobias, mental health disorders and more. iThrive Games and Deep Games Labs have created a VR experience about facing anxiety, Soteria VR, which has recently been awarded the Audience Choice Award by the Stanford University School of Medicine.

The award was given at the Virtual Reality and Augmented Reality Lab hosted by Brainstorm — Stanford’s Laboratory for Brain Health Innovation and Entrepreneurship, after six pre-selected entrepreneurs from around the country pitched startup ideas for using VR or augmented reality (AR) technology to improve brain and behavioral health.

Soteria VR

It was iThrive Games’ executive director, Dr. Susan Rivers, and frequent collaborator, Dr. Doris C. Rusch from Deep Games Lab that came up with the concept for Soteria VR, expanding on an earlier PC version of the game called Soteria: Dreams As Currency. Both versions of the videogame draw on techniques for overcoming anxiety: identifying and correcting avoidance behaviors, developing a tolerance for uncertainty and practicing acceptance.

“Designing VR experiences — especially games — for therapy is not widely embraced now. We see incredible promise in using immersive technologies to relieve suffering among individuals struggling with mental illness,” said Dr. Rivers in a statement. “Brainstorm provided an incredible opportunity to share our approach and engage with experts in mental health and technology to push our work forward. The fact that the audience selected our game as the Audience Choice Award winner is also very rewarding and further validates that the work we’re doing does have a powerful impact.”

Soteria VR wasn’t the only winner at Brainstorm’s Virtual Reality and Augmented Reality Lab. Simon Fraser University won the grand prize for their idea to use VR to assist in addiction recovery by preventing relapse.

“Virtual reality and augmented reality (VR/AR) offer a lot of potential to transform the way that we, as physicians, diagnose and treat diseases like PTSD, autism, anxiety and opioid use,” said Dr. Nina Vasan, the Founder and Director of Stanford Brainstorm and Chief Resident in Psychiatry at Stanford. “Brainstorm wanted to capture this potential by identifying promising VR/AR applications and working with entrepreneurs to develop ventures that are effective from the medical, business, and technological perspectives.”

Additionally, iThrive is working closely with the Centerstone Research Institute in Nashville to devise a strategy for bringing more digital videogames into behavioural health services.

VRFocus will continue its coverage of VR and health, reporting back with the latest advancements.

Stanford Event Seeks Innovative Uses of VR for Behavioural Health

The first VR Innovation Lab for Behavioral and Brain Health takes place on October 7th at Brainstorm, Stanford University’s new VR/AR startups laboratory. Selected finalists will pitch their innovative solutions for behavioural health problems to the audience and a panel of expert judges, culminating in a Grand Prize Award and Audience Award.

Following the launch event for Brainstorm on October 6th, the VR/AR Innovation Lab begins at Stanford University School of Medicine on October 7th, as part of the 3rd annual conference of Innovations in Psychiatry and Behavioral Health. This year, the focus is entirely on VR/AR technology, where selected innovators from around the country will pitch startup ideas for diagnosing and treating mental or neurological disorders. The lab gives the innovators an opportunity to discuss and improve their ideas before making their final presentations.

Some of the finalist ideas include using VR to simulate environmental triggers for substance use and allow for contextual relapse prevention and skill building, a VR gaming experience that simulates the experience of a person suffering from schizophrenia to promote understanding and empathy, and a VR product to track behaviours and responses to develop a tool for patients with Autism to learn emotion identification and modification.

“Virtual reality and augmented reality offer a lot of potential to transform the way that we as physicians diagnose and treat diseases like PTSD, autism, anxiety, and opioid use,” says Dr. Nina Vasan, Founder and Director of Stanford Brainstorm and Chief Resident in Psychiatry at Stanford. “Brainstorm wanted to capture this potential by identifying the most promising VR/AR entrepreneurs and working with them to develop ventures that are effective from the medical, business, and technological perspectives. The Innovation Lab will showcase these ideas and give audience members the opportunity to work together and help the entrepreneurs improve their ideas.”

The Brainstorm Virtual and Augmented Reality Innovation Lab will be hosted at Stanford’s 3rd Annual Innovations in Psychiatry and Behavioral Health Conference on Oct 7th 2017 in the Li Ka Shing Center for Learning and Knowledge. For further information visit this web page.

The post Stanford Event Seeks Innovative Uses of VR for Behavioural Health appeared first on Road to VR.