Eyes-On: Light Field Lab Built A Tantalizing Holographic Display For The 2020s

There’s a monarch butterfly in the dark, suspended behind the glass panel of a decorative wooden lantern about a meter away from me. I am not wearing glasses. Light Field Lab representatives say they aren’t tracking my head’s position.

Like a practiced magician, Light Field Lab CEO Jon Karafin removes the lantern from its perch to reveal the butterfly is floating in the open air.

“This is a complete reconstruction of the light field such that your eye focuses on the holographic object from within the holographic viewing volume,” Karafin said.

Future vision concept art of a room scale holographic display from Light Field Lab.

I move my head to the left and right inside the small “visual acuity” area outlined with glowing tape on the floor. This is the sweet spot. I also move my head up and down about a foot within this area. The butterfly’s wings respond with the correct parallax as I shift. I move slightly closer and farther back — still inside the box — and the butterfly grows and shrinks exactly as I would expect. This is what they said they would show me. The eight hours I needed to spend in a car driving to and from this demo in San Jose were not a waste of time.

“There are other volumetric display technologies that may leverage a surface to form 2D-only pixels in space (e.g. smoke, water, screens, mirrors, moving surfaces, etc.),” Karafin said. “However, these are not holograms.”
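For the technically inclined, the distinction Karafin is drawing maps onto the standard two-plane light field parameterization from the graphics literature (textbook formalism, not necessarily Light Field Lab’s internal representation):

$$L = L(u, v, s, t)$$

Each 4-tuple indexes one ray: $(u, v)$ is a point on the display surface and $(s, t)$ a point on a second, parallel plane. A conventional screen (or any “2D pixel in space”) fixes only $(u, v)$ and emits the same radiance in every direction, while a light field display controls $L$ per ray, which is what lets your eye converge and accommodate on a point floating off the panel.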

Real Holograms

Karafin really baked my noodle next.

He grabbed a magnifying lens off a tray and placed it inside the butterfly. How the fuck does a standard magnifying glass warp light from inside a holographic object? My eyes and brain had never had to process light behaving this way relative to a lens pushed “through” what I perceived as a solid physical object.

“When using the magnifying glass in front of the butterfly, it responds exactly as it would in the real world,” Karafin explains. “When you pass the magnifying glass through the hologram, you see things that only a real hologram can achieve.”

Ok, so now the trip was worth it.

Next Steps

The only deficiencies I noted in the visuals were outside the optimal viewing cone provided by the small panel. Everything inside that cone — viewed from at least a meter away — looked great. I should note they also showed a moving fish that was less detailed than the butterfly, and as certain pieces of it moved closer to me I noticed a bit of fuzzy softness.

“The amount of movement depends on how close you are to the hologram, the size of the holographic surface, and the optical prescription for the holographic waveguides,” Karafin explained.

The field of view for the Light Field Lab display prototype is very small, but the units should be stackable like bricks to form much larger panels.

I left the room and realized I should have asked to see the demo with the lights on. Light Field Lab representatives were crystal clear in saying that I would see something small float believably in front of me with no glasses, goggles or head tracking of any kind. But they also have a lot to prove while so many other volumetric display technologies promise things which can’t be delivered. Many of Light Field Lab’s competitors, like Looking Glass, are low resolution when viewed up close. That’s why Karafin showed me up close how lenses affect his holograms.

“The holographic volume is two inches, but that’s only limited by the submodule prototype,” Karafin said. “That grows substantially in product.”

Eventually enough of the Light Field Lab “bricks” could be stacked up for an entire holographic wall, or room.

LFL representatives believe the first use cases for this tech will be for location-based entertainment and education, like theaters, museums or theme park attractions. I didn’t expect the lights-on demo to be impressive and it wasn’t, but it did give me a good look at the overall rig. Photographs weren’t allowed, but the panel looks a lot like it does in the image at the top of this post except that it is mounted on a tall server rack.

Bricks In A Wall

The word “brick” is useful here because that’s how I’d describe the box from which light emerges.

Karafin and crew believe they’ve uncovered a “recipe” for producing proper light fields from one side of this brick. My demo — reminiscent in some ways of my first Oculus Rift demo back when the company just had seed funding in late 2012 — was about showing me a proper light field display and proving to me that it is real. Light Field Lab accomplished that. The next phase for the company is about expanding the size of the area where the full holographic effect can be viewed by stacking the units up like bricks in a wall. That also means the tower portion of the installation needs to go away.

“We already do everything within the prototype that we will do to manufacture the product,” Karafin said.

So next year my expectation for Light Field Lab is to see the same demo with a friend able to stand side by side with me. I want to observe a couple of live butterflies with the same visual acuity as in 2018, but with only wires running to a couple of these bricks and the computer towers in a separate area.

Karafin says that’s the plan. Light Field Lab is also working with rendering company OTOY to build out a content development pipeline for the technology.

“You will be able to move octane light field objects interactively on the display in a month once we hook up our custom build of our viewer to the panel,” OTOY CEO Jules Urbach said. “Been waiting to get multiple RTX 6000 before going up to LFL to finish this work.”

Last year Karafin said they would use their first funding to deliver a proof-of-concept hardware prototype. Earlier this year they announced $7 million in funding to complete it. This week I looked at it. If some portion of that $7 million is what it took to produce what I saw, then I need to see what Light Field Lab does with more money and another year to build a much bigger device.

Karafin showed slides before the demo depicting various possible configurations: a holographic table so you can have something like Holochess from Star Wars, or a holographic theater where a shark seems to come out into the audience like Jaws in Back To The Future Part II. Or perhaps one day, many years from now, a fully holographic room — a holodeck — where redirected walking techniques and haptic effects might be used to simulate touch in a seemingly endless space. Karafin’s slides also include explanations of why most of these holographic pop culture reference points — including R2-D2 projecting Leia into the open air — probably can’t work in the real world. As Karafin explains, you need a surface behind the holographic volume “spraying” out light which enters the eye as naturally as if it had bounced off a real object.

Any doubt I have (and I don’t have much of it now) that the company can deliver this ground-breaking display technology to locations like theme parks will evaporate when they announce more funding and I see Light Field Lab successfully stack these units together.

Now Boarding: Hype Train To The Holodeck

A toy mirascope.

At the end of my meeting I’m handed a bag with a cheap plastic toy inside, made in China. It is called a mirascope. Karafin says it is the analog equivalent of what they are doing with each point of light from their display.

“The mirascope is a dual parabolic reflector that forms a relayed real image from a real object. The real image you see results from the convergence of ray bundles, identical to how our system collimates and converges energy to form the hologram,” Karafin said. “Essentially, the mirascope illustrates how a real image is formed — but we replace the real object and mirascope with a completely solid state, holographic, digital projection backplane.”
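Karafin’s analogy holds up against textbook optics. A parabolic mirror described by

$$z = \frac{r^2}{4f}$$

reflects rays arriving parallel to its axis through its focus at $z = f$, and, by reversibility, sends rays leaving its focus out parallel to the axis. A mirascope stacks two such mirrors face to face so that each mirror’s focus sits at the other’s vertex: light leaving the gem at the bottom vertex (the top mirror’s focus) bounces off the top mirror collimated, travels down to the bottom mirror, and reconverges at that mirror’s focus, the opening at the top, where the floating real image appears. Light Field Lab’s pitch, per Karafin, is a panel that emits those converging ray bundles directly, with no object and no mirrors.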

I brought their little toy home and it completely blew my family away: they tried to grab the gem and discovered it really isn’t there (it sits at the bottom of the mirascope). The toy taught my household both the Light Field Lab name and its promise of true glasses-free virtual worlds not too far off from the Holodeck.

Next time I make that trip to see Light Field Lab, I’m bringing my family.

“Our vision is to enable the Holodeck, which makes the depictions of holographic interfaces in science fiction a reality,” Karafin said.


Liveblog: GTC 2018 – ‘Light Field Rendering and Streaming for VR and AR’

VRFocus is once again providing liveblog coverage of sessions (where we can) at this year’s GPU Technology Conference (GTC) hosted by NVIDIA in San Jose, California. At GTC we’re expecting a number of sessions touching on the fields of virtual reality (VR), augmented reality (AR), and how mixed reality (MR) and related technologies might fit into the creative mix of both the present and future.

Our second covered talk today comes from OTOY and is being held by the company’s CEO Jules Urbach. Urbach is currently busy working on his two ventures, OTOY and LightStage, which aim to revolutionise 3D content capture, creation, and delivery.

“We’ll discuss OTOY’s cutting-edge light field rendering toolset and platform, which allows for immersive experiences on mobile HMDs and next-gen displays, making it ideal for VR and AR. OTOY is developing a groundbreaking light field rendering pipeline, including the world’s first portable 360 LightStage capture system and a cloud-based graphics platform for creating and streaming light field media for VR and emerging holographic displays.”

Your liveblogger for the event is Kevin Joyce.

The Future of Virtual Lightfields with Otoy CEO Jules Urbach

Otoy is a rendering company that is pushing the limits of digital light fields and physically-based rendering. Now that Otoy’s Octane Renderer has shipped in Unity, they’re pivoting from licensing their rendering engine to selling cloud computing resources for rendering light fields and physically-correct photon paths. Otoy has also completed an ICO for their Render Token (RNDR), and will continue to build out a centralized cloud-computing infrastructure to bootstrap a more robust distributed rendering ecosystem driven by an Ethereum-based ERC20 cryptocurrency market.


I talked with CEO and co-founder Jules Urbach at the beginning of SIGGRAPH 2017, where we discussed relighting light fields, 8D light field & reflectance fields, modeling physics interactions in light fields, optimizing volumetric light field capture systems, converting 360 video into volumetric videos for Facebook, and their movement into creating distributed render farms.

In my previous conversations with Urbach, he shared his dreams of rendering the metaverse and beaming the matrix into your eyes. We complete this conversation by diving down the rabbit hole into some of the deeper philosophical motivations that are really driving and inspiring Urbach’s work.

This time Urbach shares his vision of VR’s potential to provide us with experiences that are decoupled from the entropy and energy transfer an equivalent meaningful experience would normally require. What’s below Planck’s constant? It’s a philosophical question, but Urbach suspects there are insights to be found in information theory, since Planck’s photons and Shannon’s bits have a common root in thermodynamics.
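One concrete bridge between bits and thermodynamics (my gloss on the physics, not a claim Urbach made here) is Landauer’s principle: erasing a single bit of information dissipates at least

$$E_{\min} = k_B T \ln 2 \approx 2.9 \times 10^{-21}\ \text{J at } T \approx 300\,\text{K},$$

which ties Shannon’s measure of information to the same thermodynamic bookkeeping that governs Planck’s photons.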


He wonders whether the Halting Problem suggests that a simulated universe is not computable, and whether Gödel’s incompleteness theorems suggest that we’ll never be able to create a complete model of the Universe. Either way, Urbach is deeply committed to building the technological infrastructure to render the metaverse, and to continuing to probe for insights into the nature of consciousness and the nature of reality.


Otoy Wants to Make Light-field Rendering Affordable with a Supercomputing Cluster You Get Paid to Be Part Of

Otoy has announced the Render Token, a blockchain-based currency that underpins a distributed GPU rendering network. The company hopes to allow idle GPUs on consumer PCs to be tapped for rendering work, earning money for the owner in exchange for their computer’s work. The goal, Otoy says, is to make massive GPU rendering power available at low cost for rendering light-fields and more.

Otoy is a maker of rendering tools and a proponent of light-fields as the next-generation format of capture and display for AR and VR. Light-fields can be thought of as volumetric representations of a scene, where every view possible has already been calculated, allowing for real-time playback of cinema-quality scenery, even in demanding applications like virtual reality. Sounds great, right?

One problem with the practical application of light-fields is that they’re expensive to render, both computationally and temporally. If you want to farm your render out to the cloud to get it done in a reasonable amount of time, you can expect to pay a hefty fee.
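Some illustrative numbers show why (these are back-of-the-envelope figures, not Otoy’s actual formats, which are compressed and sparsely sampled). A naive uncompressed two-plane light field of even modest resolution is enormous:

$$64 \times 64\ \text{views} \times 1920 \times 1080\ \text{px} \times 3\ \text{bytes} \approx 25\ \text{GB}$$

for a single static frame, before counting the path-traced rendering work needed to fill in each of those 4,096 views.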

For a company that’s pushing light-field as the future of immersive content, that rendering cost is a major blocker to adoption. And so on a quest to make GPU rendering dramatically more affordable, Otoy is mashing up the ideas of distributed supercomputing clusters and the blockchain with the hopes of creating a decentralized cloud rendering network that runs rendering tasks on idle GPUs in exchange for payment in the form of a cryptocurrency.

Introducing Render Token

The result is what Otoy calls the Render Token (RNDR). It’s a cryptocurrency coin based on the Ethereum blockchain, and the company says it’s the payment that will be used to incentivize and compensate participants in the rendering network for the use of their GPU power.

Distributed Computing Isn’t Exactly New

The idea of a distributed supercomputing cluster isn’t new. You may have heard of Folding@home or SETI@home, two popular distributed computing initiatives which borrowed unused computational power from idle computers running a piece of client software. But that computing power was offered by users on a volunteer basis. Now that blockchain technology (the underlying structure of cryptocurrencies) has been proven out, there’s a trusted method to distribute payments among a network of computers performing work for paying customers.


Intrinsic Human Value

Typical cryptocurrencies work by incentivising so-called ‘miners’ to run software on their computers to log and process cryptocurrency transactions for the whole network, and in exchange receive small bits of the cryptocurrency for their work. But all that processing power spent on number crunching is wasted, argues Otoy CEO Jules Urbach in his introduction of RNDR.

GPU hashing [AKA mining] incurs real world energy and cap-ex costs which return less and less value to the crypto-community as the blockchain grows. Over time, and on a global scale, this becomes enormously wasteful as GPU compute cycles are essentially thrown away hashing numbers with no intrinsic human value, while GPU rendering power on AWS remains scarce at $14.4/hour (~1000 OctaneBench).

Instead, Urbach says, the fundamental mining work that underpins cryptocurrencies could be used to produce valuable output in the form of rendered imagery.

The Render Token recalibrates the weighting of GPUs in the network, making it possible for each transaction on the blockchain to validate far greater value of equivalent GPU proof-of-render work that is valuable for real world jobs that are prohibitively expensive to fulfill quickly on local or centralized GPUs.
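Otoy’s white paper doesn’t spell out the scheduling and settlement mechanics at code level, so purely as a conceptual sketch, a distributed render market of the kind described above might split and pay for work like this (every class and function name here is hypothetical, not RNDR’s actual API):

from dataclasses import dataclass, field

@dataclass
class Node:
    """A participant offering idle GPU time, benchmarked at sign-up."""
    node_id: str
    octanebench: float        # relative GPU speed in OctaneBench-style units
    balance: float = 0.0      # tokens earned so far

@dataclass
class RenderJob:
    job_id: str
    frames: list = field(default_factory=list)  # work units to distribute
    price_per_ob_hour: float = 0.0144           # cf. $14.4/hr for ~1000 OctaneBench on AWS

def dispatch(job, nodes):
    """Split frames across nodes in proportion to benchmarked GPU speed."""
    total = sum(n.octanebench for n in nodes)
    assignments, cursor = {}, 0
    for n in nodes:
        share = int(len(job.frames) * n.octanebench / total)
        assignments[n.node_id] = job.frames[cursor:cursor + share]
        cursor += share
    # Hand any rounding remainder to the fastest node.
    fastest = max(nodes, key=lambda n: n.octanebench)
    assignments[fastest.node_id] += job.frames[cursor:]
    return assignments

def settle(node, hours, job, verified):
    """Pay out only after the output is verified ('proof-of-render')."""
    if verified:
        node.balance += node.octanebench * hours * job.price_per_ob_hour

The hard part, which this sketch waves away in a single boolean, is verification: cheaply proving a frame was rendered correctly without re-rendering it, which is presumably what “proof-of-render” has to solve.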

ICO Incoming

If you’re at all familiar with cryptocurrencies, you’ll know where this is all heading… an ICO. Otoy plans to make an ‘Initial Coin Offering’, which is a sale of the first Render Tokens. It’s both a way for Otoy to raise capital for their initiative and to establish the initial value of each Render Token. The company will offer a limited number of tokens and, according to the Render Token White Paper, hopes to sell $134 million worth of tokens to support the project, presumably cutting off the supply after that amount is raised. That wouldn’t be the largest ICO to date (that would be Filecoin at $250M+, according to The Cointelegraph), but it’s not far off. Here’s how Otoy says they’ll spend the funds:

40% – will go to future development of each expansion phase (I-IV) and will support the team dedicated to the operations and engineering of the Render Token platform.

25% – running, maintaining, and scaling the network – this will include developing and creating new and more efficient solutions for rendering through custom built GPU solutions, effectively lowering the price of rendering across the network and the world.

20% – will be allocated to marketing and expanding the applications and reach and use-cases of the network.

10% – for third party services and contractors providing guidance and efficiencies to the project.

5% – for unforeseen roadblocks and circumstances.

Buying (or selling) Rendering Power

Render Tokens can then be spent to pay for rendering work on the network, or sold to others in exchange for different currencies. Their ultimate value will be determined over time by the market, with prospective purchasers hoping the value will increase following the ICO.


More Than Light-fields

Light-fields are particularly compelling for AR and VR, and Otoy hopes that the Render Token platform will make rendering them faster and more affordable, but light-field isn’t the only thing that the system can render; the company points to the following categories that could be disrupted if they achieve their vision of affordable, distributed rendering:

Media – From blockbuster films to home movies, RNDR brings affordable GPU compute to democratize advanced special effects and graphics. This will accelerate the arrival of holographic displays and avatars to change storytelling forever.

Gaming – Billions of consumers worldwide put unprecedented demands on 3D game engines. RNDR will provide the infrastructure and standards to uplevel gaming and finally bring cinematic rendering to interactive experiences.

Manufacturing – RNDR makes scientific-grade rendering available to any 3D object. Industry will be retooled as physics-accurate rendering transforms imaging from 3D visualization to intelligent 3D simulation.

Medical – Radiology is being overhauled by the introduction of high-level rendering. From surgeons to new medical students, RNDR will enable unprecedented levels of fidelity in medical imaging at a fraction of the speed and cost.

Virtual Reality – RNDR will bring economical light field media and streaming to allow any artist to create high quality VR experiences at 72K resolution and beyond —rendering an immersive Metaverse in stunning detail.

Augmented Reality – As the ARKit and ARCore revolutions take off, RNDR will make photorealistic objects and scenes on wearables and mobile devices a possibility by democratizing the authorship, registration and streaming of light fields and next gen media formats.

Mixed Reality – With the breakout successes of WeChat and SnapChat, the economy of virtual goods and services is only just beginning. RNDR will provide the key distribution system to monetize and track digital objects in the Metaverse.


Jules Urbach’s Quest To Realize Star Trek’s Holodeck

Jules Urbach is on the brink.

Rest is rare for the 43-year-old with curly black hair and just a touch of gray in his beard. The long hours put dark circles around his eyes. He often works through the night. He replies quickly when asked how much he slept the night before our interview.

“100 minutes,” he said.

The answer seems rather specific. Alissa Grainger, Urbach’s business partner, motions to his watch.

“He records it,” she said.

Urbach’s words come tumbling out faster than anyone can keep up. There’s an intensity to him. He’s obsessed with something, and the circles around his eyes are a byproduct. There’s a future he’s been working toward for most of his life, and he’s on a mission to deliver it as the co-founder and CEO of OTOY, a cloud-based 3D graphics rendering company. He’s a “mad scientist,” as one investor describes him, with a Dragon’s Lair arcade cabinet in the office to prove it. The game inspired Urbach to try and recreate it.

“I asked my mom to buy me the arcade game; she said no,” Urbach said. “So I actually recreated the game on my Mac IIfx first for myself, then to let people play it in the school lunch areas. I got into Harvard and Yale after sending them the source code to the game (they were skeptical I did it, never having seen digital video on a computer – this was before quicktime and DCT codecs came out). The deep eureka moment I had first seeing Dragon’s Lair as a kid was how beautiful and cinematic the game looked – maybe 30 plus years ahead of its time – yet how simple the concept was that made it work as [a] video game.”

Today, in a Los Angeles high-rise, the cabinet greets visitors to the office as a symbol of his life-long quest to enable anyone to make their thoughts a reality. If that sounds both inspirational and pretty far out, you understand what it’s like to talk to Urbach.

Life-Long Obsession

Back in 2004, a New York Times article described Urbach as a “caffeinated” man just like the one I met recently. At the time he was working from his mother’s house and thinking about how to piggyback interactivity onto an AOL chat window. What’s changed between then and now for Urbach is he gained a business-focused co-founder in Grainger. For the last 10 years his tireless engineering of a technology pipeline built for the future has been balanced by Grainger’s attention to the day-to-day operations of a business.

Together, they’ve built a company which employs around 60 people headquartered in the heart of Los Angeles. OTOY’s revenue doubled each year for the last several, according to Grainger. After all this time, Urbach still retains a “significantly” greater than 50 percent share in the company. This majority position even after a decade hints at the level of long-term trust investors have placed in their pairing. They’ve made connections throughout Hollywood and Silicon Valley with advisors like Google’s Eric Schmidt and investors backing them like Ari Emanuel, the co-CEO of one of Hollywood’s most influential talent companies and the inspiration for blunt-talking super agent Ari on HBO’s Entourage.

Emanuel showed up late one night on Urbach’s doorstep. As Urbach described it, he brought Emanuel over to his computer and showed him “3D objects with live ads and web links injected on the surface moving through portals of other apps and pages.”

“What the fuck did I know,” Emanuel said. “It looked great. I’d never seen anything like that.”

Urbach, though, wasn’t ready at the time for an investment. He supported himself by taking work-for-hire jobs, like using his self-coded tools to render the complex scenes in advertisements for movies like Michael Bay’s Transformers. Today, the most recent versions of those tools are used by artists to create breathtaking photorealistic sequences like the haunting opening of HBO’s Westworld.

Urbach garnered support from people like Emanuel despite (or perhaps because of) the fact that his ideas can sound like the ravings of someone who sees reality quite differently from most people. When speaking to outsiders, Urbach is advised not to discuss certain subjects because they can seem so far-fetched.

The truth is, even with influential investors and cutting edge technology, very few people understand OTOY’s place in the market or Urbach’s vision. Nevertheless, Urbach’s sleepless nights building his rendering technology, and Grainger’s diligent focusing of the business, have placed the startup at the cusp of a shift in computing that can change everything.

“OTOY is juggling a lot of widely different tech,” Oculus chief technical officer John Carmack wrote in an email to me. “But if you squint at it right, there are a lot of pieces that fit together in a particular vision of the future.”

The Brink Of What?

To understand OTOY’s position it is helpful to grasp some long-term trends.

Recently, VR started to approach consumer quality and pricing levels after years of failed attempts. Smartphones selling in the hundreds of millions made low-cost movement sensors and high resolution displays usable in more affordable VR hardware. In 2014, Facebook’s Mark Zuckerberg saw enough potential to bet $3 billion on the hope that the technology could form the foundation of the next platform for personal computing.

To kickstart adoption, this technology needed a reason for people to use it. Facebook, Sony, Valve, Samsung and HTC all bet on games. The gamble was that people who love video games and spend much of their free time using flat screens to immerse themselves in virtual worlds would be the first to seek out this technology.

Parallel to this evolution, sandbox video games started to emerge offering large 3D worlds for players to shape and explore. With the arrival of Minecraft around a decade ago, gaming crossed a threshold, enabling millions of kids and adults alike to build vast and dynamic worlds using simple tools. For professionals, world engines gained popularity in recent years. Unity empowers skilled creators to produce virtual worlds that can work on almost any personal gadget. Its leading competitor, Unreal, rolled out “Blueprints,” allowing people to build worlds without any formal knowledge of coding.

The mouse, keyboard and controller that defined computer interaction in the last few decades of the 20th century are left behind with the rise of VR. In its place, intuitive human behavior becomes the way people shape these virtual worlds. For example, you just reach out and grab a cup with your hands in VR instead of moving a mouse to rotate that same object on a computer screen. It is a transformation still underway but the long-term trend here is that the barriers to creation are lowering. You can increasingly make the virtual world you want and quickly invite others to share it with you. At the same time the fidelity — the photorealistic look of these worlds — is dramatically improving.

“Games want to be cinematic quality and film wants to be interactive. So we see a new category of content that is linear interactive storytelling,” said Sylvio Drouin, vice president at Unity Labs. “Jules wants to make beautiful content accessible to everybody.”

Beaming the ‘Matrix’ into Your Eyes: Otoy CEO on the Future of Real-time Lightfield Rendering

At Unity’s Unite keynote in November, Otoy’s Jules Urbach announced that their Octane Renderer was going to be built into Unity to bake light field scenes. But this is also setting up the potential for real-time ray tracing of light fields using application-specific integrated circuits from PowerVR, which Urbach says could render up to 6 billion rays per second at 120W. Combining this PowerVR ASIC with foveated rendering and Otoy’s Octane renderer built into Unity provides a technological roadmap for producing a photorealistic quality that will be like beaming the Matrix into your eyes.
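Some rough arithmetic (mine, not Otoy’s) shows why that ASIC figure gets paired with foveated rendering. Assuming a 90 Hz headset:

$$\frac{6 \times 10^9\ \text{rays/s}}{90\ \text{frames/s}} \approx 6.7 \times 10^7\ \text{rays per frame},$$

which works out to only a few dozen samples per pixel on a current VR panel if spread uniformly, too noisy for clean path tracing, but ample if most of those rays are concentrated in the small foveal region the eye actually resolves.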


I had a chance to catch up with Urbach at CES 2017 where we talked about the Unity integration, the open standards work Otoy is working on, overcoming the Uncanny Valley, the future of the decentralized metaverse, and some of the deeper philosophical thoughts about the Metaverse that is the driving motivation behind Otoy’s work toward being able to render virtual reality with a visual fidelity that is indistinguishable from reality.


OTOY Enables Groundbreaking VR Social Features

Oculus and OTOY may have achieved a breakthrough in social VR functionality.

VR headset owners should soon be able to share a variety of environments and Web-based content with one another in virtual reality. For example, friends can feel like they are together on the bridge of the Enterprise, and on the viewscreen of the ship they see a list of Star Trek episodes to watch with one another.

We have yet to test all of this functionality first-hand, but we’ve seen some of it live in the Gear VR — accessing, for example, a Star Trek environment inside OTOY’s ORBX Media Player app from within the Oculus Social Beta.

The idea is that a website supporting ORBX can be embedded onto “a live interactive surface in the environment, which is broadcast to every user in the room while the host controls the browser using the gaze cursor, on-screen keyboard, or gamepad,” according to OTOY.
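OTOY hasn’t published the sync protocol beyond that description, but the host-authoritative pattern it implies is straightforward to sketch (the names below are hypothetical, not the ORBX API):

import json
from dataclasses import dataclass, asdict

@dataclass
class SurfaceState:
    """Everything a peer needs to redraw the shared browser surface."""
    url: str
    scroll_y: int
    cursor_uv: tuple          # host's gaze cursor in surface coordinates

class Room:
    """One writer, many readers: only the host mutates the surface."""
    def __init__(self, host_id, peers):
        self.host_id = host_id
        self.peers = peers    # connected sessions with a .send() method
        self.state = SurfaceState(url="about:blank", scroll_y=0, cursor_uv=(0.5, 0.5))

    def on_input(self, sender_id, new_state):
        if sender_id != self.host_id:
            return            # non-hosts spectate; their input is ignored
        self.state = new_state
        payload = json.dumps(asdict(self.state))
        for peer in self.peers:
            peer.send(payload)  # broadcast so every headset shows the same page

The point of host authority is consistency: every viewer renders from the same state snapshot, so the room never disagrees about what is on screen.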

“Everyone can watch Balance of Terror live,” wrote OTOY CEO Jules Urbach, in an email, referring to one of the best episodes from The Original Series of Star Trek.

Rod Roddenberry, son of Star Trek creator Gene Roddenberry, is an investor in OTOY. That explains why his site, celebrating Star Trek’s 50th anniversary, is one of the first environments you can try with the new feature.

“When my father created the Holodeck, he hoped Star Trek fans would be inspired by a future where the material and virtual worlds blended together seamlessly, and allowed education, art and social experiences to transcend the physical,” Roddenberry said in a prepared statement. “I invested in OTOY to help further this goal and to map out the work needed to be done in the coming years and decades.”

An early version of the Oculus Social app launched almost a year ago, but it only allowed people to watch a Vimeo video or Twitch stream with random people. In March, Oculus added a friend list and trivia game so you could find people you know online and connect in a fun setting.

Unfortunately, the additions weren’t enough to encourage steady use of shared VR experiences. As a result, the Oculus Social app is frequently a ghost town. Last week, Oculus announced a series of upgrades to its social platform at its developer conference that could help push shared VR experiences forward.

OTOY’s implementation seems to have emerged from a close collaboration with Oculus. According to Urbach, the social app invisibly sideloads OTOY’s ORBX Media Player to enable this functionality. This summer OTOY joined Disney’s startup accelerator program, and as a result several Disney web sites are launching today with these environments, including Disney.com, ESPN.com, and ABCNews.com.

We’re still trying to understand how this type of functionality might be implemented more broadly. Oculus Chief Technology Officer John Carmack told us in July he was interested in the glTF format, which OTOY supports, saying “I think most people hope that the metaverse won’t be built on proprietary media formats.” Last month I asked Carmack whether Oculus would make it possible to try out a number of VR apps in a social setting, and he responded that “I’m working on a general mechanism for this.” Is this that mechanism? We’re not sure yet, but it certainly looks like this is at least part of it.

The new functionality could be a preview of a social future inside VR headsets where owners find a limitless number of activities to share with friends. Ultimately, this announcement from OTOY looks like a step on the path toward shared VR experiences becoming a major reason people consider getting a headset.