Developers Now Receiving Google’s Experimental 6DOF Controller Kits

With applications for the Lenovo Mirage Solo 6DOF controller dev kit recently closed, it appears Google has begun sending out its first wave of units: an experimental hardware add-on for Lenovo’s Mirage Solo standalone headset that brings optical positional tracking to a pair of purpose-built controllers.

Alex Coulombe, the creative director and co-founder of VR startup Agile Lens, was one of the first to publish a few snaps along with his initial impressions of the dev kit; the headset already boasts 6DOF head tracking, but was paired with a single 3DOF (rotation-only) controller at launch in May.

According to Coulombe, the 6DOF controller kit is about as plug-and-play as we would have hoped, saying “[i]f you don’t have the faceplate plugged in, everything is normal. As soon as you plug it in, the controllers just appear (sometimes at the wrong place for a moment). From there you can go about your business naturally like in any desktop 6DoF experience.”

The dev kit also features backwards compatibility with standard Daydream apps that use the 3DOF controller, Coulombe says.

Putting it through its paces, Coulombe found the optical tracking system’s breaking point, but concludes it’s “not a big problem, [there are] few applications I can imagine where you’d really need to cross your hands over each other.”

Cubic VR founder Haldun Kececigil also received a unit, posting a brief look at the dev kit still fresh in the box and stating eagerly that tracking was so far “flawless” since the latest update.

Where the 6DOF controller dev kit will eventually lead, we’re not sure. Healthy speculation: Google isn’t gearing up to mass produce the add-on itself, but rather to seed its developer community with the tools to develop full-fledged 6DOF content for a headset yet to come, possibly one that will directly compete with Oculus Quest, which has been confirmed to launch sometime in early 2019.

So far Google has been mum on the details, so at this point we just can’t tell.

Google Appears to Be Ramping Up R&D Efforts for “New and novel” AR/VR Lenses

Though there wasn’t much groundbreaking Daydream news shared at Google I/O this year, the company is growing its AR/VR team at an accelerated rate, suggesting Google still has a number of things up its sleeve. One recent job listing which caught our eye suggests that the company is ramping up its R&D efforts for lenses, and is now hiring a ‘Diamond Turning Technician’ to assist with rapid prototyping of “new and novel” optics for AR and VR devices.

Among the flurry of job listings for Google’s Daydream AR/VR team seen so far this year was something we hadn’t spotted previously: the company is seeking a Diamond Turning Technician experienced in operating “state of the art” lathe equipment used for precision optics prototyping. While we’ve known for some time that Google has been designing custom lenses for its Daydream products, the new job listing suggests an escalation of that work into in-house rapid prototyping.

Google’s custom lenses for Daydream 2017 (right) were a big improvement over the simple lenses of the original (left). | Photo by Road to VR

The job listing, which was posted in the last week, notes an internal “dynamic Research & Development (R&D) laboratory with a focus on new and novel [optical] designs and a close partnership with world-class Optical and Mechanical Designers.” The opening is at Google’s Mountain View headquarters.

Though externally Daydream developments have been slower in recent months, an analysis by Road to VR of the company’s job posting trends shows that hiring for Google’s AR/VR team is accelerating.

Data collected by Road to VR

In the last 10 months the company has listed 59 new job openings directly associated with its AR/VR activities, up 37% over the 10 months prior.

Google’s increasing hiring in the AR and VR space mirrors that of other major tech companies like Oculus, which toward the end of 2017 and into 2018 began an AR/VR hiring spree that nearly tripled the company’s usual number of job listings. And of course Apple has been not only hiring, but also making strategic acquisitions in the space, most recently picking up its very own holographic optics company.

Google’s ‘YouTube VR’ App is Coming to Oculus Go Soon

Oculus and Google are finally bringing the official YouTube VR app to the Oculus Store for Oculus Go soon. Announced previously for Gear VR back at Google’s developer conference, the YouTube VR app has only been available for a select few flagship Samsung phones.

Update (September 26th, 2018): Today at Oculus Connect 5, the company announced that the YouTube VR app is finally coming to Oculus Go. There’s no specific launch date, though the company says it will be “soon”. The app was previously only supported on a few Gear VR-compatible phones. The original article follows below:

Original Article (July 26th, 2018): Gear VR owners have been waiting for an official YouTube VR app since the headset officially launched in late 2015. Previously, users would need to navigate to YouTube via VR web browsers such as Samsung Internet or Oculus Internet, which admittedly wasn’t the best way to view the video platform’s variety of content, which spans 360-degree, VR180, and standard formats.

Another recent development in the world of YouTube VR is the app’s new social viewing mode, dubbed ‘Watch Together’. Google says in a blog post that the new feature will let you “watch and discuss videos with others in a communal, virtual space,” although we have little to go on in terms of how that’s handled outside of a short GIF showing thumbnail versions of avatars while viewing a 360 video. The company says it will be available on Daydream View and Gear VR.

The company also proclaims that YouTube VR now supports Samsung Gear VR, Daydream View, HTC Vive, and PSVR. That’s not really the whole story though.

Despite the fact that Gear VR is ‘powered’ by Oculus and hooks into the Oculus Store for content, noticeably missing from the list of supported devices is any mention of Oculus Go or Oculus Rift, the latter of which is actually supported via the YouTube VR app on Steam; it would be nigh impossible to create an HTC Vive exclusive when publishing through Steam using Valve’s OpenVR API, which comes with the added benefit of user-created custom keybindings for SteamVR-compatible controllers.

While the Gear VR app isn’t out yet, it’s possible that Oculus Go will see de facto support too, but Google just doesn’t want the headlines to skew that way right now. Oculus themselves tell developers that “most Gear VR apps will run unmodified on Oculus Go,” so it may be that when the app launches, Go users can get a crack at it too, although this is pure speculation.

– – — – –

We’ll be keeping our eyes out for the Gear VR app’s launch later this week, and will update this piece when more information arrives.

Google Adds Support For Two Controllers To Daydream SDK

Google is showing indications that its operating system for VR could be evolving toward support of two-handed games. While it remains unlikely you’ll ever be playing Job Simulator or Tilt Brush on a Lenovo Mirage Solo or a Daydream View, the new addition to Google’s Daydream SDK could lay the groundwork for those titles and other great games like Beat Saber and Superhot to eventually run on future headsets powered by Google.

The Lenovo Mirage Solo and other Daydream headsets all use a single hand controller that operates a lot like a laser pointer. The latest update to the Google VR SDK for Unity, though, adds support for two of those controllers. Functionality shouldn’t change on devices that only support one controller, but apps can now see that controller as the player’s dominant hand. On devices capable of supporting two controllers, a non-dominant hand could be shown as well.
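For a concrete sense of what that looks like in practice, here is a minimal sketch of the dominant/non-dominant hand logic the update describes. It is written in Python with hypothetical names rather than the SDK’s actual C# Unity API: single-controller devices behave exactly as before, while devices that report a second controller also expose a non-dominant hand.

```python
# Conceptual sketch only; the real Google VR SDK for Unity exposes this via C#.
# ControllerState and get_hands are hypothetical names for illustration.

from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ControllerState:
    connected: bool
    orientation: Tuple[float, float, float, float]  # IMU quaternion (w, x, y, z)

def get_hands(max_supported: int, states: List[ControllerState]
              ) -> Tuple[Optional[ControllerState], Optional[ControllerState]]:
    """Map raw controller states to a (dominant, non_dominant) pair of hands."""
    dominant = states[0] if states and states[0].connected else None
    non_dominant = None
    # Only devices that support two controllers ever report a second hand.
    if max_supported >= 2 and len(states) > 1 and states[1].connected:
        non_dominant = states[1]
    return dominant, non_dominant

# On today's Mirage Solo or Daydream View only the dominant hand is populated;
# a future two-controller device would fill in the non-dominant slot as well.
```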

Google previously revealed computer vision research showing how it could determine the location of pointer-only controllers, effectively turning them into the point-and-reach controllers that are more compelling and fun to use, but there’s no indication in the documentation of support for point-and-reach hand controllers. That means Beat Saber or Job Simulator running on a standalone headset is still something we’ll have to wait to see.  I’ve reached out to Google representatives to see if Mirage Solo or Pixel phones will support two pointer controllers, and I will update this post if they respond.

Google VR/AR Boss Confirms Commitment: ‘We’re Making Investments For The Long Term’

The Mirage Solo standalone headset powered by Google’s WorldSense tracking technology launched just days before the Google I/O developer conference, and yet a long keynote event came and went without mention of VR.

The $400 VR headset from Lenovo is the “first” Daydream standalone, but despite Google using that word to describe the Mirage Solo, no manufacturers have publicly committed to building a second one. The hardware is a big step up technically compared with Oculus Go, and it can run the entire Google Play catalog, but Lenovo’s Mirage Solo is unlikely to convince the masses that VR is a must-buy. With no new information about either internal products like Daydream View or partner products like Mirage Solo, it makes sense that some developers and early adopters might be wondering if Google is still committed to VR.

“We haven’t confirmed anything else in the making,” said Google’s head of VR and AR, Clay Bavor, in an interview at Google I/O. “I am an emphatic believer in the long term promise of VR, AR and all things as I call them ‘Immersive Computing.’ It is very clearly to me and to us more broadly at Google part of the next phase of computing — computing that makes use of our environment, that vastly increases the richness of input and output — that’s going to be important. That’s going to be a big deal. And we’re making investments for the long term.”

Over the last decade Google has partnered with other companies to enable a variety of initiatives centered around its Android operating system. In recent years, though, Google started to launch its own products while bringing more work in house. For example, the tech giant recently acquired teams from HTC that worked on its Pixel phone. Was VR overlooked at I/O because Google is shifting focus to developing VR and AR products internally?

“We think VR and AR are going to be a big space,” Bavor said. “There’s room and there are roles for both Google devices and also for working with partners.”

Google Open Sources Seurat, a ‘Surface Light-field’ Rendering Tool for 6DOF Mobile VR

Google announced Seurat at last year’s I/O developer conference, showing a brief glimpse into the new rendering technology designed to reduce the complexity of ultra high-quality CGI assets so they can run in real-time on mobile processors. Now, the company is open sourcing Seurat so developers can customize the tool and use it for their own mobile VR projects.

“Seurat works by taking advantage of the fact that VR scenes are typically viewed from within a limited viewing region, and leverages this to optimize the geometry and textures in your scene,” Google Software Engineer Manfred Ernst explains in a developer blog post. “It takes RGBD images (color and depth) as input and generates a textured mesh, targeting a configurable number of triangles, texture size, and fill rate, to simplify scenes beyond what traditional methods can achieve.”

Blade Runner: Revelations, which launched last week alongside Google’s first 6DOF Daydream headset, the Lenovo Mirage Solo, takes advantage of Seurat to pretty impressive effect. Developer studio Seismic Games used the rendering tech to bring a scene of 46.6 million triangles down to only 307,000, “improving performance by more than 100x with almost no loss in visual quality,” Google says.

Here’s a quick clip of the finished scene:

To accomplish this, Seurat uses what the company calls ‘surface light-fields’, a process which involves taking original ultra-high quality assets, defining a viewing area for the player, then taking a sample of possible perspectives within that area to determine everything that possibly could be viewed from within it.
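As a rough illustration of that sampling step (this is not the actual Seurat tool, whose pipeline and documentation live in the GitHub repo), the sketch below generates a grid of camera positions inside a small viewing box and collects an RGBD render from each position; the renderer itself is assumed to be supplied by the caller.

```python
# Illustrative sketch of sampling perspectives inside a "headbox". render_rgbd
# is a caller-supplied (hypothetical) function returning a color + depth image
# for a given camera position; Seurat's real pipeline consumes captures like
# these and bakes them into a simplified textured mesh.

import itertools
from typing import Callable, List, Tuple

Vec3 = Tuple[float, float, float]

def headbox_positions(center: Vec3, half_extent: float, per_axis: int = 4) -> List[Vec3]:
    """Regular grid of camera positions covering the player's viewing volume."""
    steps = [-half_extent + i * (2 * half_extent) / (per_axis - 1) for i in range(per_axis)]
    return [(center[0] + dx, center[1] + dy, center[2] + dz)
            for dx, dy, dz in itertools.product(steps, repeat=3)]

def capture_views(render_rgbd: Callable[[Vec3], object],
                  center: Vec3 = (0.0, 1.6, 0.0), half_extent: float = 0.5) -> list:
    """Collect everything that could possibly be seen from inside the headbox."""
    return [render_rgbd(pos) for pos in headbox_positions(center, half_extent)]
```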

This is largely useful for developers looking to create 6DOF experiences on mobile hardware, as the user can view the scene from several perspectives. A major benefit, the company said last year, is the ability to add perspective-correct specular lighting, which adds a level of realism usually considered impossible within a mobile processor’s modest compute budget.

Google has now released Seurat on GitHub, including documentation and source code for prospective developers.

Below you can see an image with and without Seurat:

VR Quiz Game ‘Go Guess’ Uses Google 360 Captures to Make You Guess Where You Are

Go Guess (2018) is a new interactive VR quiz game from developer Oblix that tosses you into Google 360 captures and makes you guess where you are. The twist? You’re playing against other people, and you can hobnob in a social VR space between rounds.

Like its 2D forebears such as GeoGuessr (2013), Go Guess tasks you with investigating the captures by hunting for clues. You might be out in the wilderness looking at plants, or on a sunny beach hunting for a sign—anything that will help you distinguish the Welsh countryside from Tasmania, or Japanese islands from the Pacific Northwest coastline.

Image courtesy Oblix

The game gives you a number of nodes to teleport to, so you can investigate the scene as much as you like. Rounds take place periodically, and between rounds players gather in a social space represented as a 3D render of Times Square in New York City; since avatars are currently named randomly, though, it may be difficult to find and make friends between matches.

Once you’ve found out where you are in the linked-together captures (or where you think you are), you can pull up a globe, mark the map and do some fine-tuning until you’re happy with the location. The closer your guess is to the actual location, the higher the points.
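The actual scoring formula in Go Guess isn’t published, but the principle is easy to sketch: measure the great-circle distance between your pin and the capture’s true coordinates, then award more points the smaller that distance is. The exponential falloff below is purely an illustrative choice.

```python
# A minimal sketch of distance-based scoring; the exact formula and point
# values Go Guess uses are assumptions here, not taken from the game.

import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def score(guess, actual, max_points=5000, falloff_km=2000.0):
    """Exponentially decaying score: a perfect guess earns max_points."""
    distance = haversine_km(*guess, *actual)
    return round(max_points * math.exp(-distance / falloff_km))

# Guessing Cardiff when the capture was in rural mid-Wales (~100 km off)
# still scores well: score((51.48, -3.18), (52.41, -3.38)) -> roughly 4750
```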

Image courtesy Oblix

Google opened up its treasure trove of 360 Street View captures to developers late last year, although there was a proviso for developing apps and games with the data: it had to be free.

The app is free and supports Oculus Rift headsets, although since it’s on Steam, it’s likely SteamVR-compatible headsets such as HTC Vive and Windows “Mixed Reality” VR headsets will work as well, albeit without appropriate controller models.

Google Demonstrates Promising Low-cost, Mobile Inside-out Controller Tracking

A number of standalone VR headsets will be hitting the market in 2018, but so far none of them offer positional (AKA 6DOF) controller input, one of the defining features of high-end tethered headsets. But we could see that change in the near future, thanks to research from Google which details a system for low-cost, mobile inside-out VR controller tracking.

The first standalone VR headsets offering inside-out positional head tracking are soon to hit the market: the Lenovo Mirage Solo (part of Google’s Daydream ecosystem), and HTC Vive Focus. But both headsets have controllers which track rotation only, meaning that hand input is limited to more abstract and less immersive movements.

Detailed in a research paper (first spotted by Dimitri Diakopoulos), Google says the lack of 6DOF controller tracking on many standalone headsets comes down to hardware expense, computational cost, and occlusion issues. The paper, titled Egocentric 6-DoF Tracking of Small Handheld Objects, goes on to demonstrate a computer-vision based 6DOF controller tracking approach which works without active markers.

Authors Rohit Pandey, Pavel Pidlypenskyi, Shuoran Yang, and Christine Kaeser-Chen, all from Google, write, “Our key observation is that users’ hands and arms provide excellent context for where the controller is in the image, and are robust cues even when the controller itself might be occluded. To simplify the system, we use the same cameras for headset 6-DoF pose tracking on mobile HMDs as our input. In our experiments, they are a pair of stereo monochrome fisheye cameras. We do not require additional markers or hardware beyond a standard IMU based controller.”

The authors say that the method can unlock positional tracking for simple IMU-based controllers (like Daydream’s), and they believe it could one day be extended to controller-less hand-tracking as well.

Inside-out controller tracking approaches like Oculus’ Santa Cruz use cameras to look for IR LED markers hidden inside the controllers, and then compare the shape of the markers to a known shape to solve for the position of the controller. Google’s approach instead aims to infer the position of the controller by looking at the user’s arms and hands rather than at glowing markers.

To do this, they captured a large dataset of images from the headset’s perspective, which show what it looks like when a user holds the controller in a certain way. Then they trained a neural network—a self-optimizing program—to look at those images and make guesses about the position of the controller. After learning from the dataset, the algorithm can use what it knows to infer the position of the controller from brand new images fed in from the headset in real time. IMU data from the controller is fused with the algorithm’s positional determination to improve accuracy.
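The paper doesn’t spell out the fusion step in implementation detail, so the sketch below is only one plausible shape for that loop: it assumes a trained model callable that maps a stereo image pair to a 3D controller position, and blends it with dead-reckoned IMU data using a simple complementary filter rather than Google’s actual method.

```python
# Sketch of a per-frame vision + IMU tracking loop (assumptions noted in the
# lead-in; this is not Google's code). `model` stands in for the trained
# neural network from the paper.

import numpy as np

def track_controller(model, stereo_frames, imu_accels, dt=1 / 30, blend=0.85):
    """Estimate controller position at camera rate.

    model:         callable (left_img, right_img) -> np.array([x, y, z]) in meters
    stereo_frames: iterable of (left_img, right_img) pairs at ~30 FPS
    imu_accels:    iterable of gravity-compensated acceleration vectors (m/s^2)
    """
    pos, vel = np.zeros(3), np.zeros(3)
    track = []
    for (left, right), accel in zip(stereo_frames, imu_accels):
        # Dead-reckon from the controller's IMU between camera frames.
        vel = vel + np.asarray(accel, dtype=float) * dt
        predicted = pos + vel * dt
        # Vision estimate from the network; per the paper, it keys off the
        # user's hand and arm, so it stays robust even when the controller
        # itself is occluded.
        observed = np.asarray(model(left, right), dtype=float)
        # Complementary filter: lean on vision, smooth with the IMU prediction.
        pos = blend * observed + (1 - blend) * predicted
        track.append(pos.copy())
    return track
```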

Image courtesy Google

A video, which has since been removed, showed the view from the headset’s camera, with a user waving what looked like a Daydream controller around in front of it. Overlaid onto the image was a symbol marking the position of the controller, which impressively managed to follow the controller as the user moved their hand, even when the controller itself was completely blocked by the user’s arm.

Image courtesy Google

To test the accuracy of their system, the authors captured the controller’s precise location using a commercial outside-in tracking system, and then compared it to the results of their computer-vision tracking system. They found a “mean average error of 33.5 millimeters in 3D keypoint prediction” (a little more than one inch). Their system runs at 30FPS on a “single mobile CPU core,” making it practical for use in mobile VR hardware, the authors say.
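For reference, a ‘mean average error in 3D keypoint prediction’ of this kind is just the average Euclidean distance between the vision system’s predicted controller positions and the outside-in system’s ground truth. A minimal sketch, assuming both are given as N×3 arrays in meters:

```python
import numpy as np

def mean_keypoint_error_mm(predicted, ground_truth):
    """Average 3D Euclidean error between predicted and ground-truth positions."""
    predicted = np.asarray(predicted, dtype=float)
    ground_truth = np.asarray(ground_truth, dtype=float)
    per_sample_error = np.linalg.norm(predicted - ground_truth, axis=1)  # meters
    return per_sample_error.mean() * 1000.0  # report in millimeters

# e.g. mean_keypoint_error_mm(vision_positions, mocap_positions) -> ~33.5
```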

Image courtesy Google

And there are still improvements to be made. Interpolation between frames is suggested as a next step, and could significantly speed up tracking, as the current model predicts position on a frame-by-frame basis rather than sharing information between frames, the team writes.

As for the dataset which Google used to train the algorithm, the company plans to make it publicly available, allowing other teams to train their own neural networks in an effort to improve the tracking system. The authors believe the dataset is the largest of its kind, consisting of some 547,000 stereo image pairs, each labeled with the precise 6DOF position of the controller. The dataset was compiled from 20 different users performing 13 different movements in various lighting conditions, they said.

– – — – –

We expect to hear more about this work, and the availability of the dataset, around Google’s annual I/O developer conference, hosted this year May 8th–10th.

Lenovo Mirage Solo Daydream Headset Now Available For Pre-order at $400

Lenovo’s Mirage Solo, a standalone VR headset in the Daydream app ecosystem, has had an official release date and price for some time now: May 5th and $400. Pre-orders are now live for both the VR headset and Lenovo’s VR180 camera.

You can pre-order Lenovo Mirage Solo on Amazon for $400 with a release date of May 6th.

There’s also a bundle featuring the Mirage Solo headset and Mirage Solo VR180 camera for $700, although the bundle doesn’t provide any discount off the combined prices; the camera costs $300 on its own.

Image courtesy Lenovo

The Mirage Solo is a standalone VR headset, meaning it has everything on board for VR, and doesn’t rely on a docked smartphone or host PC. Based on Google’s Daydream Android VR ecosystem, the headset’s biggest claim to fame is its 6 degrees of freedom (6DoF) tracking, which allows for room-scale positional tracking.

Much like on the Vive Focus, which was previously intended as a Daydream headset before HTC decided to use Viveport Mobile and release it first in China, tracking is done through the headset’s front-facing cameras, requiring no external sensors for room-scale experiences.

We went hands-on with Lenovo Mirage Solo back at CES 2018 in January, and while the system clearly has strong fundamentals, the price point puts it in a weird segment that may ward off many. The hands-on article offers a comprehensive look at the headset, so it’s definitely worth checking out if you’re still on the fence.

Specs

  • Processor: Qualcomm Snapdragon 835
  • RAM: 4GB
  • Storage: 64GB UFS
  • Card Slot: microSD, up to 256GB
  • Display: single 2560 × 1440 QHD LCD panel
  • Refresh Rate: 75Hz
  • Field of View: 110 degrees
  • Battery Capacity: 4,000 mAh (2.5 hours of general use)
  • Audio: Android N Pro Audio, 3.5mm audio jack with dual microphones
  • Controller: single 3DoF Daydream controller

What’s in the Box

  • Lenovo Mirage Solo with Daydream
  • Wireless Daydream Motion Controller
  • Travel Adapter
  • USB Type-C Cable
  • 3.5 mm VR optimized Earphones
  • User Manual
  • Quick Start Guide
  • Warranty Card


Stream Live Performances From FADER FORT in VR

The FADER have been running a special live music event at South by Southwest (SXSW) for over 15 years, dedicated to introducing the world to new and upcoming music acts. The small, intimate nature of the event means that previously only a fortunate few could see these performances; that has now changed with virtual reality (VR) video streaming.

In the past, the FADER FORT music event has allowed emerging artists to become known to a new audience, with performers such as Cardi B, Dua Lipa and Drake using it as a breakout gig. As a result, it has become one of SXSW’s most in-demand shows.

The FADER have teamed up with Google VR to open up the show to a wider audience by livestreaming the event on YouTube in VR180 video. Users will be able to watch a 3D, 4K video of the show, featuring performances from special guests such as Saweetie, Bloc Boy, Valee and Speedy Ortiz.

The VR180 stream will be available via the 2D YouTube desktop site or mobile app, or, for a more immersive experience, viewers can don a Google Cardboard, Samsung Gear VR, Google Daydream or even PlayStation VR headset to view the stream in VR.

The livestreams will be available from 14th–16th March, with the best acts viewable live during the day in VR180. An edited version of each set will then be made available for viewing at any time, for those whose time zones do not sync up so well or who are otherwise occupied during the performance times.

The FADER said on its website: “Thanks to Google’s tech, the videos look great on a regular device, but seeing them in VR definitely makes for the best viewing experience.”

The full set list can be found at The FADER official website, though some of the special guest performances are intended to be surprises and have not yet been announced.

For further news on new and upcoming VR-related events, hardware and software, keep checking back with VRFocus.