Google’s New ‘Labs’ Team Brings AR/VR, Project Starline & Area 120 Under a Single Roof

Google is shaking things up, reorganizing its AR/VR efforts, Project Starline, and its Area 120 in-house incubator into a single internal team dubbed ‘Google Labs’.

As first reported by TechCrunch, Google is shifting a few of its notable forward-looking projects into a single team and bringing them under the leadership of Google veteran Clay Bavor.

Before taking the reins of Google Labs, Bavor led the company’s AR/VR team, where he oversaw the 2016 launch of its Android-based Daydream VR platform. It was an ambitious undertaking, although it was subsequently abandoned in 2019 due to a disappointing reception to its slot-in smartphone efforts and the poor market performance of its 6DOF Daydream headset, the Lenovo Mirage Solo. The team also helped develop ARCore, the augmented reality software development kit that brought smartphone-based AR to millions of Android devices.

More recently, Bavor led Google’s Project Starline, an experimental light field display system that the company envisions as a “magic window” of sorts, allowing far-flung users to speak in a more natural way than video conferencing apps can provide—and all without the need for a headset or special glasses.

Both Project Starline and Google’s AR/VR efforts share a lineage within the company, but it seems Google is adding an entrepreneurial flair to Labs with the inclusion of Area 120, the in-house tech incubator that has seen the successful launch of several startups including Threadit, Stack, AdLingo, GameSnacks, Avera AI, and Orion WiFi.

And no, this doesn’t mean the company is reviving the 2000s-era Google Labs, which was used as a public testbed to demonstrate new projects like Gmail, Google Calendar and Google Wave. An internal company memo obtained by TechCrunch states the reorganization is “focused on starting and growing new, forward-looking investment areas across the company.”

“Central to this org is a new team called Labs, focused on extrapolating technology trends and incubating a set of high-potential, long-term projects,” the memo said.

As a result, it appears Area 120 is being elevated with its incorporation into Labs. TechCrunch notes that the incubator was “three layers deep in terms of reporting to Google CEO Sundar Pichai — even though Pichai himself had to sign off on its every exit.”

Google hasn’t officially announced Labs; however, the company tacitly confirmed it by acknowledging Bavor’s new title, calling it “an expanded role” that will focus on “long-term technology projects that are in direct support of our core products and businesses.”

Google Trials ‘Starline’ Glasses-Free Light Field Display

Google’s research into light fields is bearing fruit with a glasses-free 3D display technology called “Project Starline” available at a few of its offices.

Google revealed the work as part of its annual developer conference this week. It is pitched as working like a “magic window” and relies on “custom-built hardware and highly specialized equipment” with advances in real-time compression, spatial audio, computer vision, and machine learning to provide a sense of being face to face with someone no matter the physical distance.

The image below posted by Google’s Vice President of AR and VR Clay Bavor offers a look at the substantial footprint for the system while it is used in one of Google’s offices.

Google also posted a video showcasing the technology in use for person-to-person interactions, which is said to provide “a sense of volume and depth that can be experienced without the need for additional glasses or headsets.” The company says it is planning trial deployments with enterprise partners later this year.

We tested some early-stage glasses-free light field display technology in 2018, and it was clear it would require years more development and enormous investment to improve brightness and bring down cost enough to put it within reach of average consumers. In our 2018 demonstration from Light Field Lab, for instance, the 3D effect only worked if you kept your head in a very specific area relative to the display. Indeed, even with Google claiming key breakthroughs in its effort to prove its glasses-free 3D display technology is a direction “technology can and should go”, the company cautions that only “some of these technical advancements” are likely to make it into its communication products.

Still, we’d love to go eyes-on with Project Starline at some point for a better sense of its use cases and the investment Google will need to make to bring its advancements into wider use.

Google VR/AR Boss Confirms Commitment: ‘We’re Making Investments For The Long Term’

The Mirage Solo standalone headset powered by Google’s WorldSense tracking technology launched just days before the Google I/O developer conference, and yet a long keynote event came and went without mention of VR.

The $400 VR headset from Lenovo is the “first” Daydream standalone, but despite Google using that word to describe the Mirage Solo, no manufacturer has publicly committed to building a second one. The hardware is a big step up technically compared with Oculus Go, and it can run the entire Google Play catalog, but Lenovo’s Mirage Solo is unlikely to convince the masses that VR is a must-buy. With no new information about either internal products like Daydream View or partner products like the Mirage Solo, it makes sense that some developers and early adopters might be wondering if Google is still committed to VR.

“We haven’t confirmed anything else in the making,” said Google’s head of VR and AR, Clay Bavor, in an interview at Google I/O. “I am an emphatic believer in the long term promise of VR, AR and all things as I call them ‘Immersive Computing.’ It is very clearly to me and to us more broadly at Google part of the next phase of computing — computing that makes use of our environment, that vastly increases the richness of input and output — that’s going to be important. That’s going to be a big deal. And we’re making investments for the long term.”

Over the last decade Google has partnered with other companies to enable a variety of initiatives centered around its Android operating system. In recent years, though, Google started to launch its own products while bringing more work in house. For example, the tech giant recently acquired teams from HTC that worked on its Pixel phone. Was VR overlooked at I/O because Google is shifting focus to developing VR and AR products internally?

“We think VR and AR are going to be a big space,” Bavor said. “There’s room and there are roles for both Google devices and also for working with partners.”

Make 180VR Videos With The New YI Horizon VR180 Camera From Google & YI Technology

Google was perhaps a tad overshadowed by all that was going on with its virtual reality (VR) rivals at this year’s International Consumer Electronics Show (CES), what with the HTC Vive Pro being announced on one side and Xiaomi revealing not only a partnership with Oculus (and Qualcomm) but also a new standalone headset for the Chinese market, the Mi VR Standalone. That’s not even counting the 4K gaming standalone HMD being brought out by iQIYI. Google did reveal, with its partner Lenovo, its own big standalone news in the form of an update regarding the Mirage Solo. However, the company wasn’t done there and had another piece of hardware to show at the event.

YI Horizon VR180 Camera

Google has teamed up with YI Technology, an internationally focused company working in the field of advanced intelligent imaging technologies, and the pair unveiled a new immersive camera, the YI Horizon VR180.

As you may have guessed from the name, the camera captures experiences in 180 degrees and integrates with both Google Photos and YouTube using the VR180 format the company has rolled out to each. Video is captured at 5.7K resolution and 30fps, with support for instant in-device stitching and live streaming. A finalised video project can be viewed in VR through compatible headsets, which naturally include Google Cardboard and Google Daydream, though PlayStation VR also gets a specific mention.

Key features listed by the companies are as follows:

  • Capture stunning 5.7K resolution photos and video
  • One-button live video streaming
  • 2.2 inch, 640×360 retina touch screen
  • Features Type-C USB port with HD speed data transmission
  • Professional-grade 4 microphone design with outdoor noise reduction
  • Easily manage photos and videos by using the VR180 app by Google

Speaking on the reveal, YI Technology CEO Sean Da said: “Whether it is to demonstrate a makeup tutorial or share rich experiences from a recent trip with friends and loved ones, the Camera provides an easy and deeply engaging way to capture many of life’s most special moments. When the opportunity to work closely with Google’s VR team on this initiative presented itself, it was a no-brainer given the team’s extraordinary commitment to defining the future of VR experiences and extensive investment in the YouTube VR180 format. We believe the YI Horizon VR180 Camera will bring endless amounts of joy to all that use the device.”

Clay Bavor, Google’s Vice President of VR, was also on hand. “We introduced VR180 as a way to make capturing high quality, immersive photos and video easy for consumers and professional creators. YI’s camera has amazing image quality, as well as features that we think will be compelling for creators like livestreaming and a preview display. We’re excited to see what consumers and creators are able to capture and bring to virtual reality.”

You can see a video below demonstrating the technology, which will play in VR180 if you have one of the compatible headsets (it is not really possible to get the full experience on YouTube alone). VRFocus will have more information about the camera as it is revealed.

‘Blade Runner: Revelations’ for Daydream to Launch with 6DoF Tracking Support

A new adventure set in the Blade Runner universe from developer Seismic Games is coming soon to the Daydream mobile VR platform. The game is optimised for WorldSense, a new six degrees of freedom (6DoF) positional tracking system developed by Google, found in upcoming mobile devices such as the standalone Lenovo Mirage Solo.

Announced at Google I/O last year, the Mirage Solo is the first standalone mobile VR headset using the Daydream VR platform. Google recently revealed that it is also the first device to feature WorldSense, their new self-contained tracking system based on “years of investment in simultaneous localization and mapping,” that claims to deliver “PC-quality positional tracking.”

Alcon Media Group has partnered with Google and Lenovo to bring Blade Runner: Revelations to Daydream. Seismic Games, who acquired mixed reality specialists Grue Games in late 2016, have a history with major IP such as Skylanders and Call of Duty. The studio teased a potential Blade Runner connection in September.

image courtesy Lenovo

In this new narrative adventure, players assume the role of Harper, a seasoned Blade Runner who “unravels a twisted replicant plot that threatens the delicate balance of Los Angeles in 2023.” According to the press release, players will “search for evidence with the help of their flying spinner, deadly blaster, and esper image reconstruction to try to solve the mystery in an immersive VR environment.”

While the game will work on all Daydream devices, it is said to be “best experienced” on the Mirage Solo. WorldSense support means that players can “duck, dodge and lean,” as well as stepping “backwards, forwards, and side to side, unlocking new gameplay elements that bring the world of Blade Runner to life.”

“We’re working closely with developers to bring new experiences to the platform that take advantage of all these new technologies,” writes Clay Bavor, Google’s Vice President of Virtual and Augmented Reality. Blade Runner: Revelations is the first example of Google’s partnerships with developers to bring 6DoF-optimised games to mobile VR.

“It’s the most immersive way to access Daydream,” he says, noting the headset’s wide field of view (110°) and “advanced blur-free display”. The Lenovo Mirage Solo is due to launch in Q2 2018.

Lenovo Unveils Mirage Solo Daydream Standalone VR Headset

Lenovo today unveiled the first standalone VR headset in Google’s Daydream mobile VR ecosystem, the Mirage Solo. Revealed today at CES, the headset integrates Google’s WorldSense six degrees of freedom (6DoF) positional tracking alongside the headset’s 3DoF controller.

First teased back at Google I/O, Lenovo’s then unnamed headset was set to arrive alongside a similar 6DoF headset from HTC; however, HTC recently scrapped plans to bring its headset to the West with Daydream support, instead releasing it in China with a mobile version of the Viveport content store. Lenovo’s headset is the first and currently the only standalone headset in the Daydream app ecosystem.

Clay Bavor, Vice President of Virtual and Augmented Reality at Google, says Lenovo’s headset marks an important shift for the Daydream platform, giving the user a “more immersive and streamlined way to experience the best of what Daydream has to offer without needing a smartphone.”

image courtesy Lenovo

There’s no definitive release date yet, but the headset is expected to hit shelves in Q2 2018. As for pricing, Lenovo says they’re working on reducing the price “so that it’s accessible to more people.” The company says the Mirage Solo will be priced “under $400.” While not explicitly stated, we’re reading between the lines here when we say it’ll likely have a $399.99 price tag, but that remains to be seen.

Mirage Solo is said to give users access to the entire Daydream catalog of over 250 apps and games, including Google apps like YouTube VR, Street View, Photos, and Expeditions. Notably, Google says a game designed for Lenovo Mirage Solo’s 6DoF capabilities based on Blade Runner 2049 (2017) called Blade Runner: Revelations will come to the headset as well. “We’re working closely with developers to bring new experiences to the platform that take advantage of all these new technologies,” the company says.

image courtesy Lenovo

Lenovo Mirage Solo Specs

  • Dimensions (W x L x H): 204.01 x 269.5 x 179.86 mm (8.03″ x 10.61″ x 7.08″)
  • Weight: 645 g (1.42 lbs)
  • Color: Moonlight White
  • Operating System: Daydream OS
  • Processor: Qualcomm Snapdragon™ 835
  • Audio: Android N Pro Audio, 3.5 mm Audio Jack with Dual Microphones
  • RAM: 4 GB
  • ROM: 64 GB UFS
  • Card Slot: microSD Card, Up to 256 GB
  • Battery: 4000 mAh Li-ion Polymer (standby and general usage time TBA)
  • Display: 5.5″ QHD (2560 x 1440) LCD, 75 Hz
  • Lens: 2 x Fresnel-Aspheric, 110° FOV
  • WLAN: WiFi 802.11 ac/n 2×2 MIMO Dual Band
  • Bluetooth: Bluetooth® 5.0 + BLE

What’s in the Box

  • Lenovo Mirage Solo
  • Daydream Wireless Motion Controller
  • Travel Adapter
  • USB Type-C™ Cable
  • 3.5 mm Earphones
  • User Manual
  • Quick Start Guide
  • Warranty Card

Google is Developing a VR Display With 10x More Pixels Than Today’s Headsets

Earlier this year, Clay Bavor, VP of VR/AR at Google, revealed a “secret project” to develop a VR-optimised OLED panel capable of 20 megapixels per eye. The project was mentioned during SID Display Week 2017 but has gone largely under the radar as little information has surfaced since.

Following a general overview of the limits of current VR technology, and an announcement that Google is working with Sharp on developing LCDs capable of VR performance normally associated with OLED, Bavor revealed an R&D project that hopes to take VR displays to the next level. A video of the session comes from ARMdevices.net’s Nicolas “Charbax” Charbonnier.

“We’ve partnered deeply with one of the leading OLED manufacturers in the world to create a VR-capable OLED display with 10x more pixels than any commercially available VR display today,” Bavor said. At 20 megapixels per eye, this is beyond Michael Abrash’s prediction of 4Kx4K per eye displays by the year 2021.

“I’ve seen these in the lab, and it’s spectacular. It’s not even what we’re going to need in the ‘final display’,” he said, referring to the sort of pixel density needed to match the limits of human vision, “but it’s a very large step in the right direction.”

Bavor went on to explain the performance challenges of driving 20 MP per eye at 90-120 fps, which works out to unreasonably high data rates of 50-100 Gb/sec. He briefly described how foveated rendering combined with eye tracking and other optical advancements will allow for more efficient use of such super-high-resolution VR displays.
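
For a rough sense of where numbers like that come from, here is a minimal back-of-the-envelope sketch in Python; the 24-bit colour depth and the per-eye versus both-eyes breakdown are our own illustrative assumptions, not figures quoted by Bavor:

# Back-of-the-envelope estimate of the uncompressed video bandwidth implied
# by a 20 MP-per-eye display at VR refresh rates. Bit depth and eye count
# are assumptions for illustration, not numbers from Bavor's talk.

def raw_data_rate_gbps(megapixels_per_eye, fps, bits_per_pixel=24, eyes=2):
    """Uncompressed pixel data rate in gigabits per second."""
    pixels_per_frame = megapixels_per_eye * 1e6 * eyes
    return pixels_per_frame * bits_per_pixel * fps / 1e9

for fps in (90, 120):
    single = raw_data_rate_gbps(20, fps, eyes=1)
    both = raw_data_rate_gbps(20, fps)
    print(f"{fps} fps: ~{single:.0f} Gb/s per eye, ~{both:.0f} Gb/s for both eyes")

Depending on whether you count one eye or both, the uncompressed figure lands roughly between 40 and 115 Gb/s, broadly in line with the 50-100 Gb/sec Bavor cites, and it explains why he points to eye-tracked foveated rendering as a way to avoid shading and transmitting every pixel at full resolution.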

Google’s AR/VR Chief on Ambient Computing, Conversational Interfaces, & AR/VR Strategy

At Google’s big October 4th press conference, the company announced a new Pixel 2 phone and a range of new ambient computing devices powered by AI-enabled conversational interfaces, including the new Google Home Mini and Max speakers, the Google Clips camera, and wireless Pixel Buds. The Daydream View mobile VR headset received a major upgrade with vastly improved comfort and weight distribution, reduced light leakage, better heat management, cutting-edge aspherical Fresnel lenses with improved acuity and a larger sweet spot, as well as a field of view 10-15 degrees wider than the previous version. It’s actually a huge upgrade and improvement, but VR itself received only a few brief moments during the 2-hour keynote, where Google explained the AI-first design philosophy behind its latest ambient computing hardware releases.

I had a chance to sit down with Clay Bavor, Google’s Vice President for Augmented and Virtual Reality to talk about their latest AR & VR announcements as well as how Google’s ambient computing and AI-driven conversational interfaces fit into their larger immersive computing strategy.

YouTube VR is on the bleeding edge of Google’s VR strategy, and their VR180 livestream camera can broadcast a 2D version that translates well to watching on a flat screen, but also provides a more immersive stereoscopic 3D VR version for mobile VR headsets.

Google retired the Tango brand with the announcement of ARCore on August 29th, and Bavor explains that they had to come up with a number of algorithmic and technological innovations in order to standardize the AR calibration process across all of their OEM manufacturers.

Finally, Bavor reiterates that WebVR and WebAR are a crucial part of Google’s immersive computing strategy. Google showed its dedication to the open web by releasing experimental WebAR browsers for ARCore and ARKit so that web developers can build cross-compatible AR apps. Bavor sees a future that evolves beyond the existing self-contained app model, but this requires a number of technological innovations, including contextually-aware ambient computing powered by AI as well as the Visual Positioning Service announced at Google I/O. There are also a number of other productivity applications that Google is continuing to experiment with, but screen resolution still needs to improve from a visual acuity of around 20/100 to something closer to 20/40.

After our interview, Bavor was excited to tell me how Google created a cloud-based, distributed physics simulator that could model 4 quadrillion photons in order to design the hybrid aspherical Fresnel lenses within the Daydream View. This will allow Google to create machine-learning-optimized approaches to designing VR optics in the future, and it will also likely have implications for VR physics simulations and potentially for delivering volumetric digital light fields down the road.

Google’s vision of contextually-aware AI and ambient computing has a ton of privacy implications that are similar to my many open questions about privacy in VR, but I hope to open up a more formal dialog with Google to discuss these concerns and potentially new concepts of self-sovereign identity and new cryptocurrency-powered business models that go beyond their existing surveillance capitalism business model. There wasn’t a huge emphasis on Google’s latest AR and VR announcements during the press conference as AI conversational interfaces and ambient computing received the majority of attention, but Google remains dedicated to the long-term vision of the power and potential of immersive computing.


ARCore Support Confirmed For Galaxy S8+, Note8 & Future Samsung Phones

One of the big pushes throughout last week’s Oculus Connect 4 was the emphasis, as always, on creators and developers. But that was last week, and today saw Samsung take to the stage in San Francisco, California for the Samsung Developer Conference.

It was, however, Google that revealed perhaps the biggest news coming out of the event: that the two tech titans, rivals in the battle for smartphone-based virtual reality (VR), would be uniting their products on the augmented reality (AR) stage.

Taking to that stage was Google’s Vice President of Virtual and Augmented Reality, Clay Bavor, who confirmed to crowds during the first day’s keynote that the two companies would be bringing Google’s AR creation platform ARCore not only to the Samsung Galaxy S8+ smartphone but also to the Samsung Galaxy Note 8 phablet, both of which use the Android operating system developed by Google. Bavor also confirmed that future Samsung phones, presumably in the Galaxy line, would be receiving ARCore support as well.

It’s the latest in a series of announcements in the battle over AR development, which began in June this year when Google’s rival Apple first revealed its own AR developer platform, ARKit; Apple continues to be vocal about its support for AR over VR. ARCore was subsequently revealed in August with initial support for the Google Pixel, Pixel XL and the Samsung Galaxy S8. This move deepens that initial bond, and speaking on Twitter, Bavor confirmed he was “excited” to be working with Samsung further.

Following a 2017 that has seen interest in developing AR apps skyrocket thanks to both ARKit and ARCore, it is surely only a matter of time before Apple’s next move. When we get news on any further developments with either AR platform, we will of course let you know on VRFocus.

Clay Bavor of Google VR Talks About the Future of VR and AR

There has been some discussion lately over whether the future is virtual reality (VR) or augmented reality (AR). Our own Kevin Eva has weighed in on the topic, along with numerous analysts and critics. Google VR boss Clay Bavor has also expressed his opinions on the subject.

In a recent episode of the Recode podcast ‘Too Embarrassed to Ask’, Bavor laid out how he feels about the possibilities inherent in VR: “Imagine you have some glasses and you put them on and you feel like you’re completely transported somewhere else: Courtside at a Warriors game, or Machu Picchu, or maybe back in a moment in your life that you’ve recorded. The reality is, we’re at the very beginning of the journey to that fully realized version of VR.”

Bavor spoke of Google Cardboard as something designed to offer users just a taster of what is possible with VR, which is why it was made from cardboard: to give as many people as possible the chance to experience what VR could offer.

As for the rivalry between VR and AR, Bavor doesn’t believe it needs to be a competition between them: “They’re different points on the same spectrum of — I call it immersive computing,” he said. “I don’t really care about the label, but it’s this idea that you have computing and digital imagery that feels like it’s there. Virtual reality, everything is computer-generated; augmented reality, you have bits and pieces of digital information overlaid on your environment.” As he began waxing enthusiastic about the strengths of AR, Bavor added: “One [use] I am very excited about is navigation. I spend my day in this building, I still can’t find all the conference rooms. Imagine that you could just look through your phone: ‘Oh, your next meeting is over there.’ Or you get out of a taxi or an Uber; you’re disoriented. It could just say ‘You’re over there,’ [and] overlay footsteps on the ground, leading the way.”

VRFocus will continue to report on new developments within the AR and VR areas.