Bringing Fun & Whimsy Into Your Home with ‘Woorld’, Winner of Google Play’s ‘Best AR Experience’

Funomena’s Woorld won the ‘Best AR Experience’ category at the recent 2017 Google Play awards. In the game you scan your room with a Google Tango-enabled phone, and then you’re encouraged to decorate your space with extremely cute art and characters designed by Katamari’s Keita Takahashi. Part of the gameplay in Woorld is figuring out how to combine different objects in order to unlock new objects and portions of the story in your space.

LISTEN TO THE VOICES OF VR PODCAST

Funomena had to innovate on a lot of augmented reality user interaction paradigms and spatial gameplay in designing this game. I had a chance to catch up with Funomena co-founder and CEO Robin Hunicke at Google I/O to talk about her game design process, as well as her deeper intention of bringing sacredness, mindfulness, calmness, worship, spirituality, love, empathy, and kindness into your environment through augmented reality technology. She takes a lot of inspiration from Jodorowsky’s The Technopriests as well as the sci-fi novel Lady of Mazes by Karl Schroeder.

SEE ALSO
Exec. Producer of Indie Hit 'Journey' is Developing a Unique Title for Oculus Touch

Hunicke also sees a split emerging between commercial VR and the indie VR scene in the kind of content that’s being funded, and she talks about the importance of supporting indie game creators.


Support Voices of VR

Music: Fatality & Summer Trip

The post Bringing Fun & Whimsy Into Your Home with ‘Woorld’, Winner of Google Play’s ‘Best AR Experience’ appeared first on Road to VR.

Google’s ‘Expeditions’ Initiative is Leading Innovation in the Future of Immersive Education

Google’s overarching mission is to organize all of the world’s information, so it’s a natural fit for the company to be one of the leading innovators in using VR for immersive education. Google Expeditions was born out of a hackathon soon after the Google Cardboard launched at Google I/O 2014, and it’s since been shared with over 2 million students who have gone on virtual field trips. At I/O last week, the company had Tango demos that showed me just how compelling augmented reality is going to be in the future of collaborative & embodied educational experiences.

LISTEN TO THE VOICES OF VR PODCAST

I had a chance to catch up with Daydream’s Education Program Manager, Jennifer Holland, at Google I/O, where we talked about the history of Expeditions and how successful it’s been in creating new levels of immersion and engagement with students. She explains how the Expedition experiences are designed to be agnostic to age and subject matter, as well as independent of any specific teaching strategy or philosophy.

Google has been rapidly iterating on tools that teachers can immediately use to introduce immersive experiences into their lesson plans, while leaving a lot up to the teacher to guide and direct the interactions and group learning exercises.

Holland also talks about some of the tools that have been built into expeditions, as well as the feedback that is driving the future of immersive education towards shared augmented reality experiences with Tango-enabled devices.

One of Google’s biggest strengths in the VR community is cultivating mental presence by using open web technologies to fuse together information about our world so that we can experience it in a new way. Google Earth VR is a perfect example of fusing many different sources of data about our world, and providing an entirely new immersive experience of it in VR.

Right now, Google’s Expeditions team and their collaborators are the only ones creating educational experiences, but they’d like to eventually make it easier for people to create their own Expeditions. The Google Expeditions team announced during their Google I/O session that they’ve been using Mozilla’s WebVR framework, A-Frame, to rapidly prototype Expeditions experiences in VR, and Unity to prototype experiences in AR.

SEE ALSO
For Google, the Future of VR Is on the Open Web with WebVR & WebAR

I expect that WebVR and WebAR technologies will be a critical part of Google’s VR & AR strategies, as they’re helping to drive the standardization process with the work of WebVR primary spec author Brandon Jones. AR has the advantage over VR that students’ faces aren’t occluded, so there is a bit more collaborative learning and interaction between students, as you can see in this video of Expeditions AR:

Seeing the Tango AR experiences firsthand at Google I/O, the 6DoF inside-out tracking is so good that it’s possible to feel a sense of virtual embodiment as you walk around virtual objects locked in space. I haven’t experienced this level of tracking quality in phone-based AR before, and so it was really surprising to feel how immersive it was. You’re able to completely walk around virtual objects, which triggers a deeper level of embodied cognition in being able to interact with and make sense of the world by moving your body.

Embodied cognition is the idea that we don’t just think with our minds, but that we use our entire bodies and environments to process information. I feel that the world-locking capabilities of the Tango-enabled phones start to unlock the unique affordances of embodied cognition that usually come with 6DoF positional tracking, and it was a lot more compelling than I was expecting it to be. After seeing the Tango demos, I feel confident in saying that AR is going to be a huge part of the future of education.

Google Cardboard and Daydream haven’t generated a lot of excitement from the larger VR community, as they’re seen as gateway immersive experiences on the way to higher-end, PC-driven ones. But Google’s ethic of rapidly iterating and creating minimum viable products that are highly scalable has given them over two years of direct experience innovating in immersive education. They’ve been able to reach over 2 million students, and they’ve also been running a number of research pilot studies with these VR expeditions. Google researchers Matthew Kam and Jinghua Zhang presented some of their preliminary research at the IEEE VR Embodied Learning Workshop in March, and you can see some of the highlights in this Twitter thread, including work that’s happening to create an immersive education primer for Circle Center.

I’m really excited to see how Google continues to innovate in immersive education, and you can look forward to a solo version of Expeditions on Daydream, releasing soon, that features guided tours, history lessons, and science explainers. What Google is finding is that Expeditions is not just for students: adults are using it for casual and continuing education, enterprises for training applications, and even Major League Baseball has started to explore how immersive education experiences can engage audiences in a new way. At the end of the day, Google is showing that if you want to expand your mind and learn about the world, then Daydream & Expeditions are going to have some killer apps for you.

For more information on embodied cognition, be sure to check out these previous interviews:

You can watch the full Google Expeditions session from Google I/O.


Support Voices of VR

Music: Fatality & Summer Trip

The post Google’s ‘Expeditions’ Initiative is Leading Innovation in the Future of Immersive Education appeared first on Road to VR.

Daydream 2.0 Brings Chrome, Daydream Cast, and More

The next Daydream update, codenamed Euphrates, is getting substantial improvements that will benefit all users of the VR platform, some of them indirectly, since there are important changes under the hood as well. Google presented the update’s new features at a press event at the ongoing I/O developer conference; Daydream 2.0 “Euphrates” is expected to ship with Android O.

Google Chrome in Daydream

A dedicated app brings Google’s Chrome browser natively into VR. Conveniently, bookmarks and settings carry over: the version is said to be identical to the smartphone app and to run smoothly and comfortably in a VR headset.

Daydream Home

So that users aren’t completely cut off from real life while in virtual reality, Daydream 2.0 introduces a dashboard that can be called up from within any VR application. In the dashboard you can not only check notifications, but also change settings and switch from app to app. Google is also adapting the interface for the announced all-in-one headsets from HTC and Lenovo, which lack touch controls. The store is getting attention too: Google plans to revamp the user interface and introduce more categories so content is easier to find. With these changes, Google aims to catch up to the Oculus store and Valve’s SteamVR.

Daydream Cast

Daydream Cast uses Google Cast to stream content out of virtual reality. This lets the whole family, for example, follow a VR experience on the TV, even if only in 2D.

Daydream Sharing

Daydream Sharing goes in a similar direction: it lets users take screenshots and share them on social media. As an example, Google cited the VR painting app Tilt Brush, noting that internet users mainly discover Tilt Brush when people who use the software post about it.

Sharing 360-Degree Videos

According to Google, 360-degree videos are the most popular VR application. So that users can experience them together, friends can be invited to watch YouTube 360-degree videos. This is not a Daydream feature, however, but works through the YouTube VR app. The software also includes a social element: viewers can communicate with one another and share videos.

Under the Hood

Alongside the changes visible to Daydream users, there are also important improvements for developers: a VR compositor with multi-processor support, availability of the Vulkan API, multi-layer support, and multiview stereo rendering.

On the audio side, Google is adding support for Unity and Unreal as well as Wwise and FMOD. There are also several of Google’s own audio effects: room acoustics, for example, can be precomputed, which saves processing time. A direct connection to the sensors and real-time sensor buffering promise faster performance and lower latency.

To support the new Daydream headsets from HTC and Lenovo, the update adds support for VR systems with two displays, as well as the Qualcomm Snapdragon 835 and the ARM Mali-G71 GPU.

Source: Road to VR

The post “Daydream 2.0 bringt Chrome, Daydream Cast und mehr” first appeared on VR∙Nerds.

Google Plans to Close the Gap Between PC and Mobile Graphics with Seurat

While Google’s I/O conference has plenty of interesting news for consumers, the majority of it is geared towards developers, helping them build, create and maximise the potential of their future or current projects. In terms of virtual reality (VR), the company aims to increase the graphical fidelity of mobile experiences with a project called Seurat.

Google Seurat, named after the French painter, is a way of processing complex scenes that could previously only be handled by a desktop PC so that a mobile device can render them, all in real-time.


To achieve this, Andre Doronichev, director of product management at Daydream, explained: “As a developer you define a volume, one in which you wish the user to move around and view your scene. You also define parameters like the number of polygons and overdraw. And then you let the tool do its magic. It takes dozens of images from different parts of the defined volume, and then it automatically generates an entirely new 3D scene that looks identical to the original, but is dramatically simplified. And you can still have dynamic interactive elements in it.”
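Google hasn’t published Seurat’s internals, but the first step Doronichev describes, capturing many views from within a developer-defined volume, is easy to picture in code. The sketch below is purely illustrative: it generates a grid of capture positions inside a hypothetical “headbox” volume; the function name and parameters are our own, not part of any Seurat API.

```python
import itertools

def sample_headbox(min_corner, max_corner, steps=3):
    """Generate a grid of capture positions inside a user-defined
    viewing volume (the 'headbox'), with `steps` samples per axis."""
    axes = []
    for lo, hi in zip(min_corner, max_corner):
        if steps == 1:
            axes.append([lo])
        else:
            # Evenly spaced samples from lo to hi, inclusive.
            axes.append([lo + (hi - lo) * i / (steps - 1) for i in range(steps)])
    return list(itertools.product(*axes))

# A 1m x 1m x 0.5m volume sampled at 3 steps per axis yields 27 viewpoints,
# matching the "dozens of images" scale Doronichev mentions.
positions = sample_headbox((0.0, 0.0, 0.0), (1.0, 1.0, 0.5), steps=3)
print(len(positions))  # 27
```

In a Seurat-like pipeline, each of these positions would be rendered once from the full-detail scene, and the resulting images would drive the generation of the simplified geometry.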

Doronichev then went on to showcase a project Google made in collaboration with ILMxLAB, that highlighted how Seurat could even be used for projects like a movie scene, which require even more processing power. In the video below ILMxLab’s executive creative director John Gaeta said: “[Seurat] potentially opens the door to cinematic realism in VR.”

Seurat already supports Unreal Engine, Unity and Maya, and Google is testing the tool with a select group of partners currently, prior to rolling it out later this year.

VRFocus will continue its coverage of Google I/O, reporting back with all the latest updates.

I/O 2017: Google Unveils VPS – Indoor Navigation Through AR

Clay Bavor, VP of VR for Google, took the stage to share intriguing new information for immersive technology. He made a blockbuster announcement for the new standalone VR headset Google is working on with HTC and also shared that the S8 will finally be getting Daydream support.

Not to be left out, AR also got some shine from Bavor and there are some cool things down the pipeline.

First up, the second-generation Tango AR phone will go on sale this summer. The new Asus ZenFone AR is a far cry from the first Tango-equipped device, adopting the small form factor currently prevalent across the mobile phone industry. Bavor didn’t spend very much time on hardware before he shifted right into new technology that could send shockwaves across the mobile industry.

“AR is most powerful when it’s tightly coupled to the real world,” says Bavor. “The more precisely, the better.” Google has been working with the Google Maps team to get precise indoor location data. The result is Visual Positioning Service, or VPS, which uses your phone to find distinct visual features in your surroundings to triangulate your position and get you to your desired space.

The example on stage showed a VPS-equipped phone guiding a user directly to the specific screwdriver he or she needed in a Lowe’s store. The visual representation of how this works showed the phone’s camera marking “feature points” with different colored dots. The system recognizes where items are in the space down to within a few centimeters, and the user interface then shows navigation-like elements as the user is guided down the aisles.
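Google hasn’t detailed how VPS computes a position from those feature points, but the underlying geometric idea, fixing your location from distances to landmarks with known coordinates, is classic trilateration. The sketch below is a minimal 2D illustration of that general principle, not Google’s algorithm; the landmark coordinates and distances are entirely hypothetical.

```python
def trilaterate(landmarks, distances):
    """Estimate a 2D position from three recognized landmarks with known
    map coordinates and estimated distances to each.

    Subtracting the first circle equation (x-x1)^2+(y-y1)^2=d1^2 from the
    other two cancels the quadratic terms, leaving a linear 2x2 system.
    """
    (x1, y1), (x2, y2), (x3, y3) = landmarks
    d1, d2, d3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # assumes non-collinear landmarks
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# Hypothetical store map: three feature landmarks (meters) and the
# phone's estimated distance to each; the true position is (1, 1).
landmarks = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
distances = [2.0**0.5, 10.0**0.5, 5.0**0.5]
print(trilaterate(landmarks, distances))  # ≈ (1.0, 1.0)
```

A real system like VPS would fuse many more features, camera pose, and inertial data, but the centimeter-level claim ultimately rests on geometry of this kind.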

Bavor followed the demo up with an anecdote on how an audio-based version of VPS could help those with impaired vision and “transform how they make their way through the world”.

He also revealed that VPS will be one of the core capabilities of Google Lens, which we recently covered as well. A lightweight pair of AR-equipped glasses with Lens tech and VPS would be incredible to experience and, hopefully, we’ll see that initiative come to fruition in the near future.

Lastly, Bavor tackled AR’s capacity as an educational tool. With over 2 million students served by the Expeditions VR experience, which gives teachers a way to travel with their students without leaving the classroom, Google is now adding an AR mode to give students an augmented way to learn about things seen in the classroom.

The video shown displayed a classroom where each student was equipped with phones on selfie sticks as they watched an augmented volcano erupt on their desk and a tornado take shape in the class. The AR mode got the students up, moving, and excited about the things popping up in their learning environment.

The implication is that Google is gearing up to take Tango and AR the extra mile as it adds more and more functionality. It will be interesting to see what happens next.


Watch Google’s ‘Visual Positioning Service’ AR Tracking in Action

At today’s annual Google I/O developer conference, Clay Bavor, VP of Virtual Reality at Google, announced a new augmented reality service called Visual Positioning Service, the latest development for the Tango platform that promises to not only precisely map the world around you in concert with Google Maps, but also give you a ‘GPS-like’ turn-by-turn navigational experience when you’re indoors. In addition, he announced an AR mode for an upcoming educational tool called Google Expeditions.

Following some major VR announcements including Daydream-compatible standalone VR headsets in the works from HTC and Lenovo, Bavor moved on to Google’s AR operations, confirming that the Asus Zenfone AR, the second Tango-enabled consumer smartphone, is still on track for a summer 2017 launch.

Google Tango is a smartphone-based AR platform that has the ability to map the world around you in real-time using a number of on-board sensors and the phone’s camera.

image courtesy Google

The Visual Positioning Service (VPS) revealed on stage, which combines Tango’s inside-out tracking system with Google Maps, provides “very precise location information indoors,” claims Bavor. As an example, Bavor described looking for a specific screwdriver at a Lowe’s home improvement store. Holding up a VPS-enabled phone inside the store will allow the system to know exactly where you are “within a few centimetres”, and will be able to direct you (based on previously collected data) to the exact tool you were searching for, sort of like an indoor GPS complete with turn-by-turn directions.

Google says VPS works today in partner museums and select Lowe’s stores.

image courtesy Google

In other words, “GPS can get you to the door, VPS can get you the exact item”. In the future, VPS combined with an audio interface could transform the way visually-impaired people move around the world. It will also be “one of the core capabilities of Google Lens”—a new image recognition initiative also announced today.

Finally, Bavor announced AR is being added to Google Expeditions, the popular ‘virtual field trip’ tool for education. Introduced two years ago for the inexpensive Cardboard VR headset, Expeditions has since been used by 2 million students. The new AR mode, demonstrated on video in a classroom with students holding several Zenfone ARs on selfie sticks, was described as “the ultimate show and tell.” The Expeditions AR mode will be added later this year.

The post Watch Google’s ‘Visual Positioning Service’ AR Tracking in Action appeared first on Road to VR.

VR Development Gets Easier With Vitals in Android O

During the Google I/O conference, Stephanie Saad Cuthbertson, Director of Product Management for Android, spoke about the Android O developer preview, due for release later in the summer. The new Android version focuses on two areas: Fluid Experiences, for user interaction and creating the best user experience, and Vitals, for battery performance and reliability.

Several new tools are being introduced specifically for developers working with Android apps. Cuthbertson spoke about the three main fundamentals of the Android experience: battery life, start-up time and stability. As such, Google is introducing tools like security enhancements that protect phones and tablets by detecting harmful apps and disabling them. Google Play Protect can scan apps to make sure they are all safe and don’t pose a threat. Boot time on O is twice as fast as in previous versions of Android on Pixel devices, and apps will also run faster.

It was also noted that apps themselves can have a large impact on the system. Some apps which run continuously in the background can consume lots of system resources and battery life, so Android O will introduce limits on background apps so they don’t continue to drain the battery when something goes wrong.

A feedback suite for developers is also being introduced. Titled the Play Console Dashboard, the utility will show the issues that cause battery drain, crashes and UI slowdown. For each issue an app has, it will show how many users are affected and give guidance on how to fix the problem. A profiler is available to visualise the problems happening within the app: the unified visual profiler shows network, CPU and memory activity clearly on a single timeline. For example, when looking at the CPU it is possible to view the call stack, check how long every call is taking, and jump to the exact line of code to fix a problem.

VRFocus will bring you further updates on Android O and VR apps as it becomes available.

Google I/O 2017: Qualcomm, HTC, Lenovo and Google Partner on an All-in-One Headset

A few hours before the Google I/O 2017 developer conference, the rumor mill was churning: Google was expected to unveil an all-in-one headset, possibly even one that delivered mixed reality, meaning hardware usable for both AR and VR. It didn’t quite turn out that way; instead, Google appears intent on consolidating the various VR efforts, and is relying on Qualcomm to do it.

Qualcomm, HTC, Lenovo and Google Join Forces

The most important processor maker in the Android market has been researching VR for some time and presented a reference design for an all-in-one headset, which we were able to try out at IFA 2016. The headset will support Daydream and will likely use the current Snapdragon 835. Google and Qualcomm do not intend to sell the headset themselves, however, but to let hardware partners adopt the design and handle manufacturing. Microsoft is pursuing the same approach with its new mixed reality headsets. Another commonality is inside-out tracking, which means the Google/Qualcomm and Microsoft systems require no external cameras or laser towers. Google calls its system WorldSense. How it compares to conventional solutions, and above all to Microsoft’s mixed reality headsets, remains to be seen in testing.

Google also named its first hardware partners. The announcement that HTC will build a system is hardly surprising: the company currently supplies the hardware for the smaller Google Pixel model and, above all, has extensive VR experience with the HTC Vive. The second partner is Lenovo, which has already brought smartphones with Google’s 3D AR system Tango to market. Prices for the new headsets remain open; Google gave only a vague launch window of late 2017.

The post “Google I/O 2017: Qualcomm, HTC, Lenovo und Google kooperieren für All-in-One-Headset” first appeared on VR∙Nerds.

Samsung Galaxy S8 Gets Google Daydream Support

Google Daydream is already compatible with many phones, including the upcoming LG flagship smartphone launching later in the year. Google also announced during I/O that Samsung’s latest smartphones will be getting Daydream support.

Clay Bavor, Vice President of Virtual Reality at Google, announced during the presentation that the Samsung Galaxy S8 and S8+ would be getting an update introducing Google Daydream support at some point during the summer. Samsung is one of the world’s most popular smartphone manufacturers, especially in the West, and is the maker of the Samsung Gear VR, one of Google Daydream’s main competitors.

The Samsung Galaxy S8 has a 5.8 inch quad HD screen using what Samsung refers to as the Infinity Display. It is powered by a Qualcomm Snapdragon 835 processor with 4GB of RAM and 64GB of internal storage, which means a lot of virtual reality apps can be stored on the device. The S8+, meanwhile, has a 6.2 inch Infinity Display.

The LG smartphone referred to by Bavor will probably be the LG G6, which has already launched in South Korea and is due to launch in the US and Europe in the summer. The LG G6 has a 5.7 inch display with an 18:9 aspect ratio and a resolution of 1440×2880, and is powered by a Qualcomm Snapdragon 821 processor.

Starting in the summer, Samsung Galaxy S8 and LG G6 users will be able to experience apps such as Along Together, Hungry Shark VR and Polaris on Google Daydream.

VRFocus will continue to keep you informed on updates regarding the Google Daydream.