‘Neos’ Aims to be the Google Docs of VR World Building

We catch up with Sightline creator Tomáš ‘Frooxius’ Mariančík, whose latest project Neos – an effort to build the Google Docs of VR world building – is progressing rapidly.

We’ve not heard much from Tomáš ‘Frooxius’ Mariančík, the developer behind the excellent Sightline series of VR experiences, in a little while. The last we heard, he had embarked on an ambitious project, one that he claimed would allow teams of creators to collaborate seamlessly on shared VR worlds. After months of non-stop work, Mariančík has emerged ready to show off his progress on Neos, a collaborative ‘world engine’ that he hopes will provide the platform for building the metaverse, entirely within virtual reality. A Google Docs for VR world building, if you will.

To demonstrate how far the project has come, Mariančík has produced an hour-long video in which he and a colleague, each in a separate physical room, join one another within the same virtual space.

They begin in what Mariančík calls Neos’ “scratch space”, the ‘construct’ area from which collaborators start their creative journey. The duo then go on to fill the world, starting with basic, hand-drawn constructs and rapidly progressing to more advanced and elaborate world building. By the time the pair have finished, they’ve created a realistic office environment built from assets captured with photogrammetry, and covered the world’s surfaces with countless copies of Sonic the Hedgehog videos. As you do.


One of the central ideas behind Neos is to offer collaborators intuitive, humanistic tools inspired by the way we interact with the real world. For example, if you want to attach one virtual object to another, you can apply virtual glue: a UI concept represented by a sphere in which you hold the objects and wait for the glue to ‘dry’; once it has, the two objects are stuck together. Or, if you want to apply textures to an object, simply grab one from an array of materials (represented by spheres), pop it into the ‘material gun’ on one of your controllers and apply that material to as many objects as you like.
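To make the concept concrete, here’s a rough sketch – purely illustrative, not Neos’ actual API – of how a glue operation might be modelled: a timed step that, once the virtual glue has ‘dried’, parents one object to another so the pair move as one.

    # Hypothetical sketch of the 'virtual glue' idea (not Neos' real code):
    # once the glue has had time to 'dry', one object is parented to the
    # other in the scene graph so that it rigidly follows it from then on.
    from dataclasses import dataclass, field

    @dataclass
    class SceneObject:
        name: str
        position: tuple = (0.0, 0.0, 0.0)      # simplified world-space position
        parent: "SceneObject | None" = None
        children: list = field(default_factory=list)

    def apply_glue(a: SceneObject, b: SceneObject, dry_time: float, elapsed: float) -> bool:
        """Glue b to a once the glue has had `dry_time` seconds to set."""
        if elapsed < dry_time:
            return False                        # still wet: objects stay independent
        b.parent = a                            # dry: b now rigidly follows a
        a.children.append(b)
        return True

    desk = SceneObject("desk")
    lamp = SceneObject("lamp", position=(0.2, 0.8, 0.0))
    print(apply_glue(desk, lamp, dry_time=2.0, elapsed=0.5))   # False: glue still drying
    print(apply_glue(desk, lamp, dry_time=2.0, elapsed=2.5))   # True: lamp now moves with the desk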

The power of Neos lies in its network abstraction layer, which lets creators focus on getting the job done and build things together even when they’re physically many miles apart. The layer keeps the world synchronised while using minimal network bandwidth: objects in the shared universe are synchronised in full just once, with their ongoing states computed from those synced relationships. It also means every user should see the same thing as everyone else in the space at all times.
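As a loose illustration of that approach (the class and message names below are our own invention, not Neos’ real networking code), each object is replicated in full exactly once, after which only compact state updates travel over the wire – and because every client processes the same stream, everyone computes an identical world.

    # Rough illustration of 'sync in full once, then only small state updates'.
    import json

    class ReplicatedWorld:
        def __init__(self):
            self.objects = {}    # object id -> full asset description (sent once)
            self.state = {}      # object id -> latest synchronised state

        def apply_message(self, raw: bytes):
            msg = json.loads(raw)
            if msg["kind"] == "spawn":            # full sync, sent once per object
                self.objects[msg["id"]] = msg["asset"]
                self.state[msg["id"]] = msg["state"]
            elif msg["kind"] == "update":         # small delta, sent frequently
                self.state[msg["id"]].update(msg["state"])

    # Both clients receive the same stream, so they derive identical worlds.
    stream = [
        json.dumps({"kind": "spawn", "id": 1,
                    "asset": {"mesh": "office_chair.glb"},
                    "state": {"pos": [0.0, 0.0, 0.0]}}).encode(),
        json.dumps({"kind": "update", "id": 1, "state": {"pos": [0.5, 0.0, 0.0]}}).encode(),
    ]
    alice, bob = ReplicatedWorld(), ReplicatedWorld()
    for packet in stream:
        alice.apply_message(packet)
        bob.apply_message(packet)
    assert alice.state == bob.state              # all users see the same thing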

Neos is still a work in progress, but its key architecture is clearly in place and its potential power should be immediately obvious. Can the developer succeed in defining and building a Google Docs from which friends and colleagues can create the metaverse? Time will tell.


Only Select Developers Can Publish Google Daydream Apps Until 2017

Google Daydream is supposed to launch Fall 2016, but only a select group of developers will be able to publish Daydream apps to the Google Play store until 2017.

See Also: Google Daydream SDK Launches Out of Beta, Adds Unity Integration

Daydream, Google’s high-end VR initiative for Android, is set to launch in the next few months, but the company is not flinging the door wide open when it comes to VR app submissions. While any Android developer can submit Cardboard applications to the Google Play store, Google will be keeping a tighter grip on Daydream apps early on by restricting which developers can publish their applications through Android’s app store.

Only developers who are accepted into the Daydream Access Program (DAP) will be allowed to publish apps at this Fall’s Daydream launch. Everyone else will be allowed to publish apps “early next year.”

Developers can apply now to join the DAP. The application form consists of some pretty basic information gathering, including a description of the VR app that’s being developed and whether or not it has launched on any other VR platforms. Google says that those who are selected to join the DAP “get a first look at updates to Daydream’s developer tools and are connected to our team and the DAP community throughout the development process.”

See Also: Start Making Google Daydream VR Apps Today with a DIY Dev Kit

The criteria by which developers will be accepted into the Daydream Access Program are unclear. This restrictive step appears to be a rather simple way for Google to sift for high-quality VR content at Daydream’s launch, rather than opening the floodgates to any and all would-be Android VR developers. It’s tough to say exactly what the company’s reasoning for introducing the DAP is, but some guesses include ensuring that the initial Daydream offerings abide by VR best practices, putting a good face on Google’s new VR initiative, and establishing a foundation of high-quality apps for newer VR developers to learn from.

Google, which just last week launched the Daydream SDK out of beta, is hosting an October 4th press event which is widely expected to see the announcement of new Daydream-ready phones from the company, amidst other news. This aligns with Google’s promise earlier this year that we’d see the first Daydream phones launch in Fall.


Google Daydream SDK Launches Out of Beta, Adds Unity Integration

After announcing Daydream, its platform for high-end virtual reality on Android, earlier this year, Google now says the Daydream VR SDK has reached version 1.0 and is ready for download.

Building upon the prior Cardboard SDK, Google has now combined both Cardboard and Daydream development into the Google VR SDK. The company says the SDK includes “a simple API used for creating apps inserted into Cardboard viewers, and the more complex API for supporting Daydream-ready phones and the Daydream controller.”

See Also: Google’s (Day)dream: ‘Hundreds of Millions of Users in a Couple of Years’

The Daydream side of the SDK is a foundation for VR developers, handling important basic functions that every VR app needs, like stereo rendering, spatial audio, head tracking, lens distortion, and asynchronous reprojection.

Developers can get the Google VR SDK over at the Google VR developer site. In addition to the Unreal Engine Daydream integration which has been improving since its launch alongside the Daydream announcement, the promised Unity integration has finally arrived in the form of a ‘technical preview’ which developers can download today. Unity writes on their official blog:

Unity’s native support for Daydream aims to solve the hard problems for you. To get optimal performance and latency from the platform we have done a deep integration with the Daydream SDK to leverage the platform’s asynchronous reprojection and VR performance mode. We have also made it easy to switch in and out of VR mode so that your applications can easily expand to the Google VR audience.

While the Daydream Unity technical preview is currently a separate package, the company says it will ship as part of the standard Unity package at a later date.

See Also: Existing Phones Unlikely to Qualify as ‘Daydream Ready’, Says Google, VR Fans Should Wait to Upgrade

The timing of the Google Daydream VR SDK launching out of beta comes just ahead of an October 4th press event hosted by Google which is widely expected to see the announcement of new Daydream-ready phones from the company, amidst other news. This aligns with Google’s promise earlier this year that we’d see the first Daydream phones launch in Fall.

As part of the announcement of the Google VR SDK hitting version 1.0, the company teases, “Stay tuned for more information about Daydream-ready phones and the Daydream headset and controller coming soon.”


VirtualGrasp Aims to Finally Deliver Realistic Hand Interactions in VR

Swedish robotics specialists are working on a system to allow game developers to deliver realistic, real-time and dynamic interaction animations, so that your VR hands can finally grasp in-game objects convincingly.

By the end of this year, every major virtual reality platform will have its own motion control solution, from the Vive’s SteamVR devices to Oculus’ Touch, but whilst these controllers give developers the ability to deliver near 1:1 mapped input in VR, the virtual results of those actions, particularly with our hands, can look, well, pretty ‘shonky’. Objects we pick up often just stick incongruously to the in-game controller models and, should the developer have included hand models, snap into predefined, multi-purpose positions which are more often than not pretty unconvincing.

Hand presence isn’t just about the accuracy of your hands’ position in space, then, but also about representing the myriad subtleties our endlessly adaptable digits are capable of. For example, watch yourself pick up a tumbler glass: your fingers will wrap the cylinder securely. Pick up a wine glass, though, and you may clasp it more daintily by the stem. Not only that, your grip will change depending on where on the object you choose to hold it. These are the sorts of subtleties we take for granted in everyday life but which are generally deemed expendable in games, and ‘brute forcing’ a real solution – say, a canned animation for every world object – is clearly out of the question.


Now, Swedish company Gleechi claims to be well on its way to resolving these issues. The company says the animation systems it’s building are based on eight years of robotics research conducted by a dedicated research group at KTH, Sweden’s leading technical university.

Gleechi says that its VirtualGrasp product removes the need for labour-intensive manual hand animations by using a “predictive and adaptive algorithm” which analyses the ‘physical’ properties of a virtual object, determining the most appropriate and realistic grip formation for the in-game hand model and snapping to that position. The software is still in an early state, but as you can see in the video embedded at the top of the page, it really does seem to work – and seeing it in action, you realise just how poor most in-game interactions look.
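To illustrate the general idea (this is a deliberately simplified stand-in, not Gleechi’s actual algorithm), a grasp system can map an object’s coarse physical properties to a grip pose the hand model snaps to, rather than relying on a hand-authored animation for every object.

    # Simplified sketch: choose a grip from an object's rough properties.
    from dataclasses import dataclass

    @dataclass
    class GraspTarget:
        name: str
        diameter_cm: float       # approximate width at the point being grabbed
        mass_kg: float

    def choose_grip(obj: GraspTarget) -> str:
        """Map coarse object properties to a grip pose the hand can snap to."""
        if obj.diameter_cm < 2.0:
            return "pinch"                 # thin stems, pens, small handles
        if obj.diameter_cm < 9.0 and obj.mass_kg < 1.5:
            return "cylindrical_wrap"      # tumblers, bottles, tool grips
        return "two_handed_palm"           # too wide or heavy for one hand

    print(choose_grip(GraspTarget("wine glass stem", diameter_cm=0.8, mass_kg=0.2)))  # pinch
    print(choose_grip(GraspTarget("tumbler", diameter_cm=7.0, mass_kg=0.4)))          # cylindrical_wrap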


Clearly VirtualGrasp is a technology with general-purpose benefits for many games, but in VR that extra piece of the realism puzzle – one that further ties your physical self to your virtual self – may have more significant implications for immersion and presence.

If you’d like to know more about VirtualGrasp, head over to the company’s website here.


Leap Motion’s ‘Interaction Engine’ Unlocks Natural Human Input for VR

Leap Motion has announced early access to the beta of its Interaction Engine, a set of systems designed to help developers implement compelling, realistic and hopefully frustration-free input using just their hands.

If you’ve ever spent time with a physics-sandbox title, you’ll know that a large part of the appeal is the satisfaction and freedom of playing within a virtual world that behaves somewhat like reality – with none of the real-world restrictions applied. But this presents myriad problems, not least of which is that those real-world-modelled physics break down when physicality is removed. Without physical boundaries in place, objects on the virtual plane will behave according to the digital physics model, right up to the point you accidentally put your digital self through said objects – at which point things kinda break down.


These issues are particularly acute when it comes to integrating naturalistic hand interaction with a digital space and its objects, for example in VR. Bridging the “gray area” between accuracy and what ‘feels good’ to a human being is part of that elusive magic you encounter when an input interface just works. More specifically, in the case of VR, that bridging involves implementing an alternative set of rules when a player connects with and grasps a virtual object in 3D space, bending reality’s rules in favour of a visual experience that more closely matches our expectations of what should happen.
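A minimal sketch of that rule-bending (the names below are assumptions, not Leap Motion’s actual Interaction Engine API): while an object is grasped it stops obeying the simulated physics and rigidly follows the hand, and is handed back to the physics engine – inheriting the hand’s velocity – on release.

    # Conceptual sketch: suspend normal physics while an object is grasped.
    from dataclasses import dataclass

    @dataclass
    class GrabbableObject:
        position: tuple
        velocity: tuple = (0.0, 0.0, 0.0)
        is_kinematic: bool = False       # True while a hand overrides the physics sim

    def on_grasp_began(obj: GrabbableObject):
        obj.is_kinematic = True          # physics engine stops integrating this body

    def while_grasped(obj: GrabbableObject, hand_position: tuple):
        obj.position = hand_position     # object visually sticks to the hand

    def on_grasp_ended(obj: GrabbableObject, hand_velocity: tuple):
        obj.is_kinematic = False         # hand the body back to the simulation...
        obj.velocity = hand_velocity     # ...with the velocity it was released at

    mug = GrabbableObject(position=(0.0, 1.0, 0.3))
    on_grasp_began(mug)
    while_grasped(mug, hand_position=(0.1, 1.2, 0.4))
    on_grasp_ended(mug, hand_velocity=(0.0, 2.0, 1.0))
    print(mug)    # back under physics control, moving the way the hand threw it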


These are all issues that Leap Motion, the company best known for its depth-sensing peripheral of the same name, has been grappling with for many months now, and their Interaction Engine aims to remove a lot of that pain for developers by providing a framework that “exists between the Unity game engine and real-world hand physics.”

The last time we encountered Leap Motion, they showed us the first glimpses of their work to boil down such an enormously complex set of problems into something developers can interface with easily. At CES in January, the Leap Motion team let us get hands-on with Orion and an early version of their Interaction Engine – a significant milestone for the company’s overall tracking framework, with impressive reductions in tracking latency and improvements in the system’s ability to handle tricky hand-tracking cases.

Leap Motion’s release of Interaction Engine’s Beta completes another piece of the peripheral-free VR input puzzle that the company has dedicated itself to over the last couple of years.

“The Interaction Engine is designed to handle object behaviors as well as detect whether an object is being grasped,” reads a recent blog post introducing the Interaction Engine. “This makes it possible to pick things up and hold them in a way that feels truly solid. It also uses a secondary real-time physics representation of the hands, opening up more subtle interactions.”

Leap Motion has always had a knack for presenting the complex ideas involved in its work in a visual way that’s immediately graspable by the viewer. These latest demos illustrate the kind of user-friendly fuzzy logic that Leap Motion believes strikes a neat balance between believable virtual reality and frustration-free human-digital interaction.

The Interaction Engine represents another milestone for Leap Motion on its quest to achieve hardware-free, truly human input. And if you’re a developer, it’s something you can get your hands on right now: the beta is available for download here, you can read all about it here, and you can chat with others doing the same.
