VRTK v4 Beta Makes The Best Unity VR Framework Even Better


VRTK is a VR framework that lets developers add interactivity to their apps and games without coding the physics of those interactions from scratch. This month the version 4 beta was released. Version 4 is a complete rewrite of the framework, bringing numerous improvements including a more modular architecture and greater hardware agnosticism.

VRTK’s Humble Origins

In April 2016, Harvey Ball got his HTC Vive. But when he wanted to develop for it, he noticed that there was no general framework for VR interactions. From his bedroom in the UK, he decided to make one, calling it the SteamVR Unity Toolkit. It let developers easily add teleportation and object grabbing to their games.

The toolkit quickly became the most popular Unity VR framework, with thousands of developers using it. It became so popular that, around the launch of the Oculus Touch controllers, Facebook sent Harvey a free Rift and Touch so he could add support. With the toolkit now cross-platform, it was renamed VRTK.

As an open source project, the community added many features, such as climbing, new grabbing mechanics, and archery physics.

Problems Emerge

VRTK was starting to show fundamental architectural problems. Harvey had originally built it on top of the SteamVR Plugin; the Oculus integration, for example, was just an abstraction layer. When the SteamVR Plugin had a major update (and it eventually did), VRTK would break, and supporting future hardware would require ever more complex abstraction layers. It became clear that VRTK needed to be rewritten from the ground up to be easier to use, more modular, and truly hardware agnostic.

Such an enormous task would require hiring developers, and that requires money. Harvey tried launching a Kickstarter campaign, but it failed to meet its goal. Some even accused him of trying to “cash in”. Next he tried Patreon, but that too failed to generate the level of funding needed, Harvey claims. He also claims that Valve declined to support VRTK because it considered the project a competitor.

Throughout 2017, Harvey poured his own money into keeping VRTK development going. The overwhelming workload of documentation, tutorials, and developer support slowed development to a crawl. Worse, VRTK was being blamed for enabling the many “asset flip” titles flooding the Steam marketplace.

In December, Harvey decided he had had enough and stopped development of VRTK. The lack of funding and the scale of the negativity had taken their toll.

Oculus To The Rescue

In January 2018, Harvey received an email from Oculus VR, LLC. They had heard of VRTK’s demise and wanted to provide the funding necessary to continue development. Harvey was skeptical, expecting that Oculus would want to undermine VRTK’s principles or make it exclusive.

His skepticism turned out to be unfounded, however. Oculus offered a full six-month grant with no conditions attached. With this funding, Harvey was able to continue development of VRTK, and so v4 was born.

The funding was used to take on dedicated community member Christopher-Marcel Böddecker as a full-time developer.

v4: A Rewrite

VRTK v4 is a completely hardware-agnostic rewrite. In fact, it is now theoretically engine agnostic, so it could even support Unreal Engine in the future. Instead of a single monolithic script as in v3, v4 uses prefabs containing simple scripts. Whereas v3 often required custom code for seemingly simple tasks, v4’s modularity means mechanics such as a pump-action shotgun can be built just by configuring existing components.
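To make that composition idea concrete, here is a minimal illustrative sketch; the class and field names below are hypothetical and are not VRTK v4’s actual API. It shows the style v4 favours: tiny single-purpose behaviours that raise UnityEvents, which designers then chain together in the Inspector (slide drive, threshold, eject shell, spawn round) instead of writing one bespoke shotgun script.

```csharp
// Illustrative only: hypothetical names, not VRTK v4 API. The point is the pattern of
// small single-purpose components wired together with UnityEvents in a prefab.
using UnityEngine;
using UnityEngine.Events;

public class ThresholdEmitter : MonoBehaviour
{
    [Range(0f, 1f)]
    public float value;                // e.g. normalized pump-slide position, driven by another component

    public float threshold = 0.9f;
    public UnityEvent Activated = new UnityEvent(); // wired in the Inspector, e.g. to "chamber a shell"

    private bool armed = true;

    void Update()
    {
        if (armed && value >= threshold)
        {
            armed = false;
            Activated.Invoke();        // downstream components react; no custom glue code needed
        }
        else if (value < threshold * 0.5f)
        {
            armed = true;              // re-arm once the slide returns
        }
    }
}
```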

This new modularity also means that v4 can support augmented reality devices in future.

The old video tutorials, which became outdated quickly, have now been replaced with VRTK Academy, a full documentation wiki maintained by both VRTK developers and the community.

While v4 is in beta, the VRTK team says it isn’t buggy and recommends that developers use it rather than v3 for current and future projects. It can be downloaded as a zip file from GitHub.

With the Oculus grant covering only six months, VRTK is still in need of funding. If you want to support the project, you can contribute to its Patreon.



Accessible XR Development After VRTK

As a developer moving from the web and app world into 3D and XR, I’ve had to constantly re-evaluate my platform and tool choices as the industry evolves at tweetstorm velocity. Today’s XR development pipeline is clogged by a glut of proprietary hardware and software APIs and SDKs by competing firms like Oculus, HTC Vive, Microsoft, Google, Apple, Sony and SteamVR — to say nothing of emerging third-party peripherals like Logitech’s VR-tracked keyboard, the new AR-enabling Zed Mini dual-eye camera for the Rift or Vive, or any other industry-disrupting Kickstarters that might’ve sprung up since I started typing this paragraph.

Left to right: a bunch of cool stuff I want.

Each platform’s fine — even technologically stunning, one might argue — with respective strengths, weaknesses and use cases. But the distinctions force XR developers to ask hard questions: Where is the market going? How do I invest my skill-building time? What devices should my app support? What platform can I get a job working on? Developers must be business analysts as much as creative technologists to stay relevant. It’s easy to suffer choice paralysis with such a wide array of options, and easier still to bet on the wrong technology and lose.

Personally, I also face certain technical, logistic and financial realities as an independent XR developer in the Midwest (US), where the industry hasn’t proliferated as it has in major coastal cities. Thankfully, game engines like Unity and Unreal are rapidly democratizing this space. Both engines seek to bridge the gaps between the various XR SDKs, employing thousands of engineers to ensure their software plays nicely with just about any significant third-party API. For example, as I wrote about in August, the Oculus SDK integrates beautifully with Unity and comes equipped with many of the scripts and prefabs needed to quickly prototype, develop and deploy a custom Rift app.

I miss bossing around my hand-modeled #MadeWithBlocks BB-8. Check out my deep dive on this project, The Future of VR Creation Tools.

That’s fantastic, but it’s still non-standard. To port the same Unity app to the HTC Vive or a Windows HMD is non-trivial — not impossible or even terribly difficult, but non-trivial. Maintaining your app for multiple SDKs over the long haul is similarly non-trivial. Non-trivial costs money and time and we’re all short on both.

Instead imagine if XR practitioners had to worry less about betting on the right platform or device and could instead focus on creating unique and compelling experiences, content and UX. The first step down that path was VRTK — but sadly, one of the best tools to combat the VR SDK surplus will soon be hobbled by the loss of its founder.

VRTK: The Open Source Approach

This free, open source Unity toolkit aims to knit together a single workflow for a variety of VR APIs. It comes with the same stock prefabs and scripted mechanics you might find included in any single proprietary SDK, but makes each piece of functionality identical whether deployed to Oculus, SteamVR (read: Vive and, with v3.3.0, Windows HMDs) or Daydream — covering all major VR HMD manufacturers today.

It’s a boon to anyone wanting to dip their toes in the waters of VR development. Think of it: Want to implement teleportation locomotion over a Unity NavMesh? Just drop the component onto your player prefab. Want to test out grab mechanics, or a quick bezier pointer? VRTK’s demo scenes have you covered, and they’ll work easily on a variety of devices. Since it’s open source, you’re also free to dive in and customize the code. Struggling to get a feature working in your own project? Check out VRTK’s implementation across a variety of SDKs; it’s not a bad way to grok new XR coding concepts.
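As a rough sketch of what “just drop the component on” looks like in practice (assuming VRTK v3’s component naming, such as VRTK_InteractableObject; exact fields can vary between releases), making an ordinary Unity object grabbable is a configuration step rather than a physics project:

```csharp
// A minimal sketch assuming VRTK v3's VRTK_InteractableObject component;
// field names may differ slightly between releases.
using UnityEngine;
using VRTK;

[RequireComponent(typeof(Rigidbody))]
public class MakeGrabbable : MonoBehaviour
{
    void Start()
    {
        // VRTK mechanics live in ordinary components, so prototyping is mostly
        // configuration. The same setup then works on any SDK VRTK supports.
        var interactable = gameObject.AddComponent<VRTK_InteractableObject>();
        interactable.isGrabbable = true;
    }
}
```

In a real project you would usually do this in the Inspector on a prefab; the script form just makes the point that no SDK-specific code is involved.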

Sadly, VRTK’s creator is sunsetting the woefully underfunded project. The UK-based developer TheStoneFox, who until recently was actively seeking contributors, partnerships and support, announced recently that he will be stepping back from the project after version 3.3.0. Though VRTK boasts an active Slack community, a growing list of “made with” titles and a recent Kickstarter, TheStoneFox was unable to attract the support necessary to sustain it for the long term.

Now that the opportunity to contribute to and use a premier open-source VR development pipeline is fading, what, if anything, will replace it?

OpenXR: One API to Rule Them All

The VRTK approach of using Unity scripting to knit together similar mechanics across a spectrum of VR SDKs is necessary in the current fragmented development landscape, but there are downsides. Someone in the community still has to track the various proprietary SDK updates, and your end-user VRTK app still has to keep up with VRTK’s own changes over time. In this way, VRTK treated the symptoms of the VR SDK overload, but was not equipped to address the root cause. Enter OpenXR, The Khronos Group’s upcoming industry standard:

The standard, announced in December 2016, is being written now and is quickly gaining traction among industry players (with the notable exception of Magic Leap). Instead of forcing developers to grapple with varying proprietary SDKs and all the accompanying business consequences, companies will instead tailor their hardware and software to comply with OpenXR’s spec. Khronos, the non-profit responsible for shepherding the Vulkan, OpenGL, OpenGL ES and WebGL standards, is leading the charge. Cue the infographics!

On the left, the problem — on the right, the solution:

Images courtesy of https://www.khronos.org/openxr.

“Each VR device can only run the apps that have been ported to its SDK. The result is high development costs and confused customers — limiting market growth,” reads some fairly accurate marketing copy on their website. “The cross-platform VR standard eliminates industry fragmentation by enabling applications to be written once to run on any VR system, and to access VR devices integrated into those VR systems to be used by applications.”

The working group of industry heavyweights has agreed that the standard should be extensible to allow for future innovation and should support a range of experiences, anything from a 3-DoF controller all the way up to high-end, room-scale devices.

The only thing missing is a realistic timetable for when this standard will have an impact on the development community and its day-to-day workflow. Until the market-movers get their act together, we’ll be left scrambling (and patching up VRTK projects, in many cases).

OpenXR supporters: everyone except Magic Leap.

The Cinema of Attractions: Slow Your Reel

But should we so quickly welcome industry standardization while the technology is still so new and full of possibilities? That’s the question asked in a recent Voices of VR podcast by Kent Bye and Rebecca Rouse. The two discussed the early days of cinema — when exploration and experimentation were the status quo — and Rouse drew striking parallels between that era and the current period in XR production and development.

Pure spectacle then and now. Left: a Cinema of Attractions-era still. Right: Chocolate VR.

“[Scholars of early film] came up with this term ‘cinema of attractions’ because they saw an incredible wealth of diversity and kind of range of exuberant experimentation in those early pieces, so they were very hard to sort of clump them together — there was such diversity — but this ‘attraction’ idea was a large enough umbrella, because all of those early pieces are in some way showing off the technology’s capabilities and generate this experience of wonder or amazement for the viewer. And the context in which they were shown is that of attractions, so they were shown at world’s fairs and as a part of vaudeville shows with other kinds of performances and displays.”

 — Rebecca Rouse, assistant professor of communication & media at Rensselaer Polytechnic Institute

Sounds eerily familiar, huh? The whole podcast is well worth a listen, but tl;dr: while there are obvious consumer and market advantages to XR standards, Rouse argues that perhaps we shouldn’t jump the gun here, not during this era of frenetic, often avant-garde XR experimentation across art, science, cinema and gaming. Looking around the industry, it’s hard to disagree.

EditorXR

One man-eating-the-camera-brilliant new application of XR technology is Unity Labs’ EditorXR. Created by Unity’s far-future R&D team (whose roles often find them working on projects and products five-to-ten years away from consumer adoption), EditorXR offers you an interface to create custom XR Unity scenes entirely within virtual reality.

Oh! And there’s flying, among other superpowers — soar through your scene like Superman or scale the whole thing down to a pinhole. They’ve literally ported the Unity inspector, hierarchy and project windows (again among others) to an increasingly user-friendly VR UI pane on your wrist. With the latest update, you’re able to:

  • hook into Google’s Poly asset database web API in real-time inside VR
  • create multiplayer EditorXR sessions for editing Unity scenes with friends and collaborators
  • run EditorXR with Unity’s primary version 2017.x editor

It’s still new and I’ve encountered bugs, but it’s a foregone conclusion that this tech will become a standard feature of Unity’s scene creation process as XR technology matures and proliferates. Even their alpha and beta efforts evoke the same sense of wonder and possibility that early Cinema of Attractions-era moviegoers must have felt.

For more insight on the design side, check out this deep dive on the future of XR UX design by Unity Labs’ Dylan Urquidi or the Twitter feed of Authoring Tools Group Lead Timoni West.

ML-Agents

Another experimental Unity project, ML-Agents, explores one of the most promising avenues for the future of XR development, design and UX: machine learning. Using so-called “reinforcement learning” techniques, which expressly don’t feed the AI model any sample data or rules for analysis, ML-Agents instead applies simple rewards and punishments (in the form of tiny float values) based on the outcomes of the agents’ [usually very narrowly defined] behaviors.

Stretched out over hundreds of thousands, if not millions, of trial-and-error training sessions, the computer experiments with its abilities and forms a model of how best to achieve the desired goal. In this way, your Agents become their own teachers; you just write the rubric.
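As a rough illustration of that rubric (a sketch written against the early ML-Agents C# API of this era, using Agent, AddVectorObs, AgentAction, AddReward and Done; the “reach the target” task and all names here are made up), the rewards really are just a few small floats:

```csharp
// A sketch against the early ML-Agents C# API (Agent, AddVectorObs, AgentAction,
// AddReward, Done); the namespace and exact signatures changed between releases,
// and the "reach the target" task plus all names here are made up.
using UnityEngine;
using MLAgents; // very early releases had no namespace; adjust for your version

public class ReachTargetAgent : Agent
{
    public Transform target;
    public float moveSpeed = 2f;

    public override void CollectObservations()
    {
        // Observe the target's offset from the agent (3 floats).
        AddVectorObs(target.position - transform.position);
    }

    public override void AgentAction(float[] vectorAction, string textAction)
    {
        // Actions arrive as raw floats; here they nudge the agent around the plane.
        Vector3 move = new Vector3(vectorAction[0], 0f, vectorAction[1]);
        transform.position += move * moveSpeed * Time.deltaTime;

        // The entire "rubric": tiny float rewards and punishments.
        AddReward(-0.001f); // small step penalty so the agent hurries up

        if (Vector3.Distance(transform.position, target.position) < 0.5f)
        {
            AddReward(1f);  // reached the target
            Done();         // end the episode so training can reset
        }
    }
}
```

Those two AddReward calls are all the “teaching” the developer does; over many episodes the model works out the rest by trial and error.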

The original GitHub commit contained some basic demo scenes and the development community quickly took up the torch from there. Unity’s Alessia Nigretti followed up the original blog with one describing how to integrate ML-Agents into a 2D game. On Twitter, @PunchesBears has been demonstrating similar concepts — and showing that often enough, Agents respond to developers’ carefully calculated reward system in ways they don’t anticipate. Similar to actual gamers, no?

In one of my favorite applications of ML-Agents, the developer Blake Schreurs actually brings a 6-DoF robo-arm Agent trained to seek a moving point in space into virtual reality — with slightly terrifying results once he assigns that moving target to his face.

Imagine someone applying this training model to actual robotics and fat-fingering the wrong key. Or don’t, whatever. 

He’s down for the count! I was immediately reminded of the audiences pouring out of theaters in 1895, afraid they’d be run down by the Lumière brothers’ Arrival of a Train at La Ciotat. We’re still in the salad days of both machine learning and XR development compared to where we hope to be 10 or even 50 years from now. In that time, some combination of traditional or procedural AI with these new machine learning approaches will doubtless lead to great developments in gaming and XR at large, or even in the very design process and daily workflow of computing itself.

Rift OS Core 2.0

With Rift’s new Core 2.0 OS, your entire Windows PC is accessible from your right-hand menu button. Being able to view and use your desktop apps, as well as pin windows inside other VR apps, introduces new possibilities for XR workflows (and even for traditional computing workflows) in VR.

While working on my next project, entirely within VR, I can watch Danny Bittman’s great Unity rendering and lighting tutorial on YouTube in a pinned browser while messing with those same settings on my wrist in EditorXR. I can watch @_naam craft original assets in Google Blocks at the same time I do, or I could gather assets from the Poly database and deploy them to my Unity scene in real-time VR, pulling up Visual Studio to code some game logic as I please.

That sounds pretty goddamn metaversal to me — and before long, we likely won’t even need code.

The XR Developer of the Future Is Not a Developer

If XR technology is to go mainstream, the development process must be as efficient and accessible as possible — and likely even open to non-developers through content creation and machine learning applications. Spanning sciences and disciplines, there’s so much more to talk about and speculate over that this piece hasn’t even touched on (next time I’ll examine WebVR and A-Frame as viable XR development pathways). More and more pieces of this accessible, standardized XR development pipeline will fall into place as the immersive computing revolution rolls on, though I’m thankful the XR industry isn’t ready to ditch its Cinema of Attractions ethos quite yet.

VRTK’s Open Source Tools Help New Developers Get Started In VR


What’s becoming clear in VR development is that with the market split across Rift, Vive and PlayStation VR, the developers of some of the most successful apps like Job Simulator, Fantastic Contraption and Raw Data work hard to make their software work well across all three headsets.

Even if it is hard to stand out from the dozens of apps launching each week, making a virtual world available across Steam, the PlayStation Store and Oculus increases a developer’s chances of finding success. Though large teams working with big budgets often turn to Epic Games’ Unreal Engine for building VR products, Unity, from the very well-funded Unity Technologies, is the engine most indie developers use to bring their products to fruition across multiple systems. Unity is used by a majority of VR developers, and its asset store makes it easy to find cheap or free tools with which to build more immersive worlds.

VRTK — the Virtual Reality Toolkit — is one of those Unity-based tools that a large group of indie creators used to jumpstart their efforts in VR development. The open source toolset was created by Harvey Ball, also known as TheStoneFox, after he bought an HTC Vive last year.

“I wanted to build something for it as I was new to game dev. I’d been using Unity for about a month just as a hobby,” Ball wrote to me. “I tried to use the SteamVR Unity plugin and found it confusing, realized a lot of people found it confusing and started VRTK as a way to help people get into developing for VR.”

Ball’s crowdfunding on Patreon for VRTK stands at nearly $2,000 per month and the community for the effort on Slack is populated by more than 2,000 members. Here’s a list of games made with the software. Ball works on VRTK in the evenings when he can but still holds a day job as a Web developer. He says the money from Patreon isn’t his income — it’s so that “if we as a community need something that requires money then we have budget for it.” He also launched a Kickstarter project earlier this year with a very high goal for a more ambitious roll out. Though it received more than 400 backers, the project fell short of its goal.

I asked Ball to explain why he is building VRTK as an open source solution and to break down the benefits of the tools:

I wanted to make it available to as many people as possible. My belief is the more people building for VR is only better for evolving the platform. VRTK is all about getting as many people as possible working together on solutions to common problems for the common good. Also making it accessible for people new to dev but with good ideas so to remove high barriers of entry. Charging for it would just limit that for people. Plus it enables developers to build for a crazy fragmented market, using VRTK means it just works on Steam, Oculus, PSVR, etc.

The initial benefit is the VRTK abstraction layer. So if you use VRTK components for the mechanics of the game then it just works on any supported SDK. So it just works on SteamVR or Oculus or PSVR without any coding. If you build something for SteamVR you have to either write your own abstraction layer or rewrite chunks of code for other SDKs. Some popular SteamVR games already are suffering from this where they can’t be easily ported to Oculus Home. If they had been built on top of VRTK then porting is easy.
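To illustrate the pattern Ball is describing, here is a generic sketch of an SDK abstraction layer. The interface and class names are hypothetical and this is not VRTK’s actual internals; it just shows why code written against an abstraction ports cheaply while code written against one vendor SDK does not.

```csharp
// Hypothetical names for illustration; VRTK's real abstraction covers far more
// (input, tracking, boundaries, haptics) and selects the adapter automatically.
using UnityEngine;

// Game mechanics are written against this interface once...
public interface IControllerInput
{
    bool IsTriggerPressed { get; }
    Vector3 Position { get; }
}

// ...and each vendor SDK gets a thin adapter. Switching headsets means swapping
// the adapter, not rewriting grab/teleport/shoot logic.
public class SteamVRControllerInput : IControllerInput
{
    public bool IsTriggerPressed { get { /* query the SteamVR plugin here */ return false; } }
    public Vector3 Position { get { return Vector3.zero; } }
}

public class OculusControllerInput : IControllerInput
{
    public bool IsTriggerPressed { get { /* query Oculus Utilities here */ return false; } }
    public Vector3 Position { get { return Vector3.zero; } }
}

public class GrabBehaviour : MonoBehaviour
{
    // Resolved at startup based on the detected SDK (or assigned by a manager component).
    public IControllerInput input;

    void Update()
    {
        if (input != null && input.IsTriggerPressed)
        {
            // grab logic written once, identical on every platform
        }
    }
}
```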

Ball said he plans to add support for Microsoft’s Windows platform as soon as he gets his hands on the controllers.

We’ve heard developers swear by VRTK in the past, but I put a call out on Twitter to find out how the toolkit has helped developers and the response was pretty strong. I’ll embed some of those tweets at the bottom of this post, but Unity developer @Mcdoogleh offered a response that summed up a lot of what I heard. The dev is working on a project based on VRTK aiming to release next year, “and without question it would’ve taken me longer to develop without VRTK.”

“It’s useful to developers for quite a lot of reasons, one of them being that as far as VR development goes, prototyping is extremely tricky. So VRTK provides…features which are useful just to play around with,” Mcdoogleh said. “It’s also end to end… So you could just use VRTK as a foundation and build up from it or update as you go along to get the latest features. Anywhere from prototyping to a release product, again, highly useful. In addition, VRTK can provide a useful point of reference to other developers, so they may not necessarily want to use VRTK, but they can use components of it as a foundation for their own code. Lastly, it’s knowing that it’s going to be supported, Harvey has been very engaged in terms of talking with developers, and has fostered a community of developers who can help each other.”


Sarah Stumbo on Unity 2017.1 and Supporting Virtual Reality

Unity has helped support videogame developers on all platforms, and it looks like the company has found ways to simplify the process of creating a videogame with new features such as Timeline, Unity Teams, Cinemachine and more. What’s so great about these new features is that they enable teams to work on a videogame together at the same time, and make it easier for devs to control how characters move and are animated, as well as when and where the camera moves.

VRFocus’ Nina Salomons talks to Unity’s XR Evangelist Sarah Stumbo about what Unity is doing when it comes to virtual reality (VR). She explains that these new features are not specifically for conventional gaming but also apply to VR videogames. It can be very expensive and very frustrating for developers to have to upgrade, change and optimize their experience for each newly released headset and its subsequent updates. It seems that Unity aims to fix this with the XR Toolkit.

She explains that the XR Toolkit will help improve input handling, allowing developers to create a single cross-platform experience that can run on all headsets, including Microsoft’s mixed reality headsets. She also mentions VRTK and how it provides higher-level features for virtual reality. What’s very clear is that Unity aims to support the community, hoping to create an environment and online community where developers can help each other when building XR applications for the future. Unity 2017.1 is available now in beta. Check out the video below for more information.

VR Solution Builder Virtual Reality Toolkit Seeks Kickstarter Funding

Whether you’re already in the videogame industry or not, building virtual reality (VR) titles can be a daunting prospect, with plenty of new skills and techniques to learn to make a viable project. Game engines such as Unity and Unreal are two of the most popular tools available to developers, but there are more. Virtual Reality Toolkit (VRTK) is a piece of software designed to make it quick and easy to build VR solutions in Unity3D for a range of headsets. Version 3 is currently available for free, and the creator is now looking to Kickstarter to crowdfund version 4.

With VRTK, developers get a range of features to use within their VR projects for free, including pointers, teleportation, animated curves, a headset pointer and much more. To expand upon these, further resources and talent are needed. Run as a solo project by TheStoneFox, VRTK v3 has already been used by quite a few studios to build VR titles on Steam, such as QuiVR, CarCar Crash, Deisim, Manastorm and Vive Spray, to name a few.

VRTK - prototype climbing

The Kickstarter campaign aims to raise £150,000 over the next month, and while VRTK will remain free and open source, five professional Unity assets to be sold on the Unity Asset Store will form the rewards for Kickstarter backers. These are: fully rigged VR hands, VR avatars, a teleport beam suite, interactions with realistic weapons, and an inventory system. Depending on the reward tier backed, users will get access to one or all of these assets.

Currently VRTK supports the Oculus Rift SDK and SteamVR for the HTC Vive, but with version 4 this will expand to further headsets including Google Daydream, Samsung Gear VR and OSVR. Other planned features for VRTK include:

  • Support for future VR accessories, pucks, etc.
  • Refinement and improvement of existing solutions
  • Support for visual scripting plugins (Playmaker, etc)
  • Text input solutions
  • More locomotion features
  • More 3D controls
  • Anti-harassment solutions
  • Camera effects such as tunneling
  • Binaural audio
  • Regular updates to the Unity Asset Store
  • More robust roadmap of features with better estimated delivery dates (rather than just “Soon”)
  • Better example scenes
  • Script start up wizard for easy adding common scripts to scenes
  • More video tutorials on how to build things with VRTK
  • Tutorials on how to build multiplayer with VRTK
  • Better online help documentation
  • Continuation and growth of existing community and support

The campaign also has several stretch goals depending on its success, so keep reading VRFocus for further updates.