Oculus Research Devises High-accuracy Low-cost Stylus for Writing & Drawing in VR

Using a single camera and a 3D-printed dodecahedron decorated with binary square markers, the so-called ‘DodecaPen’ achieves submillimeter-accurate 6DoF tracking of a passive stylus. Led by National Taiwan University PhD student Po-Chen Wu during his internship at Oculus Research, the work presents a low-cost tracking and input solution with many potential applications in virtual and augmented reality.

As shown in the video below, the ‘passive stylus’ in this case is an actual ball-point pen, allowing for a quick visual demonstration of the impressive accuracy of the tracking system, with the real and digitised drawings being almost indistinguishable from each other. Although the project focused on stylus applications, the video also highlights how the dodecahedron could be attached to other objects for virtual tracking, such as a physical keyboard.

According to the paper published on the NTU’s website, the DodecaPen’s absolute accuracy of 0.4mm is comparable to an advanced OptiTrack motion capture setup using 10 cameras—a combined resolution of 17 megapixels. The DodecaPen system achieves the same accuracy with a single, off-the-shelf, 1.3MP camera. The research clearly shows that marker corner alignment alone is not enough for robust tracking; the team instead used a combination of techniques detailed in the paper, including Approximate Pose Estimation and Dense Pose Refinement. The 12-sided shape was chosen to retain constant tracking quality, so that “at least two planes are visible in most cases.”
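At the heart of marker-based pose estimation of this kind is reprojection error: the estimated pose is the one that best maps the pen's known 3D marker corners onto their detected 2D image positions. The sketch below illustrates that idea with a plain pinhole camera model; the intrinsic matrix, marker size, and pose values are made up for illustration and this is not the paper's actual pipeline:

```python
import numpy as np

def project_points(points_3d, R, t, K):
    """Project 3D points into pixel coordinates via a pinhole camera model."""
    cam = (R @ points_3d.T).T + t          # world -> camera frame
    proj = (K @ cam.T).T                   # apply intrinsics
    return proj[:, :2] / proj[:, 2:3]      # perspective divide

def reprojection_error(corners_2d, points_3d, R, t, K):
    """Mean pixel distance between detected and reprojected corners."""
    reproj = project_points(points_3d, R, t, K)
    return float(np.mean(np.linalg.norm(reproj - corners_2d, axis=1)))

# Hypothetical setup: a 20mm square marker face and a rough 1.3MP-class
# intrinsic matrix (focal length and principal point are invented).
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 512.0],
              [0.0, 0.0, 1.0]])
marker = np.array([[-10, -10, 0], [10, -10, 0],
                   [10, 10, 0], [-10, 10, 0]], dtype=float)  # corners in mm
R_true, t_true = np.eye(3), np.array([0.0, 0.0, 500.0])      # 0.5m from camera
observed = project_points(marker, R_true, t_true, K)

# The correct pose reprojects with zero error; a pose perturbed by 1mm does not.
print(reprojection_error(observed, marker, R_true, t_true, K))  # 0.0
print(reprojection_error(observed, marker, R_true,
                         t_true + np.array([1.0, 0.0, 0.0]), K) > 0)  # True
```

The paper's Dense Pose Refinement goes further than corner alignment, optimising over all marker pixels rather than just the corners, which is what pushes accuracy to the submillimeter level.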

The key advantage of the DodecaPen is its simple construction and minimal electronics, making it particularly suited to 2D and 3D drawing. However, the team recognises its limitations and drawbacks: it is prone to occlusion due to the single camera, and relies on a reasonable amount of ambient light to maintain accuracy. The paper also notes that their computer vision algorithm is ‘slow’ compared to 300-800Hz motion capture systems, as well as Lumitrack, another low-cost tracking technology. The DodecaPen’s solution is limited by the fiducial marker recognition software and the motion blur generated by the camera, resulting in unwanted latency.

The conclusion states that the system could “easily be augmented with buttons for discrete input and an inertial measurement unit to reduce latency and increase throughput.” A more complex stylus could also offer a better simulation of real drawing, including pressure sensitivity and tip tilt, which would make it better suited to emulate a pencil or brush rather than a pen. The problems of occlusion and limited low-light performance could be improved with multiple cameras with higher quality image sensors and lenses, but each upgrade would add to the system’s cost and complexity.

A made-for-VR stylus like the DodecaPen could prove to be a versatile tool for traditional productivity tasks in VR, which are largely limited today by a missing solution for fast and easy text input.

The post Oculus Research Devises High-accuracy Low-cost Stylus for Writing & Drawing in VR appeared first on Road to VR.

Google Releases ‘Resonance Audio’, a New Multi-Platform Spatial Audio SDK

Google today released a new spatial audio software development kit called ‘Resonance Audio’, a cross-platform tool based on technology from their existing VR Audio SDK. Resonance Audio aims to make VR and AR development easier across mobile and desktop platforms.

Google’s spatial audio support for VR is well-established, having introduced the technology to the Cardboard SDK in January 2016, and bringing their audio rendering engine to the main Google VR SDK in May 2016, which saw several improvements in the Daydream 2.0 update earlier this year. Google’s existing VR SDK audio engine already supported multiple platforms, but with platform-specific documentation on how to implement the features. In February, a post on Google’s official blog recognised the “confusing and time-consuming” battle of working with various audio tools, and described the development of streamlined FMOD and Wwise plugins for multiple platforms on both Unity and Unreal Engine.

Image courtesy Google

The new Resonance Audio SDK consolidates these efforts, working ‘at scale’ across mobile and desktop platforms, which should simplify development workflows for spatial audio in any VR/AR game or experience. According to the press release provided to Road to VR, the new SDK supports “the most popular game engines, audio engines, and digital audio workstations” running on Android, iOS, Windows, MacOS, and Linux. Google are providing integrations for “Unity, Unreal Engine, FMOD, Wwise, and DAWs,” along with “native APIs for C/C++, Java, Objective-C, and the web.”

This broader cross-platform support means that developers can implement one sound design for their experience that should perform consistently on both mobile and desktop platforms. In order to achieve this on mobile, where CPU resources are often very limited for audio, Resonance Audio features scalable performance using “highly optimized digital signal processing algorithms based on higher order Ambisonics to spatialize hundreds of simultaneous 3D sound sources, without compromising audio quality.” A new feature in Unity for precomputing reverb effects for a given environment also ‘significantly reduces’ CPU usage during playback.

Much like the existing VR Audio SDK, Resonance Audio is able to model complex sound environments, allowing control over the direction of acoustic wave propagation from individual sound sources. The width of each source can be specified, from a single point to a wall of sound. The SDK will also automatically render near-field effects for sound sources within arm’s reach of the user. Near-field audio rendering takes acoustic diffraction into account, as sound waves travel across the head. By using precise HRTFs, the accuracy of close sound source positioning can be increased. The team have also released an ‘Ambisonic recording tool’ to spatially capture sound design directly within Unity, which can be saved to a file for use elsewhere, such as game engines or YouTube videos.
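To give a flavour of how direction-dependent per-ear gain works in any spatialiser, here is a toy constant-power panning function. This is a deliberate simplification for illustration only; Resonance Audio's actual rendering uses HRTFs and higher-order Ambisonics, not simple stereo panning:

```python
import numpy as np

def pan_gains(azimuth_deg):
    """Constant-power stereo pan: map a source azimuth in degrees
    (-90 = hard left, +90 = hard right) to (left, right) gains.
    A toy stand-in for HRTF-based binaural rendering."""
    theta = np.radians((azimuth_deg + 90.0) / 2.0)  # map [-90, 90] -> [0, 90] deg
    return float(np.cos(theta)), float(np.sin(theta))

l, r = pan_gains(0.0)            # source dead ahead
print(abs(l - r) < 1e-9)         # True: equal gain in both ears
l2, r2 = pan_gains(-90.0)        # source hard left
print(l2 > 0.99 and r2 < 1e-9)   # True: all energy in the left ear
```

The "constant power" property (left² + right² = 1 at every azimuth) keeps perceived loudness steady as a source moves, a basic requirement any spatial audio engine must satisfy before layering on HRTFs, near-field effects, and room acoustics.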

Resonance Audio documentation is now available on the new developer site.

For PC VR users, Google just dropped Audio Factory on Steam, letting Rift and Vive owners get a taste of an experience that implements the new Resonance Audio SDK. Daydream users can try it out here too.


Hands-On: ‘Overturn’ is a Serviceable Fast-paced VR Shooter But Missing a Standout Feature

Although described as an “action-puzzle adventure game”, Overturn gives its puzzle elements a secondary role; the game leans heavily on fast, first-person action, combining projectile weapons and frantic melee combat with fists, shields, and blades. Overturn is available now on Steam for HTC Vive and Oculus Rift; it is also in development for PSVR.

Waking up in a laboratory complex, you’re immediately introduced to the game’s narrative design, which is text-based, describing your character’s thoughts in the centre of your vision. It’s a welcome choice in this case, as even the best voice talent would struggle to deliver this questionable script in a convincing manner. Once you begin exploring the laboratory, you’ll find the text ‘checkpoints’ already laid out across the level—similar in appearance to Valve’s ‘developer commentary’ text bubbles—hardly conducive to maintaining immersion.

Image courtesy YJM Games

Floating text aside, Overturn delivers a sharp presentation, with intuitive menu systems, useful tips placed logically in the game world, and slick environments, lighting, and effects. Production values aren’t sky-high, but there are atmospheric moments, particularly when it goes dark and you’re given a flashlight (I just wish it could be held in both orientations), and levels have been designed intelligently to play to the strengths of VR. The anime-style character design might not be to all tastes, but it is less exaggerated than the work-in-progress footage (and the studio’s previous title that appears to be based in the same universe, Smashing The Battle VR) and works well with the overall aesthetic.

VR FPS locomotion enthusiasts will be pleased to hear the game offers both freeform ‘traditional’ movement input for those who want to glide around smoothly, and teleporting with snap turning for those susceptible to VR sickness. The teleport option is well-implemented, limiting the jump distance so you can’t ‘cheat’ the system too much, and the fuzzy visual blur on each transition is surprisingly effective. Snap turning is also welcome, but the rotation might benefit from being even faster, if not instant.

Image courtesy YJM Games

Overturn’s straightforward level progression is well-judged in terms of pacing, introducing the player to melee combat with fists on a few basic enemy types, before offering grenades and laser weapons. The blocking system works well enough, bringing your hands together to defend as you would in boxing, but the actual punching—and eventually blade-swinging—can quickly regress into wild hand-flailing, particularly on ‘Easy’. ‘Normal’ difficulty is challenging enough to warrant a more deliberate approach at times, but there is a distinct lack of nuance to the close-quarters combat, with little in the way of impact animation. Ranged weapons fare much better, and the grenade-throwing physics are intuitive.

Image courtesy YJM Games

Once you meet Magi, a mysterious girl with incredible powers, the game steps up several notches, and combat becomes more frantic and varied. Magi follows you around, offering a range of power-ups including ‘Time Slow’; as always, slow motion combat is endlessly enjoyable in VR. She can also craft health packs, which you consume by holding them up to your face, and you need to maintain her energy and health too. Battle arenas begin to introduce cover, which, as with all VR FPS games, becomes useful in a more organic way than traditional cover mechanics played on a flat display, as players will naturally gravitate to walls to physically hide/duck behind.

Image courtesy YJM Games

Enemy types and boss battles are also varied, and the game introduces new weapons and mechanics at an enjoyable pace. Since its recent launch on Steam, Overturn has grappled with enemy balancing, with the ‘Normal’ mode being rather too difficult at times. After a few rounds of updates, the game is in a better place, while still offering a serious challenge. There isn’t a stand-out feature, and its presentation is dripping with clichés, but Overturn is weirdly compelling enough to retain my attention.


Wind Simulation Accessory for VR Headsets Hits $30k Crowdfunding Goal on Day One

Launched on November 2nd, the crowdfunding campaign for ZephVR, a VR fan accessory that “adds realistic wind at the right moments” achieved its $30,000 goal on Kickstarter in just a few hours, with 30 days remaining in the campaign. VR hardware startup Weasel Labs aims to deliver the first ZephVR units to customers in May 2018.

As described on the campaign page, ZephVR is designed to work with all VR games and experiences by reacting to audio cues, using machine learning to trigger the two fans at appropriate moments, such as when traveling at speed or when a bullet whistles past your ear. If the cue is louder in one audio channel, the fan on that side will spin faster.
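As a rough illustration of the channel-loudness idea, the sketch below maps per-channel RMS of a stereo audio block to left/right fan drive levels. This is purely hypothetical: ZephVR's actual system uses a trained machine-learning classifier to recognise wind-like cues, not a plain loudness threshold, and the threshold values here are invented:

```python
import numpy as np

def fan_speeds(stereo_block, threshold=0.05):
    """Map per-channel loudness (RMS) of a stereo block of samples in [-1, 1]
    to (left, right) fan drive levels in [0, 1]. Illustrative only --
    the real product classifies audio with machine learning."""
    rms = np.sqrt(np.mean(stereo_block ** 2, axis=0))        # (left, right) RMS
    return tuple(np.clip((rms - threshold) / (1.0 - threshold), 0.0, 1.0))

# A cue that is louder in the left channel drives the left fan harder.
t = np.linspace(0, 1, 44100)
block = np.stack([0.8 * np.sin(2 * np.pi * 220 * t),         # loud left channel
                  0.2 * np.sin(2 * np.pi * 220 * t)], axis=1)  # quiet right
left, right = fan_speeds(block)
print(left > right)  # True
```

Even this toy version shows why the approach generalises across headsets: it needs nothing from the game but the stereo mix every application already produces.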

Image courtesy Weasel Labs

This audio-based approach—if it works well—means the hardware should function with all VR headsets; the PSVR-compatible version is more expensive, as it requires an additional audio processing box and cable. A cheaper version supports only Oculus Rift and HTC Vive, with an ‘earliest bird’ tier going to the first 150 backers for $50. At the time of writing, some ‘early bird’ offers remain, but the full price appears to be $90 for the Vive/Rift version and $120 for the PSVR/Vive/Rift version.

ViveNchill—a simpler dual-fan device meant to keep players cool—also managed a successful crowdfunding project via Indiegogo in July, and recently began shipping to backers. ZephVR’s two fans can also be run at a constant rate for cooling rather than reactive ‘wind’, but hanging below the headsets rather than above might make it less effective as a cooling solution compared to ViveNchill.

According to the campaign page, ZephVR has the potential to improve over time as the software recognises more specific audio cues, and Weasel Labs hopes to “team up with game developers to create customized experiences for their games,” meaning that more precise fan activation could be supported in the future with direct integration into VR experiences.

The video above gives a visual example of how the algorithm detects the sound of wind in the game Windlands. The text scrolling on the left is red when the fan isn’t active and green when it is; the left stream represents the left fan and the right stream represents the right fan.


Google Launches Poly API & Toolkit to Make 3D Objects Easier to Find & Use in AR/VR

Google launched Poly earlier this month, a new platform for browsing and downloading 3D objects and scenes. While Poly enjoys full integration with Google’s popular VR creative tools Tilt Brush and Blocks, the company has now released the Poly API to make its growing collection of 3D assets easier to access, and the Poly Toolkit to simplify importing them into AR/VR projects.

Update (11/30/17): Google today announced the Poly API, a tool that makes it easier for users to access Poly’s growing collection of Creative Commons 3D assets and interact directly with Poly to search, download, and import objects dynamically across desktop, mobile, and AR/VR apps. As the video shows, several apps have already integrated the API, including Mindshow, TheWaveVR, Unity EditorXR, Normal, AnimVR, Modbox, and High Fidelity.
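Since the Poly API is a REST service, a keyword search is just an HTTP GET against the assets endpoint. The sketch below only builds the request URL; the endpoint and parameter names reflect the REST API as documented at launch, but treat them as illustrative and check Google's Poly API reference before relying on them:

```python
from urllib.parse import urlencode

# Assumed endpoint from the Poly API docs at launch; verify before use.
POLY_ASSETS_ENDPOINT = "https://poly.googleapis.com/v1/assets"

def build_search_url(api_key, keywords, fmt="OBJ", curated=True):
    """Build the URL for a keyword search of downloadable Poly assets.
    Parameter names (keywords, format, curated) are as documented at launch."""
    params = {"key": api_key, "keywords": keywords,
              "format": fmt, "curated": str(curated).lower()}
    return f"{POLY_ASSETS_ENDPOINT}?{urlencode(params)}"

url = build_search_url("YOUR_API_KEY", "piano")
print(url.startswith("https://poly.googleapis.com/v1/assets?"))  # True
```

A client would fetch this URL and receive a JSON list of assets, each with download links for the available formats (OBJ, glTF, and so on), which is what the Poly Toolkit wraps up for Unity and Unreal developers.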

Google also announced Poly Toolkit for Unity and Unreal Engine, which is essentially Google’s next evolution of Tilt Brush Toolkit, allowing you to import 3D objects and scenes from Poly directly into a work-in-progress. For AR devs, Google provides samples for both ARCore and ARKit, providing you with everything you need to use Poly assets in AR apps.

Original article (11/01/17): Until today, Tilt Brush and Blocks creations have been shared in separate galleries, via ‘Sketches’ and ‘Objects’ respectively. Poly appears to be a combination of the two existing sites, offering a more coherent and wider collection of 3D objects and scenes. As a result, there are already thousands of free models available to view and download, along with new search functionality.

Image courtesy Google

Many objects are ‘remixable’ too, if published under a ‘CC-BY 3.0’ licence; clicking the ‘like’ button allows the user to import a remixable version of the object into Tilt Brush or Blocks to make changes, automatically crediting (and linking to) the original creator when the model is republished.

The new site also allows for direct upload through a browser, with drag and drop functionality for OBJ and MTL files, using a familiar design language consistent with other Google platforms. This could mean that Poly becomes a popular new destination for 3D models created using many other software tools, perhaps eventually offering an alternative to SketchFab, the current leader in ‘universal’ 3D and VR object sharing. Much like SketchFab, Poly allows object viewing in VR (currently Cardboard and Daydream are supported), but also offers quick GIF creation to share models more easily.


‘I Expect You To Die’ Gets New Level & Limited Time 20% Discount

New content for the popular VR puzzle game I Expect You To Die (2016) arrived today on all platforms – a free level named ‘First Class’ available to all current and future players. Schell Games has also begun a week-long 20% discount for the title on all platforms.

The stylish, escape room puzzle game I Expect You To Die originally launched for the Oculus Rift in late 2016, quickly expanding to the PlayStation VR platform in December, and finally to the HTC Vive via Steam in April 2017. While receiving positive reviews for its highly polished gameplay and classic spy-caper presentation, the overriding criticism was for its short length, with only four levels to dig into, leaving most players wanting more.

Thankfully, a fifth level named ‘First Class’ is now available for free, where players will experience a ‘relaxing train ride’ through India, as shown in this teaser trailer:

Schell Games Design Director Shawn Patton acknowledged the need for more content in a recent post on the PlayStation Blog. “We understand this issue is the game’s biggest weakness, and frankly, we feel like it’s a good problem to have,” he writes. “In the end, however, we knew we had to address it.”

Patton goes on to describe the team’s approach to developing First Class, hinting at new interactions and gameplay ideas, but carefully avoids giving away the level details. “Just know that there’s nothing more relaxing than the click-clack of a train as it rolls lazily through the majestic mountains of Northern India,” he continues. “Enjoy a soothing beverage or a local delicacy as you read the morning paper. We believe trains are the best place to meet new friends and catch up with old ones.”

I Expect You To Die has enjoyed sales success; the company announced the title surpassed $1 million in revenue across all sales channels in August 2017, including the Oculus Store, PlayStation Store, Steam, and Amazon.

“I’m so glad we are able to offer fans new content for I Expect You To Die,” stated Jesse Schell, CEO of Schell Games. “They have been so supportive of the game. We really think they’ll enjoy this new free level, and their vacation experience should be unforgettable.”


VR “Bullet Hell” FPS ‘Evasion’ Announced for Rift & Vive, Catch the Trailer Here

Sci-fi shooter Evasion from VR studio Archiact has been revealed, said to feature ‘next generation’ VR locomotion and ‘high intensity’ co-op multiplayer combat for up to 4 players. The game is due to launch for HTC Vive and Oculus Rift in early 2018.

Described on the official website as “an intense bullet hell spectacle”, Evasion has been designed from the ground up for VR, featuring co-op multiplayer combat for up to 4 players, with multiple hero classes to choose from to suit “all play styles” – two of which are already detailed on the site.

As shown in the announcement trailer, the game features ‘bullet hell’ action combined with destructible environments. According to the press release provided to Road to VR, a core development goal was to enable “incredible locomotion freedom.” Players will be able to choose from a few locomotion types, including ‘free move’, that can be customised for “play style and comfort level.”

Evasion aims to deliver several cutting-edge technologies, including ‘next gen’ full-body avatars powered by IKinema’s inverse kinematics animation system, “best in-class physics and destruction” enhanced for Intel Core i7 and i9 processors, and high-quality visuals and 3D spatial audio powered by Unreal Engine 4. The game’s ‘Swarm A.I’ enemy behaviour system claims to add “a new level of tension, making tactical movement and teamwork critical to your squad’s survival,” resulting in greater mission replayability with “randomized objectives and enemy encounters.”

Vancouver-based developer Archiact has been behind several virtual reality titles for mobile VR and location-based VR, but with the exception of penguin-puzzler Waddle Home (2016), available on SteamVR and PSVR (alongside a Gear VR version), the studio has yet to make a splash in the high-end consumer VR space. In a brief message posted on the official site yesterday, the team described Evasion as their “passion project”, having worked on the game for “over a year and a half.”

“Archiact has been crafting immersive VR entertainment since 2013,” said Kurt Busch, Studio Head at Archiact. “With Evasion, the team is using everything we’ve learned and weaving our experience into a truly genre-defining AAA title. With innovative locomotive movement and intense FPS gameplay, we’re convinced Evasion will prove a stand-out VR experience and we can’t wait for gamers to play it themselves.”


Oculus Explores 8 Experimental Locomotion Methods, Adds Samples to SDK

Following a two-part blog series in partnership with Crytek, where the studio shared some of its research into VR locomotion comfort, Oculus has now added eight new experimental locomotion methods to the SDK. A recent entry on the official Oculus blog provides an introduction to the techniques.

Update (10/27/17): Oculus has posted their talk given at Connect earlier this month on the new locomotion experiments. Tom Heath shows each method in action and explains the thinking behind it. The video has been added to this article.

Oculus encourages developers to explore the experimental locomotion options, noting that techniques which appear to detract from immersion don’t necessarily reduce the impact or enjoyment of VR, as players can become “more accustomed, tolerant, or acclimatized” to various locomotion comfort techniques.

The methods include a few variations on a ‘static world’ technique, which ensures the player is always aware of a static VR world in addition to the moving one; a form of ‘cockpit view’; motion-controlled locomotion as used in Lone Echo and various ‘climbing’ games; and some more unusual ideas involving mismatched visual styles and an ‘artificial tilt’ that “departs from any notion of stasis”.

Here’s a list of the newly included locomotion types:

  • Artificial Tilt
  • Counter Optic Flow
  • Unreal World Beyond Static Cockpit
  • Process Reducing Relevance of Mismatch
  • Ski-pole World Manipulation
  • Portals into a Static World
  • Window into the Moving World
  • Emerging Static in the Periphery

Some of these techniques are particularly difficult to explain or imagine; Oculus has provided the source code for a test application for developers to experience each one for themselves. Check out the full blog entry here for more details.


Valve Says New Calibration Software Makes Lower-Cost LCD Panels Viable for High-End VR

New SteamVR optical technologies now available to VR hardware manufacturers include the use of both LCD and OLED custom panels. According to the press release provided to Road to VR, Valve’s work with display manufacturers and recent advancements in LCD technology combined with VR-specific calibration “now make it a viable technology choice for high end VR systems.”

Valve recently announced the availability of new core VR technology components for device manufacturers, namely displays, optics, and calibration tools. These essentially combine with the existing free license for SteamVR tracking and input technology, meaning that the key hardware elements for high-end VR are now all available through Valve.

“World class VR requires highly precise tracking, matched optics and display technologies, and a software stack that weaves together the interactions between these components,” said Jeremy Selan of Valve. “For the first time, we’re making all of these technologies available to anyone who wants to build a best in class VR system for the millions of Steam customers accessing over 2,000 SteamVR compatible titles.”

Image courtesy Valve

As shown on the SteamVR licensing page, both LCD and OLED panels are being recommended for VR. While OLED technology offers a number of advantages for VR use cases—it is an essential specification of today’s popular first-generation high-end VR headsets—LCD is now a viable option, thanks to recent advancements in the technology and optimisations to calibration software.

The confidence in LCD for high-end VR is demonstrated in the upcoming Pimax ‘8K’ VR system, which uses SteamVR technology – as well as the upcoming ‘high quality’ mobile VR solution from Oculus.

“Fast-switching liquid crystals, low persistence backlights, and high PPI displays” of the latest LCD panels are, according to Valve, “well matched” to high-end VR. They continue to recommend OLED as an “excellent option for new head mounted displays”, pointing out that both display technologies have “inherent artifacts unique to head-mounted usage”, which are being solved at both a hardware and software level as part of the SteamVR technology suite.

Valve’s custom lenses available for purchase work with both LCD and OLED panels, which also benefit from their calibration and correction software. They support a field of view between 85 and 120 degrees, and are designed for the “next generation of room-scale virtual reality.”


Adobe Premiere Pro Now Includes VR Editing Interface ‘Project CloverVR’

Adobe’s VR editing interface for Premiere Pro is now available as part of this week’s Creative Cloud release. Project CloverVR is optimised for editing immersive media within Premiere while wearing a VR headset.

Revealed at the annual creativity conference Adobe MAX in Las Vegas today, Project CloverVR is now integrated into the latest Creative Cloud release as part of the Immersive Environments feature set in Premiere Pro.

The VR interface was originally previewed at last year’s Adobe MAX in San Diego, one of 11 experimental technologies demonstrated as part of the ‘Sneaks’ session. Not all ‘Sneaks’ make it to release, but Project CloverVR has, as it addresses a fundamental problem with editing 360 video or other immersive media in Premiere.

Currently, editing immersive video content using software designed for conventional monitors is tedious; it’s not only difficult to visualise the imagery correctly, but you constantly have to check your changes with a headset. CloverVR aims to improve the workflow for this type of content, allowing users to access familiar Premiere tools and timeline within VR, with the ability to perform edits using an interface optimised for motion controllers.

The feature is fully available to anyone with Creative Cloud as part of this year’s Adobe MAX release. Adobe MAX is already underway in Vegas, following two days of preconference labs, with the main sessions, labs, and creativity workshops running from today through October 20th. This includes several sessions relating to VR, such as “Creating Virtual Reality Video” and “Project Felix: 3D for Graphic Designers and a Journey into AR/VR”.

According to this recent blog entry from Nvidia, also published on the Adobe MAX blog, Nvidia GPUs will be used to demonstrate “real-time 8K editing in Adobe Premiere Pro CC” and “10x faster motion graphics and 360/VR design in After Effects CC”.
