‘Warpaint’ Brings Tabletop Gaming-inspired Turn-based Strategy & Customization to VR

Recently launched on Steam, Warpaint is a turn-based strategy game that lets you customize your troops by painting them to your taste. The game supports both Mac and PC, and features VR support via SteamVR (Vive, Rift, and OSVR). Taking inspiration from board and tabletop games like hexagonal chess and Warhammer, Warpaint’s gameplay is easy to pick up but hard to master.

Real-time strategy gaming seems to suit VR pretty well, but few developers have chosen the turn-based approach; Warpaint shows that the more sedate pace of turn-based strategy is a natural fit for VR too. You command an army of dwarves with different movement abilities, and the gameplay is mostly tactical, played with a surprisingly high degree of tension thanks to the ever-present threat of catapults.

With red projectiles reminiscent of the balls from the board game Weapons and Warriors, the catapults introduce a dextrous aspect to Warpaint’s gameplay that, like many action board games, rewards skilled aim and a bit of luck. Catapults have the potential to change the momentum of a game (pulling off a double kill with a lucky bounce, for example), but there is always a danger of taking out your own pieces too.

Friendly fire triggers some amusing ‘sorry!’ and ‘whyyy?’ dialogue from the dwarves; the voice acting is a stand-out feature, adding a healthy slice of charm to what is otherwise a rather plain presentation. I’d like to hear a wider selection of dialogue, perhaps even a battle announcer. Certainly a narrator for the tutorial would be welcome.

Warpaint doesn’t have the graphical chops to produce the most enticing screenshots or footage, but the straightforward style at least works effectively in VR with clean edges and high performance. No doubt the game would make a better first impression with a few additional effects, combined with a more integrated visual design for the UI and in-game motion controller models.

Otherwise, the game presents itself as a solid production, with well-balanced gameplay and a great set of features, including ranked matchmaking. You can play online or locally, against friends or AI, with VR users and monitor users playing together. The VR implementation isn’t attempting to reinvent the wheel here; it’s simply an effective and compelling option. While the game allows instant teleporting around the play area at multiple scales, the most useful tactical view is from above and at a distance, meaning that monitor users shouldn’t feel at a disadvantage, although I did find it easier to gauge my catapult shots using a headset.

It would be useful to have a way of adjusting the distances between the camera scale toggles, as the lower option sometimes feels too close and the next height up too far away; an option to rotate the view during or after a teleport could also help those using ‘standing’ VR mode.

Aside from the joy of firing catapults in first person, the most interesting use of VR in the game is the Army Painter system, which allows dwarf customisation in the same way one might paint Games Workshop miniatures. Rotating the piece in one hand while airbrushing the fiddly bits with the other captures the feel of the hobby in a satisfying, impressively robust way. A system allowing for full customization, including limb-posing and accessorizing with different pieces of armor and weapons, would take things to a whole new level, and we hope it will be considered for future versions of the game.

Warpaint’s modest asking price is perhaps representative of the fairly small amount of content available, but it is a game made with care that deserves your attention. Developer Adam Thompson has been actively responding to initial feedback, having already rolled out fixes and improvements: the full details are available on the game’s Steam page.


‘Lone Echo’ Developer Shows Impressive Procedural Hand-posing System for VR

Lone Echo, Ready At Dawn’s upcoming space-based VR game, employs a physics-based movement system where the player pushes and pulls themselves around a zero-gravity environment. A video presented at GDC highlights their procedural animation system that allows for convincing grip postures on arbitrarily shaped surfaces.

One of several exclusive games anticipated to launch on Oculus Rift and Touch hardware this year, Lone Echo is a first-person action game in a science fiction setting, comprising a story-driven single player mode and a team-based ‘esport’ multiplayer mode. Ready At Dawn’s technical prowess, showcased in The Order: 1886 (2015), continues here, with the studio once again adapting to a new platform, developing an impressive physics-based movement system that plays to the strengths of VR motion controllers.

Jake Copenhaver, lead gameplay programmer at Ready At Dawn, recently gave a presentation at GDC about Lone Echo’s animation and locomotion system, entirely focused on the first-person perspective. A teaser video (heading this article) shows the inspiration (real footage of astronauts aboard the ISS), an early prototype of the movement system, and the current implementation.

Zero-gravity movement is ideal for VR’s current motion tracking solutions, as astronauts mainly propel themselves around using their arms rather than their legs, grabbing handles or surfaces to pull or push. Lone Echo achieves this form of locomotion by positioning the head 1:1 relative to the hand as soon as it grabs something. This connection allows the player to pull towards or push away from objects in any direction, intuitively navigating the environment with physics-based interactions.
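
To make that mechanism concrete, here is a minimal sketch of grab-based zero-g locomotion in plain Python (tuple vectors, no engine API); it is an illustration of the general idea described above, not Ready At Dawn’s code. While a hand grips a fixed anchor, the whole player rig is offset each frame so the gripping hand stays pinned to the anchor, and that per-frame motion becomes the coasting velocity once the grip is released.

```python
# Hypothetical sketch of grab-based zero-g locomotion; not Lone Echo's implementation.

def vsub(a, b): return tuple(x - y for x, y in zip(a, b))
def vadd(a, b): return tuple(x + y for x, y in zip(a, b))
def vscale(v, s): return tuple(x * s for x in v)

class ZeroGBody:
    def __init__(self, position):
        self.position = position          # player rig origin in world space
        self.velocity = (0.0, 0.0, 0.0)
        self.grab_anchor = None           # world-space point the hand latched onto

    def begin_grab(self, anchor_world):
        self.grab_anchor = anchor_world

    def release(self):
        self.grab_anchor = None

    def update(self, hand_local, dt):
        """hand_local is the tracked hand position relative to the rig origin."""
        if self.grab_anchor is not None:
            # Keep the gripping hand pinned to the anchor: pulling the real hand
            # toward your chest drags the whole rig toward the anchor, and vice versa.
            hand_world = vadd(self.position, hand_local)
            correction = vsub(self.grab_anchor, hand_world)
            self.velocity = vscale(correction, 1.0 / dt)  # remembered for release
            self.position = vadd(self.position, correction)
        else:
            # After release the body simply coasts (zero gravity, no drag).
            self.position = vadd(self.position, vscale(self.velocity, dt))

body = ZeroGBody((0.0, 0.0, 0.0))
body.begin_grab(anchor_world=(0.0, 1.5, 1.0))     # hand closes on a wall handle
body.update(hand_local=(0.0, 1.5, 0.8), dt=1/90)  # pulling the hand in moves the rig forward
body.release()
body.update(hand_local=(0.0, 1.5, 0.8), dt=1/90)  # the rig now drifts with the imparted velocity
```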

Copenhaver went into detail about the issues that needed to be solved, as the system had to work just as well on floating physics objects (of wildly different masses) as it did on fixed geometry. The next step was to develop a procedural animation system for the hands that would convincingly sell the movement and collision physics, as the player is free to grab ‘any and every surface’ in the environment. Only the ‘gun grip’ in the footage is pre-authored; every other animation is responding to objects and surfaces procedurally.

In addition to the complex finger joint optimisation, there was the challenge of using estimated inverse kinematics to animate the untracked joints, such as the elbows and shoulders, as well as the rest of the untracked body. A separate procedural animation is at play for the spine and legs, which move based on player velocity, combined with motion propagated from the player’s head and arm movement, as well as an additive layer for some animator control.
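
As a rough, purely illustrative sketch of what a procedural grip can look like at its simplest (this is not the finger joint solver described in the talk; the contact probe and curl model are assumptions), each untracked finger can be curled in small increments until a collision test against the grabbed surface reports contact:

```python
# Toy procedural finger posing: curl each finger until its tip meets the surface.
# Purely illustrative; a real solver optimises full joint chains against collision geometry.

MAX_CURL_DEG = 95.0    # assumed joint limit for this toy model
CURL_STEP_DEG = 2.5

def pose_hand(fingers, surface_contact):
    """Return a curl angle (degrees) per finger.

    surface_contact(finger, curl_deg) -> bool stands in for a real collision
    query against whatever surface the hand is currently grabbing.
    """
    pose = {}
    for finger in fingers:
        curl = 0.0
        # Close the finger until the tip touches the surface or the joint limit is hit.
        while curl < MAX_CURL_DEG and not surface_contact(finger, curl):
            curl += CURL_STEP_DEG
        pose[finger] = min(curl, MAX_CURL_DEG)
    return pose

if __name__ == "__main__":
    # Toy contact model: an uneven handle that each finger reaches at a different curl.
    reach = {"thumb": 25.0, "index": 40.0, "middle": 35.0, "ring": 37.5, "pinky": 50.0}
    print(pose_hand(reach, lambda finger, curl: curl >= reach[finger]))
```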

The full presentation slides are available to view or download.


Oculus Publishes Touch CAD Files for Custom-made Accessories

Oculus has provided the ‘Touch Accessory Guidelines 1.0’ for download, which contains 3D CAD files of the Touch VR controller. This data can be used to help designers and manufacturers create new accessories that integrate with the Touch hardware.

Available for download on the Oculus developer website, the Touch Accessory Guidelines 1.0 include technical drawings and STEP files of the controller’s exterior surfacing and battery compartment. In addition, they include data for the Rock Band VR connector, an adapter included with every Oculus Touch package, enabling the design of devices which could use the adapter to attach a Touch controller.

You can take a look at the CAD files here for the Rock Band adapter, the exterior surface, and the battery compartment. The battery compartment model is the most complex, as it includes many of the internal components and surfaces, which can be highlighted using the Model Browser tool.

The Rock Band adapter holds the Touch controller neatly to a Rock Band guitar, but it could be used to attach the controller to other accessories too.

The new guidelines add to the existing Rift Accessories Guidelines documentation, which includes sections for the headset, Audio Module and Facial Interface. While the Touch section doesn’t offer much in the way of controller-specific tips for accessory makers (perhaps ‘don’t obstruct the tracking ring’ was too obvious!), only detailing the electrical specifications, the general guidelines written for the Rift headset can still be interpreted and applied to Touch accessories. Avoid using LEDs in mounted accessories (to prevent tracking conflicts), and keep in mind that comfort is paramount: the fit of accessories not only impacts physical comfort but can also affect how users experience content in VR.

Valve has been proactive in opening their tracking technology to third parties for accessory development, and HTC are offering a dedicated ‘Vive Tracker’ for tracked accessories. Oculus is well behind on its promise to open up its tracking API to third parties, but using the Touch controllers as a self-contained tracker at least gets the ball rolling.

Interestingly, Touch is not much larger than the Vive Tracker, as Tactical Haptics has shown. Perhaps one of the biggest issues with using Touch to track a dedicated VR peripheral is the lack of input/output options between the peripheral and the controller. Peripherals made for use with Touch would need to rely on the controller’s own buttons for input to the game at hand, or more likely use a separate wireless connection to the host PC.



Google Patent Shows Smartphone Packaging that Doubles as a VR Headset

A recently published Google patent application describes an ‘Integrated mobile device packaging and virtual reality headset’. The concept is to provide a ‘relatively low cost’ VR headset solution by shipping the smartphone in packaging that doubles as a VR viewer.

Google introduced Cardboard, its inexpensive VR solution, to the world in 2014: a viewer enclosure for smartphones made from cardboard. Since then, over 10 million Cardboard viewers have shipped, along with many similar products, ubiquitous to the point of being distributed free as promotional items. Google’s patent describes integrating a Cardboard-like enclosure into the smartphone’s own packaging, an especially interesting idea given that the much-improved VR performance that comes with ‘Daydream Ready’ phones can also extend to VR apps made for Google Cardboard.

A Cardboard-like headset that ships with capable smartphones could act as a ‘VR lite’ option, with those especially interested in VR able to upgrade to a more advanced viewer like Google’s Daydream View. Such an approach could help the company achieve their goal of ‘hundreds of millions of users in a couple of years’ using VR on Android.

A few companies have experimented with integrating a VR viewer into packaging, such as Coca-Cola’s DIY viewer made from 12-pack boxes, and McDonald’s ‘Happy Goggles’ made from a Happy Meal box. More substantial VR headset shells (closer to a Gear VR in build quality and materials) are sometimes bundled with smartphones, and Alcatel’s Idol 4S packaging goes a step further, using the shell as part of the ‘unboxing’ experience in some regions, as shown in this video.

Originally filed on February 24th 2016, Google’s patent refers to cardboard or heavy paper stock for the main portions of the unit, and ‘glue’ and ‘tape’ are suggested several times as appropriate adhesive material; the quality of the enclosure they have in mind is probably closer to a Cardboard viewer than their Daydream View unit, although plastics and fabrics are also mentioned.

It isn’t clear whether this patent is related to Google’s recent hiring spree for AR/VR hardware expertise, which appears to point to significant new AR/VR hardware on the way from the company.


The Guru 360° is a Clever Portable Stabiliser for 360° Cameras

The Guru 360° is the first 3-axis gimbal designed specifically for 360-degree cameras. Adding to the flexibility of the MOZA interchangeable gimbal system, the unit works with all lightweight 360-degree cameras with minimal obstruction to the field of view.

360-degree or spherical video has played a significant role in the growth of modern VR, introducing the concept of ‘free-look’ to the mainstream consumer through social media, thanks to 360 video support on platforms like YouTube and Facebook. Accessible on even the most inexpensive, Cardboard-style VR headsets, and boosted by the advent of capable entry-level cameras like Samsung’s Gear 360 and Ricoh’s Theta S, 360 video’s popularity is growing rapidly.

Unfortunately, the limitations of current 360 capture mean that, viewed in VR, there are several optical inaccuracies at play, even when using more advanced stereoscopic hardware; some would argue that it doesn’t really qualify as VR. As such, some footage can be quite uncomfortable, causing nausea. Video stability is a major culprit, and 360 cameras severely exacerbate the problems of handheld capture. Watching footage of someone walking with a 360 camera can be a horrendous experience in a headset.

Major advancements are being made in software, such as the real-time stabilising feature coming to Insta360 devices, but the most reliable hardware solution is a motorised gimbal, which actively stabilises the shot. Typically these are designed with smartphones and action cameras in mind, meaning that parts of the mechanism obstruct a chunk of a 360 camera’s field of view.

The Guru 360° by GimbalGuru, introduced in this video, is the first 3-axis gimbal designed specifically for 360-degree cameras, minimising any obstructions. It supports lightweight 360 cameras like the Gear 360, Nikon KeyMission, Ricoh Theta S and Kodak Orbit, correcting unwanted movement, rotation, roll, and horizon drift, and offering ‘follow’, ‘lock’, and ‘head lock’ filming modes. Viewing footage captured with this device in a VR headset should be considerably more comfortable, and the stabilisation should also improve image quality, with cleaner stitching and reduced artifacting.
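
As a rough picture of how the ‘lock’ and ‘follow’ modes differ, the single-axis sketch below shows target-angle logic only; it is a simplified assumption about how such modes typically behave, not GimbalGuru’s firmware, and a real gimbal closes the loop with IMU feedback and motor control.

```python
# Simplified, assumed single-axis mode logic for a camera gimbal (yaw only).

def target_yaw(mode, handle_yaw, locked_yaw, current_target, follow_rate=0.05):
    """'lock' holds a world-fixed heading regardless of handle twist;
    'follow' eases toward the handle's heading, keeping deliberate pans
    while smoothing out hand shake."""
    if mode == "lock":
        return locked_yaw
    if mode == "follow":
        return current_target + follow_rate * (handle_yaw - current_target)
    raise ValueError(f"unknown mode: {mode}")

# A shaky 10-degree handle twitch barely moves the followed heading...
print(target_yaw("follow", handle_yaw=10.0, locked_yaw=0.0, current_target=0.0))  # 0.5
# ...and doesn't move the locked heading at all.
print(target_yaw("lock", handle_yaw=10.0, locked_yaw=0.0, current_target=0.0))    # 0.0
```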

Designed around the proven MOZA interchangeable gimbal system, the handle is compatible with the Mini-C and Mini-G heads, meaning users can easily upgrade their gimbals to support 360 cameras, smartphones and action cameras. The Guru 360° is currently available for order on OwlDolly.com and GimbalGuru.com for $299, with a limited first run of stock shipping in mid-March.


CNN Launches New Immersive Journalism Initiative, CNNVR

CNN has officially launched CNNVR, a new immersive journalism unit and virtual reality platform with CNN Digital. Capturing major news events in 360 video, the global team will produce weekly virtual reality experiences and live streams.

News organisations have tested the VR waters in recent years; examples include CNN’s work with NextVR on a live presidential election debate, ABC News’ collaboration with Jaunt VR exploring Syria, and The New York Times’ work with Within on Walking New York. Over the past year, CNN has produced over 50 news stories with associated 360 video, generating over 30 million views on Facebook alone, the company says.

In a major new commitment to the medium, CNN has officially launched CNNVR, a new journalism unit that aims to produce immersive VR content across multiple platforms and devices. The CNNVR app has been available on Google’s Daydream VR platform since its November launch last year, and is now publishing content to the Samsung VR app on Gear VR and the Oculus Video app on Rift too. In addition, CNN’s iOS and Android apps now support 360 video, along with the desktop site CNN.com/VR.

Speaking at VR Day Atlanta in January, CNN’s Head of VR, Ed Thomas, described VR and 360 video as a ‘reinvention of the news story telling process’. He highlighted how CNNVR delivers on the promise of CNN’s long-running ‘Go There’ brand campaign, placing the user in hard-to-reach places or making them feel like an active participant in a news story.

CNN’s VR output is ramping up to a daily production cycle, with a key aim of delivering regular VR live streams of global events; teams are already set up in New York, Atlanta, London, Hong Kong, San Francisco, Dubai, Johannesburg, Tokyo, and Beijing, the company says.

The launch was marked with a powerful, thought-provoking use of 360 video capture—a compelling production about the controversial traditions of bull running and bull fighting in Pamplona, Spain.


‘Obduction’ Adds Motion Control, Coming to Vive and Oculus Touch This Month

Cyan’s spiritual successor to Myst is launching with all-new motion control support on March 22nd for HTC Vive and Oculus Touch. The game originally released on Steam in August 2016, receiving initial VR support for the Rift in October.

Following in the legendary footsteps of Myst and Riven, Obduction presents an ideal VR setting, taken at a slow pace, encouraging players to study the environments carefully, finding clues to solve puzzles in a curious new world. The original VR support for Oculus Rift began as a stretch goal during the game’s 2013 Kickstarter campaign, and arrived in October 2016, a couple of months after the standard game launched on Steam. The game received a free update and launched on the Oculus Store at the same time, and was praised for its visuals and puzzle diversity.

Using the ‘blink’ teleport feature, the game feels the most like Myst, although a free movement mode with snap turning was also available; a November update added a smooth turning option for those unaffected by this common contributor to VR sickness. Since then, Cyan have focused on bringing the experience to other headsets, announcing that the game would come to PlayStation VR and HTC Vive in 2017, with the major addition of motion controller support.

The new version arrives on HTC Vive and Oculus Touch on March 22nd on Steam, GOG, Humble Store, and the Oculus Store for $29.99. Existing owners will receive the update for free. Motion control should be a perfect fit for a game scattered with detailed objects to study and full of buttons and levers to interact with.

“We have over 200,000 fans on our Steam wishlist, many who have been asking for hand controls for Obduction. As a VR-centric studio, we’re thrilled to be delving even further into these platforms, bringing ever deeper immersion to our worlds and pushing the edge of what’s possible”, says Rand Miller, CEO, Cyan.

Visitors to PAX East this weekend will have a chance to preview the Oculus Touch version in the Indie MEGABOOTH, and there is a further opportunity to try the game at the Indie Corner of the SXSW Gaming show floor, from March 16th to 18th, 12-8pm, at the Austin Convention Center, Exhibit Hall 2. Rand Miller will also be taking questions on the SXSW Gamer’s Voice stage at 7:45pm on March 17th.


‘Waltz of the Wizard’ ‘Ghostline’ Analytics Reveal Some Surprising Player Behaviour

Aldin Dynamics has released a detailed breakdown of user data gathered during over 300,000 sessions of Waltz of the Wizard gameplay. Launched in May 2016, the motion-controlled VR game was developed with the insights gained from using Aldin’s own data visualisation tool Ghostline.

Dedicated to VR software development since early 2013, Aldin Dynamics is one of the most experienced VR studios in the world, having launched software on Oculus developer kits and Gear VR. In May 2016, their motion control game Waltz of the Wizard launched on Steam for free, and quickly became a popular showcase for the HTC Vive, seeing over 300,000 sessions from over 100,000 players. It is currently the highest-rated VR app on Steam.

However, the wizardry is more than skin deep. The game acts as a test bed for Aldin’s real flagship software, Ghostline. This large-scale analytics and visualisation tool has been in development since January 2015, served a vital role in prototyping Waltz of the Wizard’s level design and gameplay, and now acts as a rich data source for VR user habits within the released game. As described in this Polygon feature, Ghostline has the ability to record the actions of every user (via automatic, anonymous data collection), which can be replayed and viewed from any perspective, including the original first person view. This ‘user ghost’ visualisation is far more efficient and less intrusive than shooting video of people playing in VR, and has many other benefits in terms of detailed analysis of usage patterns and behaviour.
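
To make the ‘user ghost’ idea concrete, here is a minimal sketch of the kind of capture-and-replay loop such a tool implies; the class, file format and per-frame fields are illustrative assumptions, not Ghostline’s actual API or data model.

```python
# Hypothetical session capture for ghost replay; not Ghostline's implementation.
import json
import time

class SessionRecorder:
    """Samples headset/controller poses and gameplay events once per frame."""

    def __init__(self):
        self.frames = []

    def record(self, head_pose, left_pose, right_pose, events=()):
        self.frames.append({
            "t": time.time(),
            "head": head_pose,        # e.g. (x, y, z, yaw, pitch, roll)
            "left": left_pose,
            "right": right_pose,
            "events": list(events),   # e.g. ["grab:skully", "button:trigger"]
        })

    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.frames, f)

def replay(path):
    """Yield recorded frames in order; a viewer can drive a 'ghost' rig with these
    poses and render the session from the original first-person view or any camera."""
    with open(path) as f:
        for frame in json.load(f):
            yield frame
```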

Aldin Dynamics has now shared some of the data gathered within Waltz of the Wizard using their Ghostline technology. Since the game was designed to demonstrate room-scale VR while also accommodating standing play, some of these stats aren’t too surprising: 87% of players use a room-scale space. Understandably, play time is higher in room-scale, with session lengths 19% longer and lifetime averages 72% longer; the game is simply more engaging when players have more freedom to move around.

Ghostline’s Analytical View of WotW’s Playspace

The detailed room-scale space breakdowns by country follow some logical patterns too: countries with vast land mass like China, the USA and Canada have the largest average play areas (China is highest at 5.9m²), while densely populated Japan has the smallest at 4.4m². Some of the less specialised stats, such as audience and hardware data, are already available through Steam’s own tools, but Ghostline’s ability to combine every metric in such detail is unprecedented.

One of the most critical stats is that room-scale players physically look around 18% more than standing players, which has many implications for level and gameplay design when trying to cater to standing players, who are on average more reluctant to turn their heads. The vast quantity of interaction and movement data available to Ghostline allows for a granular analysis of players’ physical behaviour. Within each scene of the game, it displays data relating to the amount of physical locomotion, button presses, and head movement in degrees. The Wizard’s Tower scene, which contains the spell mixing table, scores the highest on interactivity, while the Hallway, which presents a sudden change of atmosphere ‘designed to induce a fight or flight response’, results in the highest level of physical movement.

The granularity continues into the more amusing stats. Of course, nobody can resist causing damage: over 19 million crossbow bolts have been fired, and over 14 million fireballs have been cast. The wizard’s assistant has been shot over 29,000 times, while ‘Skully’ has been thrown out of the window by 5% of players and drowned in the cauldron 17% of the time. Some stats may seem trivial, but as Aldin explains, “the smallest of details can make or break an experience. For this reason it is absolutely vital to pay careful attention to the user experience and ensure that your content is having the exact impact that you envision”. Aldin believes that analysing at the level of detail Ghostline provides is key to making a great experience, and Waltz of the Wizard’s unmatched 99% approval rating on Steam is testament to that theory.


‘AirMech: Command’ Gets Major Oculus Touch Update, Launches on Steam VR

VR motion control comes to AirMech: Command, and the game has been released on Steam with support for multiple VR headsets and controllers through OpenVR. The game originally appeared as a launch title for the Oculus Rift in March 2016.

Drawing direct inspiration from the pioneering real-time strategy title Herzog Zwei, AirMech started life as a free-to-play game on PC in 2012, where it has remained in open beta. Optimised for gamepad control like the Mega Drive/Genesis classic that inspired it, AirMech naturally found its way to Xbox and PlayStation consoles in the form of AirMech Arena in 2015. As Oculus launched the Rift with a seated, gamepad-controlled focus, the game was again in an ideal position to transition to a new platform, and AirMech: Command became an exclusive launch title for the headset on March 28th 2016. The game was largely well received, showcasing VR’s suitability for the RTS and MOBA genres.

Today, Carbon Games released a major update adding support for Oculus Touch controllers; existing owners of the Rift version receive it for free. With the timed exclusivity complete, the game has also launched on Steam with full OpenVR support. As shown in the teaser trailer, the motion controls allow for brand new ways of interacting with units and navigating the battlefield, described by the creators as ‘a huge game changer for RTS games in VR’.

By using two virtual cursors, Carbon have devised a way of amplifying hand movements for faster control, and the zoom and rotate functions mean that you can play in a single spot like a board game (seated VR is still supported) or walk around a massive world in room-scale VR.
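
The amplified cursor idea can be pictured as a gain applied to the hand’s offset from a pivot point; the snippet below is a guess at the general technique (the function and gain value are assumptions, not Carbon Games’ implementation).

```python
# Assumed illustration of amplified cursor control, not AirMech: Command's code.

def cursor_position(pivot, hand, gain=3.0):
    """Map a tracked hand position to a virtual cursor that moves `gain` times
    as far from the pivot, so small wrist motions sweep a large battlefield."""
    return tuple(p + gain * (h - p) for p, h in zip(pivot, hand))

# A 10 cm hand movement to the right moves the cursor roughly 30 cm to the right.
print(cursor_position(pivot=(0.0, 1.2, 0.0), hand=(0.1, 1.2, 0.0)))
```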


SteamVR Gets Beta Development Build with Support for Linux

The beta channels for the Steam client and SteamVR now support Linux. The selection of Linux-supported SteamVR games is very limited for now, but Valve says the development build aims to let developers begin making VR content that supports the open source operating system.

Valve recently provided some positive news for VR on Linux, which has seen fairly limited support since Oculus ‘paused’ its Linux development in 2015. Announced late last year, a SteamVR developer build for Linux has now launched. According to the GitHub page, “This is a development release. It is intended to allow developers to start creating SteamVR content for Linux platforms. Limited hardware support is provided, and pre-release drivers are required”.

Linux support is long overdue (it was originally planned to be available at the HTC Vive’s launch), and according to Michael Larabel, founder of Phoronix, a leading source of Linux-related reviews and insights, the beta is ‘more like an early alpha’, as Valve’s own development-release warning suggests, and took quite some time to configure before everything was working. Larabel points out that Linux represents a very small percentage of the overall gaming market, making Linux VR gaming a niche within a niche that will take some time to gain traction.

Valve recommends using the issue tracker on the project’s GitHub page for reporting specific bugs, while general discussion and questions are handled on the Steam Community forum. In the long run, there is hope for significantly improved Linux support resulting from the recently named OpenXR API, an industry-derived open standard for VR and AR from the Khronos Group, which aims to stem hardware and software fragmentation as the industry continues to grow.
