Massive Particle Simulation Playground ‘Chroma Lab’ Launches on SteamVR

VR design is still in the very early stages, but something that we know leads to immersion in VR is interactivity. One developer is taking interactivity to the extreme with Chroma Lab, a VR experience now available on SteamVR that lets you play with hundreds of thousands of simulated particles in real-time.

Update (8/23/17): Chroma Lab is now available. This article, which was originally published on 3/28/17 and overviewed the game before it was launched, has been broadly updated with the latest information.

VR gives us the opportunity to simulate the real world and its physical laws to step into situations that we couldn’t otherwise practically experience, like driving a racecar or flying a plane. But what about simulations of the impossible, like commanding hundreds of thousands of floating pulsating rainbow particles?

Developer Sean Tann built Chroma Lab to answer that question. The game launched this week on SteamVR with support for the HTC Vive and Oculus Rift, and will soon come to the Oculus store.

Tann calls Chroma Lab a “particle physics sandbox,” and says that more than 100,000 particles are simulated in VR at 180Hz using a custom GPU-accelerated physics engine. He says the physics engine was written using HLSL compute shaders and that the game doesn’t rely on any vendor-specific rendering technology, and is therefore optimized to run on any VR-ready GPU, be it AMD or NVIDIA.

Image courtesy Sean Tann

The initial experience launched this week with a 10% launch discount, pricing it at a modest $4.50. The experience is “a toy for now,” but the developer is considering adding game modes and DLC in the future. Here’s what Tann calls the game’s key features:

  • Beautiful, psychedelic visuals
  • Particles react to background music (any external music player is compatible, though it may not work with Bluetooth headphones)
  • Tools to pick up, hit, pull, explode, shoot and paint the particles
  • Placeable force spheres which can also teleport the particles between one another
  • Save and load scenes and settings, with a few presets to get started
  • Adjustable physics settings to change how the particles behave
  • Multiple different particle shaders, color palettes and other graphics settings to choose from
  • Throw blobs into orbit and create black holes
  • Optional gravity and “lava lamp” mode
  • Freeze the simulation and step through it or slow time to a crawl
  • Great as a first VR experience
  • MixCast support for easy mixed reality
  • Native Oculus Rift support
  • Number of particles can be automatically determined or manually chosen
  • Scale and bounding walls can be adjusted allowing Chroma Lab to be played from sitting to room scale

Without imposing the high-fidelity rendering bar required for VR, Tann says the physics engine written for Chroma Lab is capable of simulating 1.6 million particles at 60Hz on an AMD R9 290 (a four-year-old GPU); today’s high-end cards would presumably be able to push that much further. A video from the developer, which shows the Chroma Lab physics engine at an earlier stage of development, shows 400,000 particles simulated at 60Hz:

Responding to comments on Reddit about the earlier Chroma Lab teasers, Tann explains a bit more about the simulation and its limitations:

There is no limit to the amount of particles apart from VRAM and size. Ignoring overhead, rendering and counting sort, computational time is O(N) for all particle stuff and interactions (well apart from one). Increasing play space also slightly increases comp time due to counting sort and it massively increases VRAM usage. I could use a repeating grid for infinite playspace but it is not necessary for my game and having the particles in a fixed grid will be useful for future physics additions.

Assuming the fluid is not compressed, internal particles interact with about 55 others per iteration consistently.

All the particle calculations and data manipulation runs entirely on the GPU. The CPU barely does anything.

Tann is a recent electrical engineering master’s graduate who says he’d been developing the project in his spare time at university since January 2016, except for the last few months leading up to launch, when he opted to go full time on it. I wouldn’t have guessed it, but Tann says it’s his first time developing a game, and that the impetus behind the project was to use it as a learning experience.

SEE ALSO
NVIDIA PhysX Demo Shows Rigid Body Physics Interactions on a Massive Scale

Chroma Lab‘s mesmerizing and reactive visuals remind us of the work of Cabbibo, who was recently interviewed on the Voices of VR podcast.

The post Massive Particle Simulation Playground ‘Chroma Lab’ Launches on SteamVR appeared first on Road to VR.

Doraemon ‘Anywhere Door’ Uses Simple Props for a Brilliant VR Experience

‘Doraemon’ is a popular manga and anime involving a robotic cat who travels back in time to help a young boy. Among Doraemon’s many tricks from the future is the ‘Anywhere Door’ which can be opened to transport people… well, anywhere. A VR take on this idea uses simple props to make a unique VR experience.

Bandai Namco’s ‘Project i Can‘ has made the Anywhere Door a (virtual) reality using the HTC Vive, Leap Motion, and a few simple props tracked with attached Vive controllers. The result is a compelling experience that turns the ordinary into the extraordinary. In the video heading this article you can get a glimpse of the Doraemon Anywhere Door experience which saw recent installations in out-of-home VR locations in Japan.

The experience makes use of a real door and a desk. Both real life props are aligned with their virtual counterparts and serve initially to anchor the user in a rather mundane virtual room. But then the users are asked to open the door—which they do by physically reaching out and opening the real door—which reveals a portal into an entirely different space: a vast arctic environment filled with icebergs and ocean. As users step through the door frame they enter into that environment but can still see the room through the doorway behind them.

The Leap Motion hand-tracking sensor is used not only to let the users see their hands in VR—making it easier for them to grab the doorknob with their own hands—but also frees up the spare Vive controllers to be used as simple motion trackers to align the opening of the real door with the virtual door (and later the desk drawer too).

As users walk forward into the arctic space and onto the iceberg, the ice around them crumbles, which serves not only for a little scare, but also smartly keeps them from bumping into the boundary of their tracked space (which would otherwise break the illusion of the vast area before them).

As users walk back through the door and into the room, they can look on the other side of the door and see through it as if it’s just an empty door frame. The combination of the ‘real’ door prop and its ‘impossible’ virtual abilities makes for deep immersion.

Another scene seen through the door puts the users on top of a train with a tunnel rushing right at them. As the tunnel nears, it becomes apparent that the users are on a collision course, though they can be saved by ducking.

The same real/virtual prop technique is used for the nearby desk, which has a drawer that, when opened, reveals a portal into a space that’s deeper than the drawer itself could possibly be. Smartly, the drawer bottom has been removed, which allows the users to reach ‘through’ the bottom of the drawer, selling the illusion of impossible physical geometry.

The mix of real props and virtual reality is taken to the extreme at places like The VOID, but the Doraemon Anywhere Door experience shows that similar techniques can be applied even in a limited room-scale space with little more than some simple props and a heap of creative thinking.


‘Starfighter Inc.’ Promises VR Space Combat Sim Built on a Hard Sci-Fi Foundation

Starfighter Inc., said to be the spiritual successor to X-Wing vs. TIE Fighter (1997), is one of the most interesting space combat simulation projects currently in development. Recently relaunched on Kickstarter, developers Impeller Studios’ new crowdfunding campaign has a smaller goal but much more to show.

Impeller Studios formed in 2013, launching their initial Kickstarter campaign for Starfighter Inc. in 2015, looking to raise $250,000 to assist the development process. Unfortunately, the campaign failed to reach its goal, stopping just 10% shy of the required funds. The team had already committed to completing the project, so development continued as promised, and now the project is back on Kickstarter, with a modest $150,000 goal, and a “stronger, clearer, and more detailed vision”. This time, the campaign’s path to success looks much more likely; with a healthy 21 days remaining, the project is nearing the halfway point of its fundraising goal.

Starfighter Inc. Kickstarter

The renewed appetite for space-based cockpit games over recent years has been partly fuelled by the resurgence of virtual reality technology, demonstrated by the success of Elite Dangerous, Star Citizen, and EVE: Valkyrie (2016). But Starfighter Inc. intends to tread a unique path, aiming to be the most realistic space combat simulation ever created.

Set in the year 2230, every piece of future technology has been imagined and extrapolated from a logical, engineering perspective, meaning “no stealth, no artificial gravity, no force fields, no FTL drives,” according to project designer David Wessman, in an interview with The Escapist. Involved with all four iconic X-Wing games, Wessman leads a team of industry veterans at Impeller, alongside Jack Mamais, lead designer on Crysis (2007).

“Pure, unadulterated tactical simulation combat” is how Starfighter Inc. is described by the studio, with its gameplay focused around real Newtonian physics and heavy component damage simulation. The game aims to be more realistic than Star Citizen or Elite Dangerous, yet its PvP multiplayer space warfare centres around dogfighting, something that would be far more approachable with arcade-style physics, as found in EVE: Valkyrie. A brave combination, one that could be overly intimidating for the casual gamer, but Impeller’s hard science objectives are set in stone.

“All the game rules are based around hard science fiction, we’re not going to cater to a mass market, we’re going for a specific type of game”, insisted Mamais in a 2015 developer Q&A session. “When you finish playing our game, I think you’ll be able to really fly a ship in space, let’s put it that way.”

SEE ALSO
'Assetto Corsa' Racing Sim Adds Support for Vive & OSVR via Native OpenVR

Perhaps it will ultimately pay off, as virtual reality and simulation go hand in hand; experiencing realistic physics while using a VR headset intrinsically feels correct, so there’s no doubt that VR will be the recommended way to play the game. VR support is comprehensive, covering all the major PC hardware – the HTC Vive, Oculus Rift and OSVR. Whether the unforgiving combat systems, frictionless flight model and dedication to hard science consolidate into a compelling product remains to be seen, but Impeller Studios’ boundless enthusiasm suggests they’re onto something they believe in deeply.


‘Assetto Corsa’ Racing Sim Adds Support for Vive & OSVR via Native OpenVR

This week’s v1.13 update to Assetto Corsa includes native OpenVR support, adding official Vive and OSVR support to the title. The popular racing simulator previously only supported the Oculus Rift, and HTC Vive owners had to use the Revive hack to launch the game in VR.

Assetto Corsa has a long history with virtual reality, having supported the original Oculus Rift development kit (DK1) in 2013, back when the simulator itself was in its early stages. With major physics engine and content development to focus on, VR support was always in an ‘early’, unfinished state, but its core driving experience was so convincing that it was worth jumping through the hoops required to race in VR.

SEE ALSO
4 Wheel Recommendations for Newcomers to VR Sim Racing

Rift owners have enjoyed a more effective VR implementation since developer Kunos Simulazioni applied the v1.6.1 update in May 2016, which allowed for full interaction with the in-game UI and the various HUD ‘apps’. Since then, the state of VR in the sim has remained largely the same, with no sign of native support for the HTC Vive, or indeed any VR menus. The Revive injector quickly enabled unofficial Vive support, but it has never delivered the same performance as the Rift, and seemed to exhibit odd world scaling issues.

Today’s v1.13 update brings some important improvements to the software, including much-requested multiplayer features such as reversed grids and mandatory pit stop functions for server admins, and most significantly for VR enthusiasts, the HTC Vive is now supported natively via OpenVR as well as the OSVR HDK. Full details of the update can be found on the game’s Steam page.

Photo courtesy Kunos Simulazioni

The VR support is in ‘beta’, and while the performance on Vive has improved, the Rift remains smoother still, and OpenVR appears to exhibit similar world scale problems seen when used with Revive. In addition, the recent v1.12.3 update that allowed audio to follow head rotation is not functioning on OpenVR. So for now, the native Rift support remains the stronger VR experience, but with OpenVR actively receiving attention from Kunos, the gap will hopefully begin to close.


‘Warpaint’ Brings Tabletop Gaming-inspired Turn-based Strategy & Customization to VR

Recently launched on Steam, Warpaint is a turn-based strategy game that lets you customize your troops by painting them to your taste. The game supports both Mac and PC, and features VR support via SteamVR (Vive, Rift, and OSVR). Taking inspiration from board and tabletop games like hexagonal chess and Warhammer, Warpaint’s gameplay is easy to pick up but hard to master.

Real-time strategy gaming seems to suit VR pretty well, but few developers have chosen the turn-based approach; Warpaint shows that the more sedate pace of turn-based strategy is a natural fit for VR too. You command an army of dwarves with different movement abilities; the gameplay is mostly tactical, and played with a surprisingly high degree of tension, thanks to the ever-present threat of catapults.

With red projectiles reminiscent of the balls from board game Weapons and Warriors, the catapults introduce a dextrous aspect to Warpaint’s gameplay that, like many action board games, benefits from a skilled aim and a bit of luck. Catapults have the potential to change the momentum of a game—pulling off a double kill with a lucky bounce for example—but there is always a danger of taking out your own pieces too.

Friendly fire triggers some amusing ‘sorry!’ and ‘whyyy?’ dialogue from the dwarves; the voice acting is a stand-out feature, adding a healthy slice of charm to what is otherwise a rather plain presentation. I’d like to hear a wider selection of dialogue, perhaps even a battle announcer. Certainly a narrator for the tutorial would be welcome.

Warpaint doesn’t have the graphical chops to produce the most enticing screenshots or footage, but the straightforward style at least works effectively in VR with clean edges and high performance. No doubt the game would make a better first impression with a few additional effects, combined with a more integrated visual design for the UI and in-game motion controller models.

Otherwise, the game presents itself as a solid production, with well-balanced gameplay and a great set of features, including ranked matchmaking. You can play online or locally, against friends or AI, with VR users and monitor users playing together. The VR implementation isn’t attempting to reinvent the wheel; it’s simply an effective and compelling option. While the game allows instant teleporting around the play area at multiple scales, the most useful tactical view is from above and at a distance, meaning that monitor users shouldn’t feel at a disadvantage, although I did find it easier to gauge my catapult shots using a headset.

It would be useful to have a way of adjusting the distances between the camera scale toggles, as the lower option sometimes feels too close and the next height up too far away. An option to rotate the view during or after a teleport could also help those using ‘standing’ VR mode.

Aside from the joy of firing catapults in first person, the Army Painter system is the most interesting use of VR in the game, which allows dwarf customisation in the same way one might paint Games Workshop miniatures. Rotating the piece in one hand while airbrushing the fiddly bits with the other captures the feel of the hobby in a satisfying, impressively robust way. A system allowing for full customization, including limb-posing and accessorizing with different pieces of armor and weapons would take things to a whole new level and we hope it will be considered for future versions of the game.

Warpaint’s modest asking price is perhaps representative of the fairly small amount of content available, but it is a game made with care that deserves your attention. Developer Adam Thompson has been actively responding to initial feedback, having already rolled out fixes and improvements: the full details are available on the game’s Steam page.


‘Raw Data’ Early Access Review, Now with Oculus Touch Support

Raw Data, a first-person combat game from Survios currently in Early Access, is one of the most fast-paced and exciting games out for HTC Vive and Oculus Touch right now. Far from being a simple shooting gallery, Raw Data gives you an impressive range of abilities and physical agency, making you feel like you’re in real danger. And if you can master the controls, you’ll feel like a superhuman badass too.


Raw Data Details:

Official Site
Developer: Survios
Publisher: Survios
Available On: HTC Vive (Steam), Oculus Touch (Home)
Reviewed On: HTC Vive, Oculus Touch
Release Date: July 15th, 2016 (Vive), March 16th, 2017 (Touch)


Note 03/16/17: The article has been updated to include impressions of the game’s recent support for Oculus Touch; you’ll find those in a section at the bottom, along with a section discussing updates since the game’s July 2016 launch on Steam. The original impressions are otherwise untouched, save the insertion of ‘Oculus Rift/Touch’ where needed.

Note 07/18/16: This game is in Early Access which means the developers have deemed it incomplete and likely to see changes over time. This review is an assessment of the game only at its current Early Access state and will not receive a numerical score.


Eden Corp, your standard “we’re not evil” evil corporation, is oppressing the world, and it’s your job as a member of hacker group SyndiK8 to infiltrate them. Choosing your character—the gun-wielding ‘gun cleric’ Bishop or the katana-swinging ‘cyber ninja’ Saija—it’s your job to extract massive amounts of data and defend vulnerable data cores so you can expose Eden Corp for what they really are, a “we say we’re not evil, but in all actuality we’re super evil, and you probably should have known that already” type of company.

Oh. And they have killer robots.

Gameplay

Although Raw Data is essentially a wave-shooter, it’s anything but simple, as it presents an engaging blend of tower defense elements, special unlockable moves, and a multiplayer mode that will have you battling alongside your friends on Steam or Oculus Home. Yes, that’s cross-platform, folks.

There’s a real sense of immediate danger in Raw Data too. I don’t know if it’s the fact that the game’s robot adversaries are well over 2 meters tall, or that they creep forward with seemingly no regard for their own safety, or that they’re constantly firing lasers at my face, or that they start punching when they close in—but it’s safe to say that Raw Data put me in a real panic the first few times I played.


If you choose Bishop, it’s best to practice with your pistol back at the starting point before you jump right in, because once you’re in a mission the learning curve gets steep fairly quickly. Because robots. Are. Everywhere. And if you don’t immediately understand how to reload consistently, you’re due for a robo-beating.

Later on in the game I learned how to reload my pistols instantly by touching them to my hip/lower back, but the early manual reloading—using one hand to pull out a magazine and slide it into my pistol—was pretty frustrating. Several times while ducking behind a barrier to hide from an onslaught of baddies, I somehow ended up swapping my empty pistol into my non-dominant shooting hand, which is super frustrating when you have a load of enemies firing lasers and punching you in the face. It happened consistently enough to make me more careful about reloading, and to keep an eye on my bullet counter so I didn’t run dry at a critical moment.

Then again, if you do fumble a reload, you can always punch them. No, really. You can punch a robot in the face to death. This is great when it works, which isn’t all the time, and the same goes for Saija’s swords.


Using the sword should probably be the easier and more gratifying of the two, and Saija’s energy katanas sound good on paper if you’re the sort of person who wants to dispatch your enemies up close and personal, ninja-style. I didn’t feel like they always worked as they should though, as slashing at a target sometimes didn’t register a hit. Thankfully you can also fire ranged weapons like ethereal shurikens, and even toss your swords like boomerangs, which are both reliable. If only up-close combat was.

Whether you’re slicing or shooting though, detaching an evil robot’s head from its body and seeing purple fluid spurt out gives me a clear sense of accomplishment. And getting through all four, which took me well over 3 hours, was an even bigger one, requiring me to recruit the help of a friend to accomplish.

Since it’s in Early Access, there are currently only two heroes (see update section), but Survios told us that at least two more are coming out with the game’s full release. They also gave us a better look at the individual abilities and weapons in our deep dive with the Raw Data devs if you’re interested in a more detailed look at the game.

Immersion

As far as VR first-person shooters go, Raw Data is probably the most feature-rich out there. The world is cohesive and clearly approaching what I would call ‘AAA level’ of polish. That said, there are a few things that may thwart your attempts at feeling fully immersed in the space, all of which are no real fault of the game itself.

See Also: 5 Minutes of Blistering ‘Raw Data’ Gameplay, Steam Early Access July 14th

Avatars in multiplayer are kind of wonky. Because both the Vive and Oculus Rift only have three tracking points (the headset and two controllers), Raw Data is essentially making its best guess at the position of your full body. It does this by using inverse kinematics (IK)—a method of predicting how your joints bend—and then cleverly blending animations to smooth out any accompanying strangeness. That doesn’t always stop elbows and knees from bending the wrong way in VR though, making you look weird to your friends in multiplayer. This is, however, pretty much unavoidable when dealing with full-body avatars using the Vive’s provided gear, so you certainly can’t knock Survios for putting their best effort forward.
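At its simplest, the IK step described above is the classic two-bone solve: given where the hand (controller) sits relative to the shoulder, the law of cosines pins down the elbow and shoulder angles. Survios’s actual solver is not public; the following is a hypothetical 2D sketch of that core math, with names of my own choosing:

```python
import math

def two_bone_ik(target_x, target_y, upper_len, lower_len):
    """Solve shoulder and elbow angles so a two-segment planar arm reaches the target.

    Returns (shoulder, elbow_bend) in radians for an arm rooted at the origin;
    the target is clamped to the arm's reach if it is too far away.
    """
    def clamp01(v):
        return max(-1.0, min(1.0, v))

    dist = math.hypot(target_x, target_y)
    dist = max(min(dist, upper_len + lower_len - 1e-9), 1e-9)  # clamp to reachable range
    # Law of cosines: interior angle at the elbow of the shoulder-elbow-hand triangle
    cos_elbow = (upper_len ** 2 + lower_len ** 2 - dist ** 2) / (2 * upper_len * lower_len)
    elbow_bend = math.pi - math.acos(clamp01(cos_elbow))
    # Shoulder: aim at the target, then rotate back by the triangle's shoulder angle
    cos_shoulder = (upper_len ** 2 + dist ** 2 - lower_len ** 2) / (2 * upper_len * dist)
    shoulder = math.atan2(target_y, target_x) - math.acos(clamp01(cos_shoulder))
    return shoulder, elbow_bend
```

In 3D there is a leftover degree of freedom (the elbow can swivel around the shoulder-to-hand axis), and that is exactly where the guesswork and animation blending come in.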

Robots sometimes clip through you. On one of the levels (I won’t say which as to avoid spoiling the fun) you’re introduced to crawling, zombie-like robots. Their beady glowing eyes stare at you as they crab-walk in from the darkness, predictably scaring whatever bejesus you may still have retained from the previous level. That is until they jump at you and clip through your body, breaking the illusion. It’s clear that AI just isn’t good enough yet to guarantee that enemies will react to your physical movements, or anticipate where you’ll be next.

These are relatively minor gripes when talking about immersion, and aren’t unique to Raw Data.

Comfort

Teleportation is one of the best ways to get around in VR in terms of comfort, and Raw Data has a special take on it that has some interesting trade-offs. You don’t actually blink-teleport, but rather you quickly glide to your chosen spot. Because the game uses plenty of particle effects, and the transition is quick enough, danger of motion-induced VR sickness (aka ‘sim sickness’) is pretty minimal, but more than you would experience with blink-teleportation. This, I felt, keeps you more present in the game by letting you keep an eye on the action as it happens around you so you can better plan your next split-second attack.
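Mechanically, that glide is just a very fast interpolation of the player rig from its current position to the chosen destination over a fraction of a second, rather than an instant cut. A hypothetical sketch of the idea (the duration value is an illustrative guess, not Survios’s):

```python
def glide_position(start, end, elapsed, duration=0.15):
    """Player rig position along a quick linear glide from `start` to `end`.

    A short duration keeps motion-induced discomfort minimal while still
    letting the player visually track where they are moving.
    """
    a = min(elapsed / duration, 1.0)  # normalized progress, clamped at arrival
    return [s + a * (e - s) for s, e in zip(start, end)]
```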

With the exception of Saija’s jump move, which launches you into the air for a high-flying downward strike, the game is surprisingly comfortable for what is shaping up to be one of virtual reality’s greatest first-person shooters.

Oculus Touch Impressions

According to Survios, the Oculus version of Raw Data—which for now only seems accessible through Oculus Home and not Steam— has been “completely optimized and reengineered specifically for its two- and three-camera tracking and Touch controls.”

If you have three or more sensors, you’re likely to experience the game’s room-scale glory just as on the Vive, letting you turn around and slash and shoot with nary a care for your IRL direction. However, if like most people you only have two sensors, you’re in for a bit of a learning curve to get past the Touch controller’s biggest out-of-the-box limitation: occlusion.

To combat this, Survios has enabled a 90-degree snap-turn, aka ‘comfort mode’, to go along with the game’s frenetic teleportation scheme, as well as an ‘arrow guardian’ to help you recognize when you’ve turned completely around and are about to lose Touch positional tracking. Thankfully, the arrow guardian isn’t at all annoying—no audio cues or big ‘TURN AROUND’ signs blocking your line of sight when you need a quick shot at an incoming robot. It simply flashes a neon arrow to get you turned back around, something that might seem garish in any other game, but works well in the high-stress, 360-degree environment of Raw Data.

Raw Data is still in Early Access, meaning small things like button mapping aren’t final. That said, I had trouble with this aspect of the Touch version.


To snap right, you press the ‘A’ button on your right controller; to snap left, the ‘X’ button on your left—logical and simple. In the thrill of the fight though, I kept instinctively reaching for the thumbstick, as in many other games. Also, because the left snap is mapped to ‘X’, I kept accidentally mashing ‘Y’, which brings up a menu screen, effectively rendering my reloading hand useless until I could figure out what I did wrong. I concede that I sometimes have what the medical field calls ‘dumb baby fingers’. Again, three-sensor setups won’t suffer my dumb-baby-fingered plight, as you can play the game with the knowledge that your Touch controllers will be tracked in room-scale.
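The snap-turn logic itself is trivial, which is part of why a mis-pressed button stings: one press rotates the rig’s yaw by a fixed step. A hypothetical sketch of the mapping described above (function and constant names are my own):

```python
SNAP_DEG = 90  # the 90-degree 'comfort mode' increment

# Mapping as described in the review: 'A' snaps right, 'X' snaps left
BUTTON_TURNS = {'A': +1, 'X': -1}

def handle_button(current_yaw_deg, button):
    """Apply a snap turn if the button is mapped; unmapped buttons leave yaw unchanged."""
    step = BUTTON_TURNS.get(button, 0)
    return (current_yaw_deg + step * SNAP_DEG) % 360
```

A discrete jump like this sidesteps the smooth rotation that commonly triggers sim sickness, at the cost of an occasional moment of reorientation.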

Despite the dumb-baby-finger learning curve and having to pay closer attention to the new arrow guardian, Raw Data on Oculus Touch can be just as fun as the Vive version.

Updates

Survios has pushed several updates for the game while still in Early Access, including a new shotgun-wielding hero (‘Boss’), greatly improved multiplayer, and a new mission called Cataclysm which the studio promises is “the most challenging level to date.” According to Survios, players on both platforms also gain access to several brand-new features, including a balancing of new and reworked abilities for heroes Saija and Boss.

You can check out all of those and more on Raw Data’s Steam announcements page.


Summary: Raw Data is a heavy-hitting, fast-paced game that’s more than just a simple wave shooter. While it presses all the right buttons with atmosphere and feel, the game is on the bleeding edge of virtual interaction, which sometimes doesn’t work as well as it should. Despite its technical flaws, it’s one of the best VR shooters currently available for HTC Vive and Oculus Touch.


We partnered with AVA Direct to create the Exemplar Ultimate, our high-end VR hardware reference point against which we perform our tests and reviews. Exemplar is designed to push virtual reality experiences above and beyond what’s possible with systems built to lesser recommended VR specifications.


‘Obduction’ Adds Motion Control, Coming to Vive and Oculus Touch This Month

Cyan’s spiritual successor to Myst is launching with all-new motion control support on March 22nd for HTC Vive and Oculus Touch. The game originally released on Steam in August 2016, receiving initial VR support for the Rift in October.

Following in the legendary footsteps of Myst and Riven, Obduction presents an ideal VR setting, taken at a slow pace, encouraging players to study the environments carefully, finding clues to solve puzzles in a curious new world. The original VR support for Oculus Rift began as a stretch goal during the game’s 2013 Kickstarter campaign, and arrived in October 2016, a couple of months after the standard game launched on Steam. The game received a free update and launched on the Oculus Store at the same time, and was praised for its visuals and puzzle diversity.

Using the ‘blink’ teleport feature, the game feels most like Myst, although freeform movement with snap turning is also available; a November update added a smooth turning option for those unaffected by this common contributor to VR sickness. Since then, Cyan has focused on bringing the experience to other headsets, announcing that the game would come to PlayStation VR and HTC Vive in 2017, with the major addition of motion controller support.

SEE ALSO
'Obduction' VR Review

The new version arrives on HTC Vive and Oculus Touch on March 22nd on Steam, GOG, Humble Store, and the Oculus Store for $29.99. Existing owners will receive the update for free. Motion control should be a perfect fit in a game scattered with detailed objects to study and full of buttons and levers to interact with.

“We have over 200,000 fans on our Steam wishlist, many who have been asking for hand controls for Obduction. As a VR-centric studio, we’re thrilled to be delving even further into these platforms, bringing ever deeper immersion to our worlds and pushing the edge of what’s possible”, says Rand Miller, CEO, Cyan.

Visitors to PAX East this weekend will have a chance to preview the Oculus Touch version in the Indie MEGABOOTH, and there is a further opportunity to try the game at the Indie Corner of the SXSW Gaming show floor, from March 16th to 18th, 12-8pm at the Austin Convention Center, Exhibit Hall 2 – plus Rand Miller will be taking questions on the SXSW Gamer’s Voice stage at 7:45pm on March 17th.

The post ‘Obduction’ Adds Motion Control, Coming to Vive and Oculus Touch This Month appeared first on Road to VR.

‘Waltz of the Wizard’ ‘Ghostline’ Analytics Reveal Some Surprising Player Behaviour

Aldin Dynamics has released a detailed breakdown of user data gathered during over 300,000 sessions of Waltz of the Wizard gameplay. Launched in May 2016, the motion-controlled VR game was developed with the insights gained from using Aldin’s own data visualisation tool Ghostline.

Dedicated to VR software development since early 2013, Aldin Dynamics is one of the most experienced studios in the world, launching software on Oculus developer kits and Gear VR. In May 2016, their motion control game Waltz of the Wizard launched on Steam for free, and quickly became a popular showcase for the HTC Vive, having seen over 300,000 sessions from over 100,000 players. It is currently the highest-rated VR app on Steam.

However, the wizardry is more than skin deep. The game acts as a test bed for Aldin’s real flagship software, Ghostline. This large-scale analytics and visualisation tool has been in development since January 2015, served a vital role in prototyping Waltz of the Wizard’s level design and gameplay, and now acts as a rich data source for VR user habits within the released game. As described in this Polygon feature, Ghostline can record the actions of every user (via automatic, anonymous data collection), which can be replayed and viewed from any perspective, including the original first-person view. This ‘user ghost’ visualisation is far more efficient and less intrusive than shooting video of people playing in VR, and has many other benefits in terms of detailed analysis of usage patterns and behaviour.
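Aldin hasn’t published Ghostline’s data format, but the ‘user ghost’ idea can be sketched in a few lines: log timestamped head and hand poses every frame, then scrub through the log to replay the session from any viewpoint. The class and field names below are illustrative, not Ghostline’s actual API.

```python
from dataclasses import dataclass, field

@dataclass
class PoseSample:
    """One frame of tracked state: a timestamp plus head/hand positions (x, y, z)."""
    t: float
    head: tuple
    left_hand: tuple
    right_hand: tuple

@dataclass
class GhostRecording:
    """An anonymised session trace that can be replayed from any perspective."""
    samples: list = field(default_factory=list)

    def record(self, t, head, left_hand, right_hand):
        """Append one frame of tracking data (called once per rendered frame)."""
        self.samples.append(PoseSample(t, head, left_hand, right_hand))

    def sample_at(self, t):
        """Return the most recent sample at or before time t, for playback scrubbing."""
        candidates = [s for s in self.samples if s.t <= t]
        return candidates[-1] if candidates else None

# Record two frames, then scrub the replay to t = 0.5
rec = GhostRecording()
rec.record(0.0, (0, 1.6, 0), (-0.3, 1.0, 0.2), (0.3, 1.0, 0.2))
rec.record(0.5, (0, 1.6, 0.1), (-0.3, 1.1, 0.2), (0.3, 1.1, 0.2))
print(rec.sample_at(0.5).head)  # (0, 1.6, 0.1)
```

Because a pose log like this is tiny compared to video, thousands of sessions can be aggregated cheaply, which is what makes the large-scale stats in the rest of this article practical to collect.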

Aldin Dynamics has now shared some of the data created within Waltz of the Wizard using their Ghostline technology. As a game designed to demonstrate room-scale VR while accommodating standing VR, some of these stats aren’t too surprising – with 87% of players using a room-scale space. Understandably, play time is higher in room-scale, with session lengths 19% longer and lifetime averages 72% longer; the game is simply more engaging when given more freedom to move around.

Ghostline’s Analytical View of WotW’s Playspace

The detailed room-scale space breakdowns by country follow some logical patterns too, as countries with vast land mass like China, USA and Canada have the largest average play areas (China is highest at 5.9m²), and the densely-populated Japan has the smallest at 4.4m². Some of the less-specialised stats such as audience and hardware data are already available through Steam’s own tools, but Ghostline’s ability to combine every metric in such detail is unprecedented.

One of the most critical stats is that room-scale players physically look around 18% more than standing players, which has many implications for level and gameplay design when catering to the standing player, who is on average more reluctant to turn their head. The vast quantity of interaction and movement data available to Ghostline allows for a granular analysis of players’ physical behaviour. Within each scene from the game, it displays data on the amount of physical locomotion, button presses, and head movement in degrees. The Wizard’s Tower scene, which contains the spell mixing table, scores highest on interactivity, while the Hallway, which presents a sudden change of atmosphere ‘designed to induce a fight or flight response’, results in the highest level of physical movement.

The granularity continues into the more amusing stats. Of course, nobody can resist causing damage – over 19 million crossbow bolts have been fired, and over 14 million fireballs have been cast. The wizard’s assistant has been shot over 29,000 times, ‘Skully’ has been thrown out of the window by 5% of players, and drowned in the cauldron 17% of the time. Some stats may seem trivial, but as Aldin explains, “the smallest of details can make or break an experience. For this reason it is absolutely vital to pay careful attention to the user experience and ensure that your content is having the exact impact that you envision”. Aldin believes that analysing at the level Ghostline provides is key to making a great experience, and Waltz of the Wizard’s unmatched 99% approval rating on Steam is testament to that theory.

The post ‘Waltz of the Wizard’ ‘Ghostline’ Analytics Reveal Some Surprising Player Behaviour appeared first on Road to VR.

‘AirMech: Command’ Gets Major Oculus Touch Update, Launches on Steam VR

VR motion control comes to AirMech: Command, and the game has been released on Steam with support for multiple VR headsets and controllers through OpenVR. The game originally appeared as a launch title for the Oculus Rift in March 2016.

Drawing direct inspiration from pioneering real-time strategy title Herzog Zwei, AirMech started life as a free-to-play game on PC in 2012, where it remains in open beta. Optimised for gamepad control like the Mega Drive/Genesis game, AirMech naturally found its way to Xbox and PlayStation consoles in the form of AirMech Arena in 2015. As Oculus launched the Rift with a seated, gamepad-controlled focus, the game was again in an ideal position to transition to a new platform, and AirMech: Command became an exclusive launch title for the headset on March 28th 2016. The game was largely well-received, showcasing VR’s suitability for the RTS and MOBA genres.

Today, Carbon Games released a major update, adding support for Oculus Touch controllers (existing owners of the Rift version receive a free update). And with the timed exclusivity complete, the product has also launched on Steam with full OpenVR support. As shown in the teaser trailer, the motion controls allow for brand new ways of interacting with units and navigating around the battlefield, described by the creators as ‘a huge game changer for RTS games in VR’.

By using two virtual cursors, Carbon have devised a way of amplifying hand movements for faster control, and the zoom and rotate functions mean that you can play in a single spot like a board game (seated VR is still supported) or walk around a massive world in room-scale VR.
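Carbon hasn’t published the details of its cursor scheme, but the core of amplified hand movement can be sketched as scaling the hand’s offset from a reference origin by a gain factor, so a small wrist motion sweeps the cursor across a large battlefield. The function name, origin, and gain value below are illustrative assumptions, not Carbon’s implementation.

```python
def amplified_cursor(hand, origin, gain=2.0):
    """Map a tracked hand position to a virtual cursor position by scaling
    its offset from a reference origin: cursor = origin + gain * (hand - origin)."""
    return tuple(o + gain * (h - o) for h, o in zip(hand, origin))

# With exact binary fractions the mapping is easy to verify by hand:
origin = (0.0, 1.5, 0.0)               # reference point, e.g. the controller's rest pose
hand = (0.25, 1.75, -0.5)              # tracked hand position in metres
print(amplified_cursor(hand, origin))  # (0.5, 2.0, -1.0)
```

A gain of 1.0 gives direct one-to-one control; values above 1.0 trade precision for reach, which is presumably why the zoom and rotate functions exist alongside it to let seated players work a whole map from one spot.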

The post ‘AirMech: Command’ Gets Major Oculus Touch Update, Launches on Steam VR appeared first on Road to VR.

‘Bigscreen’ Social Computing Space Metrics Show Big Value for VR Power Users

Bigscreen VR announced that it has raised $3 million for its “social utility” VR application. Bigscreen gives you access to your computer screen in VR, which is a deceptively simple idea but one that is unlocking new ways of working on your computer and enabling collaborative social environments that range from virtual 2D video game LAN parties to productive work meetings.

I had a chance to catch up with founder Darshan Shankar at Oculus Connect 3 last October to talk about his founding story, and how he’s designed Bigscreen with privacy in mind through encrypted peer-to-peer networking technology that he developed. It’s a formula that seems to be working since he reports that “power users spend 20–30 hours each week in Bigscreen,” making it what Shankar calls, “one of the most widely used ‘killer apps’ in the industry.”

Those are astounding numbers for any social VR application, and the key to Bigscreen VR’s success is that they’ve been providing a more immersive and social experience of 2D content ranging from games to movies, and pretty much anything else you can do on your home computer.

The latest release of Bigscreen enables you to have up to three monitors in VR, which could provide an even better experience of working on your computer than in real life. You can stream Netflix or YouTube on a giant movie screen while playing a video game, designing an electrical circuit, browsing Reddit, or creating a 3D model in Maya. In Bigscreen, you can basically do anything that you can do on your computer screen, but in VR.

The limited resolution of today’s headsets for comfortably reading text is the biggest constraint for now, but there are plenty of other tasks that people have found are more enjoyable in VR than in real life. It’s not just the immersive nature, improved focus, and the spatial thinking potential it unlocks; in Bigscreen you can do it all with friends.

Adding a social dimension to computing in a private way is one of the keys to Bigscreen’s success. You can use Bigscreen entirely by yourself, or create a private room using peer-to-peer technology, such that what you’re actually doing in Bigscreen never passes through Bigscreen’s servers. And if you want a public cafe experience, you can create a public room and connect with hardcore VR enthusiasts from around the world to see who comes through. Users range from people looking to connect socially and casually to those recreating the increased focus that can come from working in public spaces, away from the private context of home.

Taking all that into account, and based on my own direct experiences of using Bigscreen over the last couple of weeks, I can say that Bigscreen VR is definitely a leading contender to become one of the first killer applications of VR. It’s a social utility with the potential to connect you to friends, family, romantic and business partners, as well as complete strangers who spend a considerable amount of time living in the early days of the metaverse.


Music: Fatality & Summer Trip

The post ‘Bigscreen’ Social Computing Space Metrics Show Big Value for VR Power Users appeared first on Road to VR.