TPCast Announces New ‘Plus’ Version of Wireless VR Adapters for Vive and Rift

TPCast, manufacturer of wireless adapters for HTC Vive and Oculus Rift systems, has announced an upgraded version of its eponymous wireless system called TPCast Plus. According to the company, the new Plus version, which will be demonstrated at CES this year, offers several hardware and software improvements, including a new mounting system that supports hot-swap batteries.

Cutting the cords on high-end VR is an important step for the future, and some early adopters have looked to third-party solutions such as TPCast, which already offers products to transform a Vive or Rift into a wireless system, albeit at a high price. TPCast’s initial hardware launch wasn’t the smoothest of rides, marred by delayed product availability and a rather complicated setup process. The current hardware has also had issues with microphone support, although workarounds exist.

TPCast’s first generation device, image courtesy TPCast

The TPCast Plus product family aims to address many of these problems based on customer feedback, claiming ‘full support’ for microphones and “increased stability and anti-interference.” The setup process is said to be simplified in various areas, most notably with a “plug-and-play” USB wireless adapter rather than a router. According to the press release provided to Road to VR, the new adapter “supports automatic restart, wireless interference reduction, and automatic wireless channel detection that substantially improves the stability of the wireless VR connectivity.” The restart time is also claimed to be ‘dramatically decreased’. The performance of the link appears to be unchanged, with the same ‘2K resolution per eye’ at 90 fps with sub-2ms latency.

An improved integration with the headset is achieved with a “built-in backplane” that incorporates the battery and wireless adapter into a single area while supporting a hot-swap battery. A 4-cell battery charging cradle is said to be part of a series of complementary products that enhance the TPCast Plus product family. Production of the existing TPCast solution continues, and the expected release date for the TPCast Plus, which will be available for both HTC Vive and Oculus Rift, is in “the first half of 2018.”

“The launch of the TPCast Plus Adapter positions TPCast as the leader in the wireless VR market. The consolidation of the adapter components into one wireless VR unit allows a fully immersive user experience,” says Michael Liu, CEO of TPCast. “We expect that this upgraded product family will lead to a rapid increase of VR content that takes advantage of wireless VR experiences, thus bringing more consumer and enterprise customers into the wireless VR market.”

The post TPCast Announces New ‘Plus’ Version of Wireless VR Adapters for Vive and Rift appeared first on Road to VR.

NextVR Plans 6DOF, Increased Quality, and AR Support for Live VR Video in 2018

Today, live event broadcasting specialists NextVR announced three technology advancements to their platform coming this year: six degrees of freedom-enabled content, higher resolution output, and augmented reality support. A sneak peek of the technologies is being shown to media at CES 2018.

Positional tracking is the dream for immersive video content, but it is a complex hardware and software challenge. If done correctly, the improvement over common 3DoF content is significant, both in terms of comfort and presence. NextVR claim that their 6DoF solution will make obstructed views “a thing of the past” and that users will be able to naturally shift their vantage point to look around a referee or spectator as they would in reality.

The company haven’t provided details about their process, but 6DoF support has been on their road map for some time; they spoke about the use of light field technology for this purpose back in 2015. High quality volumetric video has been demonstrated with enormous camera rigs from companies such as HypeVR and Lytro, but NextVR’s solution is likely to be more compact for the practicalities of event capture and broadcast; on-demand 6DoF content in 2018 is expected to be followed by live 6DoF broadcasting.

“VR is the most demanding visual medium ever created and we’re just beginning to deliver on its potential to convincingly create experiences that mimic reality,” says David Cole, NextVR Co-Founder and CEO. “The ability to move naturally inside the experience and the increased ability to see detail add a critical level of presence and realism.”

Higher fidelity output is coming to NextVR early this year, as a result of platform optimisations. The company says it has “exploited and enhanced the detail capture capability of its proprietary VR cameras and encoder infrastructure,” which enables “much higher resolution and higher detailed playout on compatible VR headsets.”

In addition, NextVR plans to “broadly support” AR devices in mid-2018. Exactly how NextVR’s popular live event content will be presented in augmented reality is unclear, but the company says “this cohesive blend of real and transmitted reality allows for real life social engagement while still delivering an unmatched entertainment experience.”

Launched in 2009, NextVR has many years of experience in live broadcast, transitioning from stereoscopic 3D content delivery as Next3D to a VR-focused platform. In recent years, the company has concentrated its efforts on mobile VR platforms such as Gear VR and Daydream, and only recently introduced support for 6DoF-capable hardware in the form of Windows Mixed Reality and PlayStation VR apps in October 2017. While the Oculus Rift and HTC Vive surprisingly still lack support, the company has further plans to support new hardware this year, “including affordable and powerful all-in-one mobile headsets.”

We have boots on the ground at CES, so check back for all things virtual and augmented.

Tactical Haptics’ New Prototype VR Controller Shapeshifts to Fit Your Game

Tactical Haptics’ newly developed haptic controller prototype uses mechanical sockets that allow the controllers to be mated in different configurations on the fly, in order to match a particular virtual interaction more closely than standard VR motion controllers. The controllers incorporate the company’s ‘Reactive Grip’ technology, a unique form of haptic feedback.

Image courtesy Tactical Haptics

San Francisco-based Tactical Haptics is debuting its reconfigurable haptic controllers together with new demo content at CES this week. The hardware is based on the haptic controllers used for Justice League: An IMAX VR Exclusive, which has operated at the Los Angeles IMAX VR Centre since November 2017, but adds the ability for the controllers to be mated together in common interaction configurations, such as ‘gamepad’, ‘steering wheel’, or ‘machine gun’ poses.

Image courtesy Tactical Haptics

As described in the press release provided to Road to VR, the mechanical sockets (which appear to be fitted with magnets) “provide a mate-point … to form a semi-rigid coupling between the controllers that allows the users to effortlessly maintain the mated poses.” The images are shown with Oculus Touch controllers for tracking purposes, but they also have mounts for Vive Trackers.

Image courtesy Tactical Haptics

Colony Defense, a new game developed by Tactical Haptics to demonstrate the hardware, is a first-person experience with building and combat elements. The player is asked to join the two controllers to create a ‘physics gun’, then separate them to operate a jet pack in one hand and a weapon in the other, with the option to combine the controllers in the ‘machine gun’ configuration to operate a heavier blaster. The company says that “significant effort” was put into optimising the placement of the sockets to result in ergonomic poses and to aid on-the-fly reconfiguration while wearing a headset.

Image courtesy Tactical Haptics

A new “brick breaker” style game called Cyber Smash is also at the show, which the company says demonstrates “feeling the inertia of throwing smash-balls and settling of the ball after it rebounds and is caught by the player.” As highlighted by the IMAX VR Centre partnership, the company is currently focused on location-based entertainment, and is working on multiplayer versions of both games for this purpose. It is seeking partner opportunities with high-profile LBEs while at CES this week.

Both games make use of Tactical Haptics’ core innovation: an advanced haptic feedback technology called Reactive Grip, showcased in various prototype controllers since 2013. Actuated plates in the controller’s handle apply friction and shear forces in the hand, creating various tactile illusions such as inertia and elasticity.

Stay tuned to Road to VR for further coverage of CES 2018, including a hands-on with the new Tactical Haptics controller prototype.

‘Tribe VR DJ School’ Trains You to Mix Music on Real Equipment

Partnering with DJ Kyroman and music school Pyramind, Tribe VR’s DJ School aims to teach the art of live mixing with real DJ equipment modelled in VR, with the goal of allowing your virtually acquired skills and knowledge to transfer to real-life mixing equipment. The app launched in Early Access on the Oculus Store in December.

San Francisco-based VR development startup Tribe VR is concentrating on immersive learning applications for virtual and augmented reality to enable users to learn real-world skills. Tribe VR DJ School is their first project, a VR application currently optimised for Oculus Rift and Touch. It was recently showcased on the official Oculus blog alongside live performance platform NOYS VR (Early Access, 2017) and interactive music video Show It 2 Me (2017) as three examples of immersive music experiences created for VR.

In its current form, the user is presented with two digital decks and a mixer based on high-end Pioneer DJ equipment, and the basics of operating the mixer, such as adjusting equalisers and crossfading, are explained by a virtual ‘Mentor’. For now, the features are limited: the single ‘lesson’ only teaches you to play around with two preloaded tracks that are already synchronised. The ‘free play’ mode allows a little more room to experiment with sounds, but the app is missing the crucial feature of importing your own music.

Image courtesy Tribe VR

Vinyl Reality (2017), another Early Access VR DJ app on Steam, appears to be further along in terms of features, as it allows music import, but it is focused on simulating mixing with traditional turntables. Tribe VR DJ School, as the name implies, leans heavily towards teaching, and the developer plans to implement “DJ masterclasses” and “extensive lesson content” over the coming weeks.

This is highlighted by Tribe VR’s partnership with leading San Francisco music production school Pyramind. According to the Tribe VR blog, the team is working together with Pyramind to “develop course content, music tools and services.”

“We see VR and AR as the next steps in improving the way people learn and create music,” says Gregory Gordon, Pyramind CEO and Founder. “We are excited to be working with Tribe to develop methods and approaches for people to learn immersively.”

“We are delighted to be working with Pyramind,” writes Tom Impallomeni, Co-Founder and CEO of Tribe VR. “Greg has built an amazing business and their deep knowledge of all things relating to Music Production and DJing is a massive help to us in our quest to improve the way people learn.”

DJ School is just one example of an immersive learning experience; Tribe VR seems to have ambitions for further learning-focused VR and AR projects.

‘MuX’ Lets You Build Wild Musical Instruments from Scratch, Now in Early Access

Described by Danish developer Decochon as a “revolutionary music sandbox for VR”, MuX features virtual, low-level synth components that can be connected together and adjusted to generate unique sounds. The various tools allow the creation of complex electronic instruments, which can be easily shared with the community.

Now in open beta via Steam Early Access, MuX is an intriguing addition to the growing library of creative audio software for VR. Currently, only HTC Vive hardware is officially supported, but Rift users have reported success operating in SteamVR mode, albeit with incorrectly-shaped virtual controllers.

Presented with a number of virtual tools in a room-scale space, the user is able to construct all manner of sound generators, using a fundamental oscillator component combined with various modifiers. Most components feature adjustable dials for fine tuning, and serve as the building blocks for the creation of potentially enormous virtual instruments. These can be played manually with motion control, or triggered by buttons, switches, or metronomes. Alternatively, a marble spawning system can be used, allowing the construction of Rube Goldberg-type music machines that trigger sounds as a result of the physics simulation, as shown in the video below:

MuX’s inviting visual presentation, with clean, flat-shaded geometry and a muted colour palette, disguises its complexity, as the modular components allow low-level access to the fundamentals of sound synthesis. Currently, a set of somewhat outdated tutorials (created for alpha testers) can be found on Decochon’s YouTube channel, but this is an area that needs serious attention if MuX is to become accessible to a wide audience.
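The building-block approach MuX exposes — a fundamental oscillator feeding into chained modifiers — is the standard model of modular sound synthesis. As a rough illustration only (this is not MuX’s actual implementation; every function name and parameter here is invented for the sketch), an oscillator with two modifiers might look like:

```python
import math

SAMPLE_RATE = 44100  # samples per second

def oscillator(freq, duration):
    """Fundamental component: a sine oscillator rendered to a list of samples."""
    n = int(SAMPLE_RATE * duration)
    return [math.sin(2 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]

def gain(samples, amount):
    """Modifier: scale amplitude, like turning a dial on a component."""
    return [s * amount for s in samples]

def envelope(samples, attack=0.01):
    """Modifier: linear fade-in so a triggered note doesn't click."""
    ramp = max(1, int(SAMPLE_RATE * attack))
    return [s * min(1.0, i / ramp) for i, s in enumerate(samples)]

# Wire components together: oscillator -> gain -> envelope
tone = envelope(gain(oscillator(440.0, 0.5), 0.8))
```

MuX’s contribution is making this kind of signal graph something you wire up with your hands in room-scale space, rather than in code or a patch editor.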

As explained on the Steam store page, the software is due to remain in Early Access for a year, as the experimental nature of the tools means that users are likely to do unpredictable things. “While developing and testing MuX, we found people using it in ways we hadn’t expected,” writes Decochon. “They also made music and sound that surprised us, things we couldn’t have made ourselves. As we continue to develop and expand MuX, we find Early Access an opportunity to become informed and inspired by what others might create. MuX is an instrument, ready to be played, explored, and enjoyed by others than just us.”

Snapchat Launches Lens Studio for Making AR ‘Lenses’

Lens Studio is a new Mac and Windows desktop app from Snap Inc. for creating, publishing, and sharing augmented reality objects and experiences in Snapchat, as overlays known as ‘Lenses’. The free software has several templates for getting started, and advanced tools for experienced users and professionals.

Image messaging app Snapchat first incorporated camera-based AR with the ‘Lenses’ feature in 2015, which enabled dramatically enhanced selfies using facial tracking. A major update in April 2017 expanded this to the environment in the form of ‘World Lenses’, making use of smartphones’ rear-facing cameras to place virtual objects into live, real-world video capture. This form of smartphone-based AR has been bolstered by the introduction of Apple’s ARKit and Google’s ARCore SDKs for iOS and Android, respectively.

Snapchat World Lens content was previously limited to Snap Inc’s own updates and promotional tie-ins, but with Lens Studio it is now possible for anyone to start creating their own World Lens objects and environments (selfie Lens creation is not currently supported), as explained in the announcement on the official site.

“Whether you’re just starting to dabble in 2D animation or are a professional artist interested in creating your own experiences, Lens Studio makes sharing your creation with the world fast and fun!” writes Team Snap. “With the launch of Lens Studio, we’re excited to make Lenses more accessible to creators, and experiences within Snapchat more personal and diverse.”

The new Lens Studio site provides an overview of the software’s capabilities, as well as thorough guides and templates for getting started, all the way through to optimising and submitting your Lens for approval.

Lens Studio supports FBX and OBJ 3D model formats; the guide suggests those without access to modelling software like Maya or 3ds Max can grab assets from Sketchfab or Poly. For more experienced creators, it also explains the advanced scripting features enabled by the API. Once submitted, a unique Snapcode can be shared (digitally or printed on packaging, stickers, clothing, etc.) for anyone to unlock your Lens in Snapchat.

This is another indication that Snap Inc. intends to be a global force in augmented reality, but it has strong competition, as Facebook also just launched its AR Studio in open beta. Last year, Snap Inc. hired Hollywood effects artist Raffael Dickreuter as a “Concept and Augmented Reality Designer”, and CEO Evan Spiegel was seen wearing ‘prototype AR glasses’ soon after the company acquired Vergence Labs. The resulting Spectacles glasses, launched in November 2016, were seen as a first step towards augmented reality glasses, though they function only as a wearable camera for now.

‘Raw Data’ Studio Survios Announces ‘Electronauts’ for Jamming Out in VR

Electronauts is a music creation tool built within an “interactive sonic environment” where users can make, remix, and perform music, regardless of skill level. This is the third VR project from California-based developer Survios, creators of VR action games Raw Data (2017) and the upcoming Sprint Vector.

Scheduled for release in 2018, Electronauts is a radical departure from Survios’ previous work. Described as “the next generation of music creation”, this built-for-VR software aims to give the user the ability to produce slick beats “without needing any prior musical knowledge, skill, or experience.” The title is planned to support SteamVR, HTC Vive, PSVR, and Oculus Rift.

Image courtesy Survios

As the music can be manipulated in real time, it has strong potential as a live performance tool, allowing users to blend and shape tracks like a DJ. It also features “uniquely designed electronic instruments” that can be played directly, and functions as an elaborate music visualiser. This is achieved with the studio’s ‘Music Reality Engine’, a form of sequencer with non-traditional features optimised for VR.
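Survios hasn’t detailed how the Music Reality Engine works, but the core idea of any step sequencer — loop over a grid of steps and trigger whatever is armed on the current step — fits in a few lines. This sketch is a generic illustration, not Survios’ engine; the pattern grid and instrument names are invented:

```python
# One bar of eight steps; a 1 arms that instrument on that step.
PATTERN = {
    "kick":  [1, 0, 0, 0, 1, 0, 0, 0],
    "snare": [0, 0, 1, 0, 0, 0, 1, 0],
    "hat":   [1, 1, 1, 1, 1, 1, 1, 1],
}

def sequence(pattern, bars=1):
    """Yield (step_index, instruments_to_trigger) for each step of the loop."""
    steps = len(next(iter(pattern.values())))
    for step in range(steps * bars):
        i = step % steps
        yield i, [name for name, row in pattern.items() if row[i]]
```

A real engine runs this against a clock at the track’s tempo and lets the performer rearm steps mid-playback — which is essentially what blending and shaping tracks live amounts to under the hood.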

According to a hands-on from Inverse, the player stands in front of three virtual DJ tables, and the motion controllers become virtual drumsticks. The sticks can be used to hit drum pads directly, but also function as pointing devices to interact with the sequencing tools. “You can interact with the music in a myriad of ways: play instruments, record loops and sequences, layer filters, rearrange pieces, and straight-up compose new parts of the songs,” writes Corey Plante for Inverse. “You can even change the speed at which you slide through the virtual world or adjust the color scheme on the fly.”

Image courtesy Survios

Music creation isn’t a totally new concept for VR. Indeed, Inverse compares Electronauts to existing VR visualiser Fantasynth: Chez Nous (2017), and to VR musical sandboxes EXA: The Infinite Instrument (Early Access, 2017) and SoundStage (2017). However, Survios brings a sizeable, well-funded studio to the genre, and is using that weight to attract established musicians and producers as collaborators.

Grammy award-winning music producers Stargate are already on board, with more high-profile collaborations to come, the company says. According to the announcement on Survios’ main site, Stargate will help them “incorporate the music of A-class artists, producers, and DJs” into the software, allowing users to “remix songs and engage with these top artists’ music in a brand new way.”

Image courtesy Survios

Applications for the upcoming closed beta can be submitted on the official Electronauts website; the company hopes the beta will encourage musicians to produce music that could be featured in the full release.

“Never before have you been able to create music and interact with sound in a truly immersive way. This opens up for a totally new level of creative freedom and will inspire both seasoned artist and musicians as well as people with no musical training. Anyone can instantly create magic,” says Stargate’s Mikkel Eriksen. “The fact that you can play with material from today’s most talented artists makes it even more compelling. I truly believe Electronauts is groundbreaking, and a game changer in music creation.”

VR Short Film ‘Asteroids!’ from Baobab Studios Now Available

Interactive VR short film Asteroids!, the follow-up to the Emmy award-winning VR short Invasion! (2016), is now available for Gear VR, Daydream, and Windows MR headsets. From the director of the Madagascar franchise, the full 11-minute cut continues the journey of aliens Mac and Cheez (and robot sidekick Peas) on their deep space mission.

Baobab Studios announced Asteroids! at Oculus Connect 3 in 2016, and has since made an interactive preview of the animation available on Gear VR and Daydream platforms. A 360 video ‘sneak peek’ has also been available in various forms, including the iOS and Android Baobab app.

Today, the full version of Asteroids! is available for free across several platforms. The best experience can be found on Gear VR, Daydream, and Windows Mixed Reality devices, where the animation is rendered in real-time 3D. Sadly, unlike their first short, Asteroids! is not currently available on PSVR, HTC Vive, or Oculus Rift, despite its considerably more interactive design.

Invasion! was the team’s first foray into VR animation, and while it was a polished experience, it ended all too quickly. It was later extended slightly with an intro narrated by Ethan Hawke, but this felt like an afterthought, and didn’t offer the viewer what was really needed: more time with the characters. Asteroids! is a major improvement: longer, with more complex animation, and now interactive.

“Different storytelling mediums all have a common goal—to tell great stories through characters that audiences connect with, care about, and maybe even come to love,” Baobab writes on their official site. “The great challenge and great potential for VR storytelling is not simply to achieve this goal, but also, to do it in a way that actually lets the viewer become a part of the story. With Asteroids!, this is a step towards achieving this goal.”

Follow these links to download the Gear VR, Daydream, and Windows Mixed Reality real-time versions. The non-interactive 360 video version of the full short can be viewed via YouTube 360 and Facebook 360, or via the Baobab app for iOS and Android (which also supports the Cardboard VR viewer), but for the reasons above, it is strongly recommended to experience one of the real-time versions first.

Baobab Studios is one of the largest independent VR film studios, having raised a total of $31 million in funding to date, most notably a $25 million Series B funding round in 2016 which also welcomed Larry Cutler (ex Dreamworks and Pixar) as CTO. Their current project, Rainbow Crow, an adaptation of a Lenape Native American tale, is their most ambitious, presented in several chapters, and features musician John Legend.

Arcade Boxer ‘Knockout League’ to Land on PSVR February 13th

Sony has finally released word on when PSVR owners will be able to get their hands on the popular VR boxing title Knockout League; it’s officially coming to PlayStation VR February 13th. The game has been in Early Access on SteamVR, Viveport, and the Oculus Store for almost a year. It’s uncertain if the PSVR release will mark the game’s exit from Early Access on the platforms listed above.

Update (02/11/18): According to a recent PlayStation blog post, ‘Knockout League’ will be landing on PSVR headsets February 13th. It comes alongside PSVR titles ‘Drunk n Barfight’, ‘Sprint Vector’, ‘Pop Up Pilgrims’, and ‘CubeWorks’.

Original article (12/13/17): Announced at PSX 2017, the full version of Knockout League, with its nine opponents and training modes, will make the transition to Sony’s VR platform. Played from a first-person perspective, with motion controllers enabling 1:1 fist tracking, boxing is a natural fit for the current generation of VR hardware. Knockout League differentiates itself from other VR boxing games with an arcade presentation reminiscent of Nintendo’s classic Punch-Out!! series, and opponents with wildly different personalities and fighting styles.

As we described in our Early Access review of the PC version, the gameplay is rather strenuous, requiring rapid, energetic movements to succeed, meaning Knockout League will likely be an effective workout title on PSVR too. The Virtual Reality Institute of Health and Exercise currently rates it as equivalent to rowing in terms of calories burned.

Developer Grab Games specialises in AR/VR experiences, having created John Wick Chronicles (2017), Knockout League, and other mobile apps. Their current major project is the ‘Grab AR Tabletop Platform’, which was showcased at Google’s hardware event in October.

10 Minutes of ‘Blood and Truth’ Gameplay Footage Revealed at PSX 2017

SIE London Studio’s upcoming PSVR title Blood and Truth was playable on the showfloor at PSX 2017. The footage captured during an interview with PlayStation Blog’s Sid Shuman at the show is the clearest look at the gameplay sequence we enjoyed during our hands-on in October.

Expanding on the tone and concepts introduced in the acclaimed ‘London Heist’ experience from their first PSVR title PlayStation VR Worlds (2016), London Studio is aiming to deliver a triple-A, first person action game in Blood and Truth. The team haven’t committed to a release window, but the game is likely to appear in 2018.

In the footage, you see their new node-based locomotion mechanic in full flow, along with situational moments of manual movement using motion controls, such as climbing a ladder, or shuffling along an air duct. With no teleportation system in place, players are able to stay engaged in the firefight as they slide to the next node.

“We’re doing a lot of movement. A lot of VR games are doing teleportation and that is not right for us,” said developer Mike Hampden. “We wanted a grounded experience; you move like a soldier, tactically from point to point. You can strafe between points as well – it gives you a lot of control.”
