It’s almost easy to take for granted nowadays that virtual reality (VR) has become a much more comfortable technology to enjoy, with developers having refined their experiences through tremendous amounts of experimentation. Yet it wasn’t that long ago that VR was the wild west of videogame development, possibly even more so when it came to 360-degree video content. Sony Interactive Entertainment (SIE) has been one company looking to push VR forward, whether through hardware like PlayStation VR or through software. Last year SIE released Joshua Bell VR Experience, featuring the acclaimed violinist. Creating the interactive video proved to be highly problematic, as Ian Bickerstaff explained in a recent session at the Game Developers Conference (GDC).
Bickerstaff is the Director of Immersive Video Technology at SIEE, and his team wanted to create a piece that showcased the benefits of positional tracking, especially when it came to audio. Now far more commonplace in modern VR development as an aid to immersion, technologies like spatial audio help cement viewers in whatever they’re watching or playing, letting them judge not only the direction of a sound but also its depth and distance.
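The article doesn’t detail SIE’s audio pipeline, but the basic idea behind positional audio can be illustrated with a minimal sketch: attenuate a mono source by its distance from the listener and pan it left/right by its horizontal angle. The function name, parameters, and the simple inverse-distance falloff here are all illustrative assumptions, not SIE’s actual implementation.

```python
import numpy as np

def spatialise(mono, source_pos, listener_pos, rolloff=1.0):
    """Crude positional-audio sketch (illustrative only): attenuate a
    mono signal by distance and pan it by the source's horizontal angle.
    Positions are (x, z) pairs on the horizontal plane."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    distance = max(np.hypot(dx, dz), 1e-6)
    gain = 1.0 / (1.0 + rolloff * distance)   # simple inverse falloff
    angle = np.arctan2(dx, dz)                # 0 = straight ahead
    pan = (np.sin(angle) + 1.0) / 2.0         # 0 = full left, 1 = full right
    left = mono * gain * np.sqrt(1.0 - pan)   # constant-power panning
    right = mono * gain * np.sqrt(pan)
    return left, right
```

A source straight ahead produces equal left/right levels; one off to the side collapses toward a single channel, which is roughly how a listener judges direction, while the gain falloff conveys distance.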
To do this, Bickerstaff employed the talents of Bell, a musician with a keen interest in technology who had been shown an early Project Morpheus prototype (PlayStation VR’s codename) and was keen to get involved.
The team created an advanced 360-degree video using a pair of Sony 4K Action Cameras to record the stereoscopic footage. The major issue with the setup was that the cameras had no genlock to keep them perfectly synchronised, recording the same content frame for frame. Through some trial and error, using a flashgun and some well-timed switching, the team managed to get both cameras in sync.
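The session didn’t go into how the flashgun footage was aligned, but the principle is straightforward: both cameras see the same flash, so finding the frame where each recording’s brightness spikes gives the offset between them. The sketch below, with assumed function names and threshold, shows the idea rather than SIE’s actual tooling.

```python
import numpy as np

def find_flash_frame(luminance, threshold=3.0):
    """Return the index of the first frame whose mean luminance jumps
    more than `threshold` standard deviations above the overall mean,
    or -1 if no such spike exists."""
    lum = np.asarray(luminance, dtype=float)
    if lum.std() == 0:
        return -1
    z = (lum - lum.mean()) / lum.std()
    spikes = np.nonzero(z > threshold)[0]
    return int(spikes[0]) if spikes.size else -1

def sync_offset(lum_a, lum_b):
    """Frame offset between two cameras that both recorded the same flash."""
    return find_flash_frame(lum_a) - find_flash_frame(lum_b)
```

Once the offset is known, trimming that many frames from the lead camera leaves both recordings aligned, standing in for the genlock the hardware lacked.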
Once that process had finished, the team filmed the recording studio at various light levels (all the video lights on, then off, then at night), eventually merging the passes for the best effect. Then it was on to recording Bell’s performance, followed by that of classical pianist Sam Haywood.
Having created the first version, Bickerstaff found that while the experience was immersive it gave people headaches, a common complaint in early VR content as developers were still learning what worked and what didn’t. The culprit was movement: what the eyes see but the other senses don’t feel.
To solve the problem, Bickerstaff and his team came up with a novel approach to 360-degree video, allowing the viewer a degree of freedom to move as if they were actually there. Although they initially thought they’d have to record everything again, Bickerstaff found they already had most of the information needed to solve the issue without having to ask the award-winning violinist back.
One of the major issues to sort out was occlusion. The original cameras sat at a fixed point, whereas the final experience would let viewers look around objects into areas that were never recorded. Luckily, thanks to the many shots captured both with and without the performers, that missing data was easier to reproduce.
What was a little more difficult was the performers themselves. Bell proved the easier of the two: the team used a film-industry technique called rotoscoping, employed when no green screen is available, meticulously cutting a performer out of the footage one frame at a time. Bell was in full view of the cameras, but Haywood wasn’t, with parts of him, especially his legs, hidden from view behind the camera; Bickerstaff had to recreate them using some similar shoes and his own legs. After cutting Bell out of the background and rebuilding the missing parts of Haywood, it was time to add the little details: shadows under the performers, light reflections in the glossy piano, all integral touches that you might not notice at first yet that ground the whole experience in reality.
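The final step of that workflow, layering the cut-out performers back over the separately captured empty-studio plates, is a standard alpha compositing operation. A minimal sketch (illustrative function names, not SIE’s pipeline) looks like this:

```python
import numpy as np

def composite(performer, matte, background):
    """Layer a rotoscoped performer over a separately captured
    background plate. `matte` is a per-pixel alpha in [0, 1]:
    1 where the performer was cut out, 0 where the plate shows."""
    a = matte[..., None]  # broadcast the alpha over the RGB channels
    return a * performer + (1.0 - a) * background
```

Because the studio was also filmed without the performers, any background revealed when the viewer moves can be filled from the clean plate rather than invented.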
Having been available for PlayStation VR for the best part of a year, it’s certainly one piece of content that owners should experience for themselves, especially as it’s free. During the GDC 2018 session Bickerstaff also revealed that work had begun on several other projects, without going into further detail. Once VRFocus learns more we’ll let you know.