HBO VR events and concerts start today on VRChat

“Garden of Eden,” the first in a series of three live VR art explorations, concerts, and shows organized by HBO, kicks off today at 10 p.m. Eastern time on VRChat, a social VR platform.

Those without a VR headset can watch the “Garden of Eden” livestream on YouTube, where they will still be able to interact in real time and shape the experience, including by solving riddles.

During today’s event, guests and participants will explore Afrofuturistic art installations created by David Alabo, Devan Shimoyama, and Adeyemi Adegbesan.

The artists’ computer-generated Afrofuturistic works draw heavily on fantasy, explore African beauty, and combine Afrocentric elements with surrealism to highlight African culture and the diaspora.

Afrofuturism art. (Image courtesy David Alabo.)

The second event in the series, a live poetry performance called An American Dream, inspired by the words of James Baldwin and performed by Jurnee Smollett, will be held at 10 p.m. Eastern time on Thursday, September 24. The third event, Music of the Cosmos, a music performance, will take place at 10 p.m. Eastern time on Monday, October 19.

A total of 100 selected guests, including artists and actors, will participate in the three events using Oculus Quest headsets. Actors from the series who will perform at the three shows and concerts include Jonathan Majors, Courtney Vance, Michael Kenneth Williams, and Jurnee Smollett.

Unity Teams With Magic Leap, Reveals Technical Preview

Hot on the heels of the announcement that rival Unreal Engine will be supporting Magic Leap, revealed as Magic Leap released the developer Software Development Kit (SDK) for the Magic Leap One, Unity has confirmed that it too will support the much-discussed augmented reality (AR) headset.

Discussing the matter on the official Unity Blog, the developers revealed that they were launching both the Unity Technical Preview and the Lumin SDK on Magic Leap’s own Creator Portal. They also confirmed that several big names from Magic Leap’s collection of Early Access partners were already working to bring applications to the Magic Leap One using the Unity engine. Those names include Weta Workshop and The Mill, which this time last year was revealing its role in a project with Chevrolet and Unreal Engine to produce real-time rendering.

Magic Leap features included in the Technical Preview (by way of Lumin OS) are:

  • Instanced Single Pass Stereo Rendering
  • World Reconstruction, such as world meshing, semantic labelling of floors, ceilings, and walls, and ray casting to retrieve intersection points with the world’s depth data
  • Physical World Occlusion Culling
  • Eye tracking through fixation point position based on where the user is looking
  • Support for the Control, including 6DOF tracking, trackpad, and lighting
  • Audio spatialisation providing fine control over the response of the audio based on the user’s movement and the audio source’s position (see the minimal sketch after this list)
  • Recognition of the user’s hand poses (gestures) and tracking of identifiable points on the hands, such as the tips of the index fingers
  • Tracking of the position and orientation of specified static image targets in the user’s environment
  • Zero Iteration with Magic Leap Remote (More details below under “Get Started With the Device Simulator”)
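
To make the spatialisation bullet concrete, below is a minimal, self-contained sketch of distance-based attenuation, the core of any spatial audio response to listener movement. The inverse-distance rolloff and every name here are illustrative assumptions, not the Lumin SDK’s actual API.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Hypothetical stand-in types for illustration; the real SDK exposes its own.
struct Vec3 { float x, y, z; };

float distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Inverse-distance rolloff: gain falls off as the listener moves away,
// clamped so a source never disappears entirely.
float spatialGain(const Vec3& listener, const Vec3& source,
                  float refDist = 1.0f, float minGain = 0.05f) {
    float d = distance(listener, source);
    return std::max(refDist / std::max(d, refDist), minGain);
}

int main() {
    Vec3 listener{0.0f, 0.0f, 0.0f};
    Vec3 source{3.0f, 0.0f, 4.0f};  // 5 m away from the listener
    std::printf("gain = %.2f\n", spatialGain(listener, source));  // gain = 0.20
}
```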

“As we look towards the future, 3D digital content will be the way we interact with the world,” Unity stated on its blog. “Unity believes the world is a better place with more creators in it, and platforms such as Magic Leap will unleash new forms of creativity which we can’t wait to see.”

VRFocus will be bringing you more news from this year’s Game Developers Conference throughout the week and will update you on further developments with the much-anticipated Magic Leap One in the near future.


Clark Launches Roomscale VR Music Video on Steam

The Mill has announced a new release that will see HTC Vive’s roomscale technology supported in a 360-degree music video for the first time. Partnering with Clark of Warp Records, a new virtual reality (VR) music video set to the British electro artist’s track Hoova is now available for free via Steam.


The Hoova VR Experience explores a dark, dystopian future and investigates themes of isolation, dependency and identity in VR. One of the first music video experiences of its kind, using the new medium for both immersive and interactive content, it gives the viewer a chance to engage with the story while hearing the track within a 360-degree spatial audio environment.

Directed by The Mill’s Chief Creative Officer Angus Kneale and Executive Creative Director Ben Smith, the Hoova VR Experience is also unique in its advanced use of videogame engine-manipulated human characters. The Hoova VR Experience takes an eerie look into a virtual world, wherein each character is part of a frightening cycle. One by one, the characters become consumed and trapped by the VR experience they are witnessing.

The experience was developed in Unreal Engine 4 by The Mill’s VFX team, who were led by Realtime Supervisor Joji Tsuruga. The characters were created using motion capture of live action talent, carefully choreographed and animated to be accurate to the human form down to a single finger.

London-based The Mill has previously worked on numerous VR experiences for various clients, including Chevrolet, and was acquired by Technicolor in 2016. Other VR music works include a 360-degree video featuring beatboxing phenomenon Reeps One.


The Clark Hoova VR Experience is now available to download from Steam, compatible with HTC Vive, for free. The experience also supports, rather oddly, the second iteration of the Oculus Rift development kit, the Oculus Rift DK2. It’s not yet clear how the roomscale experience has been adapted to this older hardware.

You can view a trailer for the VR edition of the music video here, and VRFocus will keep you updated with all the latest innovations in VR music videos.

Facebook Unveils Two New Volumetric Video ‘Surround360’ Cameras, Coming Later this Year

Facebook today announced two new additions to the Surround360 hardware initiative that are poised to make 360 video more immersive. Unveiled at the company’s yearly developer conference, F8, the so-called x24 and x6 cameras are said to capture 360 video with depth information, giving captured video six degrees of freedom (6DoF). This means that, in addition to rotating your view (pitch, yaw and roll) as in conventional 360 video, you can now move your vantage point up/down, left/right and forwards/backwards while in a 360 video.

Even the best stereoscopic 360 videos can’t currently provide this sort of movement, so the possibility of small, robust cameras that can is pretty exciting. Let’s face it: when you’re used to engaging with the digital world through the immersive, positionally tracked Oculus Rift, HTC Vive, or PSVR, you really notice when it’s gone. Check out the gif below to see exactly what that means.

The original Surround360 was announced at last year’s F8 as an open-source hardware platform and rendering pipeline for 3D 360 VR video that anyone could construct or iterate on, but Facebook is taking its new Surround360 reference designs in a different direction. While Facebook doesn’t plan on selling the 360 6DoF cameras directly, the company will be licensing the x24 and x6 designs—named to indicate the number of on-board sensors—to a select number of commercial partners. Facebook says a product should emerge sometime later this year.

The rigs are smaller than the original Surround360, now dubbed the Surround360 ‘Open Edition’, and critically are far more compact than volumetric capture setups such as HypeVR’s unwieldy high-end camera/LiDAR rig.

Specs are still thin on the ground, but the x24 appears to be around 10 inches in diameter (257mm at its widest, 252mm at its thinnest), and is said to capture full RGB and depth at every pixel in each of the 24 cameras. It is also said to oversample 4x at every point in full 360, providing “best in-class image quality and full-resolution 6DoF point clouds.”

The x6, although not specified, looks to be about half the diameter at 5 inches, and is said to oversample by 3x. No pricing info has been made public for either camera.

Facebook says depth information is captured for every frame in the video, and because the output is 3D, the video can be fed into existing visual effects (VFX) software tools to create a mashup of live-action capture and computer-generated imagery (CGI). Take a look at the gif below for an idea of what’s possible.
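To see why per-pixel depth is what unlocks 6DoF, here is a minimal sketch that lifts one equirectangular 360 pixel into a 3D point; applied to every pixel, this yields the point cloud that a new vantage point can be rendered from. The (u, v)-to-direction mapping is the standard equirectangular projection; Facebook’s actual pipeline has not been published.

```cpp
#include <cmath>
#include <cstdio>

struct Point3 { float x, y, z; };

// u, v in [0, 1): normalized coordinates in the equirectangular frame.
// depth: metric distance along that pixel's viewing ray.
Point3 unproject(float u, float v, float depth) {
    const float kPi = 3.14159265358979f;
    float lon = (u - 0.5f) * 2.0f * kPi;  // longitude: -pi .. pi
    float lat = (0.5f - v) * kPi;         // latitude:  -pi/2 .. pi/2
    return { depth * std::cos(lat) * std::sin(lon),
             depth * std::sin(lat),
             depth * std::cos(lat) * std::cos(lon) };
}

int main() {
    // The centre pixel at 2 m depth maps straight ahead along +z.
    Point3 p = unproject(0.5f, 0.5f, 2.0f);
    std::printf("(%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);  // (0.00, 0.00, 2.00)
}
```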

Creating good-looking 6DoF 360 video is still an imperfect process though, so Facebook is also partnering with a number of post-production companies and VFX studios to help build out workflows and toolchains. Adobe, Otoy, Foundry, Mettle, DXO, Here Be Dragons, Framestore, Magnopus, and The Mill are all working with Facebook in some capacity.

“We’ve designed with Facebook an amazing cloud rendering and publishing solution to make x24’s interactive volumetric video within reach for all,” said Jules Urbach, founder and CEO of Otoy. “Our ORBX ecosystem opens up 28 different authoring and editing tools and interactive light field streaming across all major platforms and browsers. It’s a simple and powerful solution this game-changing camera deserves.”

Keep an eye on this article, as we’ll be updating information as it comes in.


Behind The Scenes on ‘The Human Race’ Chevrolet Short Film

At the Epic Games keynote at GDC, car manufacturer Chevrolet and visual effects (VFX) studio The Mill took to the stage to show off ‘The Human Race’ – a short film featuring the Chevy Camaro ZL1 and the Chevrolet FNR self-driving concept car.

Typically in movies, adverts and TV shows, if a car needs to be used and a real car is impractical or unavailable – for example, for a dangerous stunt, or when a brand-new car is wrapped in secrecy – the job is turned over to the VFX department. CGI artists must then spend days to weeks of post-production designing, building and refining the effects needed to make the car look convincing to the audience. This chews up a great deal of processing power and man-hours, and is usually very expensive.


London-based VFX firm The Mill came up with a solution. The Mill Blackbird is a fully customisable car rig that can be configured to mimic the length, width, physics and handling of a large variety of vehicles. It comes equipped with a specialised 360° camera called The Mill Cyclops, as well as laser tracking that accurately maps the world around it and relays that data back to its own specialised AR application, allowing real and virtual objects to be flawlessly integrated.
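The essence of that integration step is a tracked camera pose plus a projection: once the rig knows where the camera is relative to the Blackbird, any point on the virtual car body can be mapped to a pixel in the filmed frame. Below is a minimal pinhole-camera sketch of the idea; the pose format, focal length and all names are illustrative assumptions, not The Mill’s actual pipeline.

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Tracked camera pose mapping world space to camera space:
// row-major rotation matrix R and translation t.
struct Pose { float R[9]; Vec3 t; };

// Pinhole projection of a world-space point to pixel coordinates,
// with focal length f (in pixels) and principal point (cx, cy).
bool project(const Pose& cam, const Vec3& p, float f, float cx, float cy,
             float& px, float& py) {
    float x = cam.R[0] * p.x + cam.R[1] * p.y + cam.R[2] * p.z + cam.t.x;
    float y = cam.R[3] * p.x + cam.R[4] * p.y + cam.R[5] * p.z + cam.t.y;
    float z = cam.R[6] * p.x + cam.R[7] * p.y + cam.R[8] * p.z + cam.t.z;
    if (z <= 0.0f) return false;  // point is behind the camera
    px = cx + f * x / z;
    py = cy + f * y / z;
    return true;
}

int main() {
    // Identity rotation, virtual car 5 m ahead of a 1920x1080 camera.
    Pose cam{{1, 0, 0, 0, 1, 0, 0, 0, 1}, {0.0f, 0.0f, 5.0f}};
    float px, py;
    if (project(cam, {0.5f, 0.2f, 0.0f}, 1000.0f, 960.0f, 540.0f, px, py))
        std::printf("draw at pixel (%.0f, %.0f)\n", px, py);  // (1060, 580)
}
```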

This enables the team at The Mill to place the Blackbird into any environment for filming and, in real time, add the skin of whatever vehicle they choose over the top. They can change the colour on the fly, while lighting, reflections and textures are all seamlessly recalculated in real time and displayed on the tablet app, where they can be tweaked and altered by the user.
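Those live reflections rest on a standard image-based-lighting trick: reflect the view direction off the car’s surface normal, then look that direction up in the captured 360° panorama. The sketch below shows the core lookup; it is the textbook technique under assumed conventions, not The Mill’s shader code.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// r = i - 2 (i . n) n, with the surface normal n assumed unit-length.
Vec3 reflect(const Vec3& i, const Vec3& n) {
    float d = i.x * n.x + i.y * n.y + i.z * n.z;
    return { i.x - 2.0f * d * n.x, i.y - 2.0f * d * n.y, i.z - 2.0f * d * n.z };
}

// Map a unit direction to (u, v) texture coordinates in an
// equirectangular panorama.
void dirToEquirect(const Vec3& dir, float& u, float& v) {
    const float kPi = 3.14159265358979f;
    u = 0.5f + std::atan2(dir.x, dir.z) / (2.0f * kPi);
    v = 0.5f - std::asin(dir.y) / kPi;
}

int main() {
    Vec3 view{0.0f, -0.7071f, 0.7071f};  // camera ray angled 45 degrees down
    Vec3 normal{0.0f, 1.0f, 0.0f};       // upward-facing body panel
    Vec3 r = reflect(view, normal);      // bounces up towards the sky
    float u, v;
    dirToEquirect(r, u, v);
    std::printf("sample panorama at u=%.2f, v=%.2f\n", u, v);  // u=0.50, v=0.25
}
```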


This technology was used to create ‘The Human Race’, an exciting short film depicting a race between a human driver and an AI-controlled car. The film forms the basis of a multi-platform campaign marking the 30th anniversary of the Camaro.

You can watch The Human Race and its behind-the-scenes making-of mini-documentary below. VRFocus will keep you updated with further developments in AR and other related subjects.

The Mill, Chevrolet and Unreal Engine Combine To Create New AR Advertising

Chevrolet announced today, on stage at the Epic Games keynote at GDC, a real-time rendered advertisement created by visual effects (VFX) company The Mill, using Unreal Engine and The Mill’s dynamic Blackbird car rig in combination with the new augmented reality (AR) platform Mill Cyclops.

Blackbird is a fully adjustable car rig that can be used in place of a real car in filming situations where it is not possible, for whatever reason, to use a real car.

Cyclops is a new virtual production toolkit that allows you to see CGI objects live on location, tracked in real time. In combination with the Blackbird, it is possible to use laser tracking and the four cameras on top of the Blackbird to produce an AR image of a Chevrolet Camaro ZL1 that displays in real time, including live reflections.


The Mill are well known for their award-winning visual effects, particularly for the British sci-fi series Doctor Who. They have done previous work in the virtual reality (VR) and augmented reality (AR) arena, as previously reported by VRFocus.

The demo was performed using a Google Tango-enabled tablet-phone.


Alistair Thompson, International Executive Vice President at The Mill, had this to say: “Visual effects should never get in the way of stories; it should help tell stories. The future of filmmaking is realtime.”

VRFocus will continue to bring you news from the Epic Games keynote and the rest of GDC.

GDC 2017: Epic Games Teams With Chevrolet On New AR Car Project

Ever since Epic Games opened its Unreal Engine 4 technology to the world, new use cases have been coming in fast and furious. The latest example of just how far real-time video game engine technology has come will be on display at GDC 2017 this week. Epic Games partnered with visual effects and creative content studio The Mill to shoot a new Chevrolet video that utilizes UE4 technology and an augmented reality Blackbird motion tracking vehicle.

That electric car is the brainchild of The Mill. And before the top-secret vehicle was revealed, Epic Games CTO Kim Libreri got an early look under the hood.

“This high-tech car can run any performance envelope of any car, and it has all the electronic equipment that you need to be able to track it, so you know where it was relative to the camera car, and also generates the light and reflection information that you need to be able to light a computer-generated car,” Libreri said.

Libreri met with Vince Baertsoen, the head of R&D at The Mill, last year in Germany at FMX 2016. At the time, the one challenge Baertsoen had was that the director filming the car was still seeing the Blackbird and the transformation into whatever car they wanted it to become occurred in post production. The Holy Grail was for those shooting the sequence to see the final version of the vehicle in real-time.

At GDC, Epic is showcasing a 2017 Camaro ZL1 in a race against the Chevy FNR concept car, except the vehicles are actually photorealistic pixels running in real-time using UE4 technology. To prove the point, the ZL1 can be swapped out for a classic ’69 Camaro.

“Those cars are built like we would do a video game asset,” Libreri said. “Right now, it’s a specialized version of Unreal because we’ve just put the demo together, but these are features that are going to be available in regular Unreal. The only difference between this and a car that you would put in a video game is the amount of polygons in the car. We actually have a couple levels of detail to the car. The one that you see in the video is comprised of millions of polygons. We also have a low resolution version that would be a more normal game-level asset that would run on a PlayStation 4. The materials and lighting and most of the things you see in the video would run on a console in a more regular video game environment.”
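Libreri’s point about levels of detail is simple to state in code: keep several versions of the same mesh and pick one per frame based on how far away (or how large on screen) the object is. Here is a minimal sketch; the polygon counts and distance thresholds are illustrative guesses, not Epic’s numbers.

```cpp
#include <cstdio>
#include <vector>

// One renderable version of the car mesh.
struct LodLevel {
    const char* name;
    int polygons;
    float maxDistance;  // use this level while the camera is at most this far
};

// Walk from finest to coarsest and return the first level whose
// distance budget covers the current camera distance.
const LodLevel* selectLod(const std::vector<LodLevel>& lods, float distance) {
    for (const LodLevel& lod : lods)
        if (distance <= lod.maxDistance) return &lod;
    return &lods.back();  // fall back to the coarsest level
}

int main() {
    std::vector<LodLevel> carLods = {
        {"hero", 4000000, 15.0f},  // millions of polygons, as in the video
        {"game", 120000, 60.0f},   // console-class asset
        {"far", 15000, 1e9f},
    };
    const LodLevel* lod = selectLod(carLods, 40.0f);
    std::printf("use %s (%d polys)\n", lod->name, lod->polygons);  // use game
}
```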

These virtual vehicles were superimposed on top of the Blackbird during a live-action shoot on the Angeles Crest Highway in Los Angeles. The Blackbird uses professional-grade AR, filming 360 video from the center of the vehicle using Red Epic cameras. Everything around the vehicle is filmed as if it were panoramic 360 photography, and a spinning LiDAR scanner scans the surrounding environment.

“They take the output from these four cameras, stitch it into a panorama, and then beam it to Unreal Engine wirelessly,” Libreri explained. “And then we take that as lighting and reflection information that we can place on top of a car that they’ve tracked with a real-time tracking system developed by partner company Arraiy.”
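Once stitched, the panorama’s most direct use is as a light source. A minimal sketch of the idea: average the panorama’s pixels, weighting each row by the cosine of its latitude so the stretched poles of the equirectangular image don’t dominate, and you get a flat ambient lighting term. A real pipeline would build full irradiance and reflection maps; the names and weighting here are assumptions for illustration only.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

struct Rgb { float r, g, b; };

// Cosine-latitude weighted average of an equirectangular panorama.
// Rows near the poles cover less solid angle, so weight them down.
Rgb averageLight(const std::vector<Rgb>& img, int width, int height) {
    const float kPi = 3.14159265358979f;
    Rgb sum{0.0f, 0.0f, 0.0f};
    float wsum = 0.0f;
    for (int y = 0; y < height; ++y) {
        float lat = ((y + 0.5f) / height - 0.5f) * kPi;
        float w = std::cos(lat);
        for (int x = 0; x < width; ++x) {
            const Rgb& p = img[y * width + x];
            sum.r += w * p.r;
            sum.g += w * p.g;
            sum.b += w * p.b;
            wsum += w;
        }
    }
    return { sum.r / wsum, sum.g / wsum, sum.b / wsum };
}

int main() {
    // A dummy 8x4 panorama of uniform sky colour.
    std::vector<Rgb> panorama(8 * 4, Rgb{0.6f, 0.7f, 0.9f});
    Rgb a = averageLight(panorama, 8, 4);
    std::printf("ambient = (%.2f, %.2f, %.2f)\n", a.r, a.g, a.b);  // (0.60, 0.70, 0.90)
}
```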

Before joining Epic in 2014, Libreri spent 20 years working in the Hollywood visual effects industry at companies like Lucasfilm, Industrial Light & Magic, and Digital Domain. These vehicles are inserted into the live action compositing background plates, which are equivalent to the kinds of images ILM would use.

The director of the 60-second video is sitting inside a customized Mercedes ML that has a Russian Arm that can film the Blackbird from any angle. Inside the Mercedes, he can watch the UE4-generated vehicle in real-time and make filming adjustments on-the-fly. A PC running on a high-end consumer NVIDIA graphics card is set up inside of the Mercedes to transform the Blackbird into the Camaro vehicles.

“We’re using some pretty beefy hardware for the demo right now, but that hardware capability is going to be available in the cloud very, very shortly, so you’ll be able to run these kinds of graphics-on-demand projects from the cloud,” Libreri said.

In addition to handling the augmented reality, UE4 is also handling a lot of information simultaneously.

“Each of these shots is an individual shot like you would have in Premiere or Avid, where you can cut backwards and forwards, and trim, and add the audio tracks,” Libreri said. “It’s all running just like you were doing normal visual effects photography, but inside a game engine.”

The Mill officially revealed the Blackbird on stage at GDC during Epic’s keynote. And Chevy also used that event to debut the final version of the race, which offers a wow factor when the vehicles enter a tunnel and go all TRON-like to showcase the real-time visual effects UE4 opens up.

“At every GDC we like to do some project that not only blows people away and inspires them, but shows that together with a customer we take some of the best people on the planet using our technology and make our engine better,” Libreri said. “We do something that people thought was impossible, so that’s why we went to this next level.”

