Baahubali is one of India’s biggest films, and it is getting a sequel, Baahubali 2: The Conclusion, one of the most technologically sophisticated films ever made. Billed as the world’s first “trillion-pixel film,” it is also getting a virtual reality experience that debuted this week at the Tribeca Film Festival in New York.
The VR project, dubbed The Sword of Baahubali, shows that virtual reality is having an impact around the world, not just in the U.S. Baahubali 2: The Conclusion debuts worldwide on April 28, and it will be accompanied by a big VR campaign that promotes the film and tells a side story in a VR-animated universe.
Arka Media Works, which is making the film, is collaborating with Advanced Micro Devices and Amazon on the VR experience, and they are showing it this week at the Tribeca Immersive program.
Raja Koduri, senior vice president and chief architect of AMD’s Radeon Technologies Group, said in an interview with GamesBeat that the company started working on the project a couple of years ago after one of his visits to India. The filmmakers were re-creating an entire ancient kingdom across 200 acres of land for their film setting. Koduri got to see it, and he wanted AMD to get involved.
“I was blown away at how they re-created this kingdom,” Koduri said. “I thought we had to get it all into VR. We had to re-create it in a game engine to do that, because the sets were created for a film.”
The film’s computer-generated effects are being created on workstations with AMD’s Radeon graphics chips. AMD’s teams also had to create content that could run in real time on high-end VR hardware.
“It really looks photorealistic, and that’s what we wanted to achieve in a real-time VR experience,” Koduri said.
The companies are also rolling out VR installations, or location-based entertainment, at lots of cinemas across India using HTC Vive VR headsets. Koduri said he hopes the VR experience will drive more interest in the film, which is already very popular. The trailer for the film already has more than 100 million views, and it’s one of the most viewed trailers of all time.
In the VR experience, fans can ride horses, fight battles, and engage with Baahubali, making this the first full VR experience for any film in India. Fans see the medieval kingdom depicted in the film. Two characters (who are not in the film) greet you, and together you have to find the Sword of Baahubali, which the film’s hero needs to be victorious. You get to witness the king’s court, and you have to deal with the attack of a big elephant. The VR experience lasts around 10 minutes.
“In the final sequence of the VR experience, you’ll come face to face with the hero of the film,” Koduri said. “I think that location-based VR will drive interest. For consumers, it’s a five-year journey for us to make VR ubiquitous.”
At SXSW’s tech, film, and gaming exhibits, VR and 360-degree media will leave a major imprint. From promotional work for different brands to VR roller coasters, you’ll be hard-pressed not to run into some sort of immersive content. Graphics chip giant AMD will be represented in many of the sessions taking place, and it will have new technology on display that could change journalism in a big way.
Journalists have started taking advantage of 360-degree media, even going so far as to dedicate entire shows to the format. Cameras are becoming more accessible and powerful, but there are still strides to be made in the production, editing, and streaming of such content. AMD is trying to take a step forward with its Radeon Loom 360 stitching technology, which reduces the complexity of combining footage from the many cameras a 360-degree video requires and enables stitching in real time. UploadVR discussed the new technology over email with AMD’s head of software and VR marketing, Sasa Marinkovic, ahead of the company’s SXSW panel “Virtual Reality: The Next News Experience,” taking place on March 14.
UploadVR: How will real-time stitching impact 360-degree production crews the most?
Sasa Marinkovic: Previous hardware and software limitations meant that it would take many hours or days to stitch a high-resolution 360-degree video, which clearly posed a major challenge for 360-degree production crews. While some stories can wait, many cannot, especially those we consider “breaking news” events.
Radeon Loom, AMD’s open-source 360-degree video-stitching framework, helps to resolve this issue, empowering production crews to capture and stitch content in real time and produce immersive news stories with almost the same immediacy as other digital or broadcast mediums.
There are still a few challenges that need to be resolved, both in terms of the kind of equipment being used and how the equipment is being used. There are various types of journalism, from news reporting (short to intermediate time frame) to investigative storytelling and documentaries (intermediate to long term). The type of story will dictate how quickly it needs to be published, and therefore the kind of editing that needs to be applied.
For a real-time setup, placement of all the equipment also needs to be considered. Each situation is unique, but you can imagine several different scenarios, such as filming an interview with a single rig or broadcasting a live concert with multiple camera rigs. With 360-degree cameras, you don’t generally get to have a camera operator behind the camera since he/she would be seen. So you probably want to locate the stitching and/or viewing PCs far away or behind a wall or green screen, for example.
UploadVR: Could this be the type of technology that could help cable news networks break into the 360-degree platform for their live showings?
Sasa Marinkovic: As mentioned, earlier hardware and software limitations meant that it would take many hours or days to stitch a high-resolution 360-degree video, which meant that 360-degree content was often reserved for longer-lead mediums, like investigative reporting and documentary journalism.
Radeon Loom, in combination with our Radeon GPUs and the fastest CPUs, enables both real-time live stitching and fast offline stitching of 360 videos. What this means is that even breaking news stories can be shown in 360-degree video, creating the ultimate immersive experience in events as they unfold.
This has the potential to invigorate journalism, and really all storytelling, adding immediacy to immersion. Just imagine how impactful it would be to watch street demonstrations live in 360 degrees, such as those that have taken place in Cairo’s Tahrir Square.
UploadVR: What are the immediate benefits of using VR for journalism? Any perceived obstacles that must be overcome?
Sasa Marinkovic: This innate desire by people to immerse themselves in 360-degree images, stories and experiences is not new to our generation. In VR, however, for the first time ever, the spectator can become part of the action – whether it’s recreating a historic moment or attending a live sporting event – from anywhere in the world. The experience in VR is like nothing we have ever experienced before, aside from real life. However, there are still some basic problems that need to be worked on, such as parallax; camera count vs. seam count; and exposure differences between sensors.
Parallax: First, the parallax problem. Very simply, two cameras in different positions will see the same object from a different perspective, just as a finger held close to your nose appears against different backgrounds when viewed with each eye open, one at a time. Ironically, this disparity is what our brain uses when combining the images to determine depth. At the same time, it causes problems when we try merging two images together to fool your eyes into thinking they’re one image.
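The geometry of the parallax problem can be sketched in a few lines of Python. This is a toy illustration, not part of any AMD tool: it assumes two cameras whose optical centers sit a small baseline apart, both looking at a point straight ahead. The angular disparity between their views shrinks with distance, which is why distant backgrounds stitch cleanly while nearby subjects ghost at the seams.

```python
import math

def angular_disparity_deg(baseline_m: float, distance_m: float) -> float:
    """Angle (in degrees) between the two cameras' lines of sight to a
    point straight ahead at `distance_m`, with the cameras `baseline_m`
    apart. This is the mismatch a stitcher has to hide at the seam."""
    # Each camera sits baseline/2 away from the rig's center.
    return 2 * math.degrees(math.atan((baseline_m / 2) / distance_m))

# Adjacent lenses on a multi-camera rig might sit ~6 cm apart.
for d in (0.5, 2.0, 10.0):
    print(f"{d:5.1f} m -> {angular_disparity_deg(0.06, d):.3f} degrees")
```

A subject half a meter from the rig produces roughly twenty times the angular error of one ten meters away, so seams that look perfect on a landscape fall apart when someone walks close to the camera.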
The number of cameras vs. the number of seams: Using more cameras to create higher resolution images with better optical quality (due to less distortion from narrower lenses as opposed to fisheye lenses), also means having more seams. This creates more opportunities for artifacts. As people and objects move across the seams, the parallax problem is repeatedly exposed with small angular differences. It’s also more difficult to align all the images when there are more cameras, and misalignment leads to ghosting. More seams also mean more processing time.
Exposure variances: Third, each camera sensor is observing different lighting conditions. For example, a video of a sunset will have a west-facing camera looking at the sun and an east-facing camera viewing a much darker region. Although clever algorithms exist to adjust and blend the exposure variations across images, this comes at the cost of lighting and color accuracy, as well as overall dynamic range. The problem is amplified in low-light conditions, potentially limiting artistic expression.
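The simplest correction, global gain matching, illustrates both the idea and its cost. This toy Python sketch (the function names are ours, not from any stitching package) assumes we already know which luminance samples the two cameras share in their overlapping region; note that the bright pixels clipping at 255 is exactly the loss of dynamic range described above.

```python
from statistics import mean

def exposure_gain(overlap_ref, overlap_other):
    """Scalar gain that matches the other camera's brightness to the
    reference camera, estimated from luminance samples both cameras
    see in their shared overlap region."""
    return mean(overlap_ref) / mean(overlap_other)

def apply_gain(frame, gain):
    # Scale and clamp to 8-bit range; real stitchers blend the gain
    # smoothly across the seam rather than applying one global value.
    return [min(255, round(p * gain)) for p in frame]

# West camera faces the sun (bright); east camera is dark. The overlap
# region both cameras see lets us estimate the exposure mismatch.
ref_overlap = [200, 210, 190]    # sunlit camera's view of the seam area
other_overlap = [100, 105, 95]   # darker camera's view of the same area
g = exposure_gain(ref_overlap, other_overlap)
print(apply_gain([100, 50, 255], g))  # bright pixels clip at 255
```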
Storage: The amount of data that 360-video rigs generate is enormous. If you’re stitching 24 HD cameras at 60 fps, you generate around 450 GB/minute.
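The arithmetic behind that figure is easy to sanity-check. This sketch computes the raw, uncompressed data rate for two common pixel formats; the quoted ~450 GB/minute sits between them, consistent with something like 10-bit 4:2:2 capture (about 2.5 bytes per pixel), though the exact figure depends on the rig's recording format.

```python
def raw_rate_gb_per_min(cameras, width, height, fps, bytes_per_pixel):
    """Uncompressed data rate for a multi-camera rig, in GB/minute."""
    bytes_per_sec = cameras * width * height * bytes_per_pixel * fps
    return bytes_per_sec * 60 / 1e9

# 24 HD (1920x1080) cameras at 60 fps:
rgb = raw_rate_gb_per_min(24, 1920, 1080, 60, 3)    # 8-bit RGB
yuv = raw_rate_gb_per_min(24, 1920, 1080, 60, 1.5)  # 8-bit 4:2:0
print(f"raw RGB: {rgb:.0f} GB/min, 4:2:0: {yuv:.0f} GB/min")
```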
We want to unleash creativity in the industry for cinematic VR video experiences and make the mechanics of creating high-quality 360 video commonplace. We’re going to do everything possible to make it as easy as possible for developers and storytellers to create great content, in real time.
AMD held a press conference at GDC to demonstrate its engagement with the gaming community and made some major VR-related announcements. AMD has long supported VR through its own LiquidVR technology and has been evangelizing VR for quite some time, so these announcements at its Capsaicin and Cream event made complete sense. GDC is a developer-focused conference, so it’s worth remembering that many of these announcements will not have a direct impact on consumers, but rather an indirect effect as a result of decisions made by developers.
The first major announcement from AMD was that it has worked with Valve to support asynchronous reprojection, Valve’s feature for improving the VR experience and eliminating judder. The feature is akin to Oculus’ asynchronous timewarp, but for Valve’s Vive platform. AMD will support it through a driver update, and Valve will support it through an update to SteamVR, the VR component of the Steam gaming platform. Valve launched the feature back in November with NVIDIA, and AMD is bringing support to its GPUs in March, a welcome addition for anyone running an AMD GPU with a Vive.
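The idea behind reprojection techniques like this can be sketched in one dimension. This toy Python function (ours, not Valve’s or AMD’s API) assumes a pinhole camera model with a known focal length in pixels: when a new frame misses its deadline, the compositor re-displays the last frame shifted by the head rotation accumulated since it was rendered, instead of repeating it in place and causing judder.

```python
import math

def reprojection_shift_px(yaw_at_render_deg, yaw_now_deg, focal_px):
    """Horizontal pixel shift to apply to the last rendered frame so it
    lines up with the user's current head yaw. A rotation-only
    approximation: real implementations reproject the whole image on the
    GPU, and no rotation-only scheme can correct positional movement."""
    delta = math.radians(yaw_now_deg - yaw_at_render_deg)
    return focal_px * math.tan(delta)

# Head turned 1 degree since the frame was rendered; a ~600 px focal
# length is a plausible ballpark for a Vive-class display panel.
print(round(reprojection_shift_px(30.0, 31.0, 600.0), 1))
```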
AMD also added support for a forward rendering path in Unreal Engine 4, one of the most popular engines in the world and one commonly used by top game developers. Forward rendering is another VR-related feature that improves overall image quality in VR, since HMDs behave differently from computer monitors, and many applications use it to run faster and look better in VR. Not all developers find forward rendering to be their cup of tea, but supporting the option is important for AMD to stay relevant in VR.
Last, but not least, AMD announced its biggest partnership of the year, and possibly in the company’s history, with game developer and publisher Bethesda. This partnership will very likely stretch into areas like VR, which is why it’s such a big deal. After all, Bethesda is releasing Fallout 4 in VR, and it will very likely ship with Vulkan, a low-level API that can squeeze the most performance out of virtually any CPU and GPU combination. AMD’s partnership with Bethesda is clearly designed to get better support for its GPU and CPU features into games and to accelerate performance in VR and other applications.
AMD did not announce anything about its new GPU, code-named Vega, other than that it will be commercially called Vega. Many people have been anticipating AMD’s newest GPUs built on the Vega architecture, but in the meantime, NVIDIA has announced its own GTX 1080 Ti, which appears to have once again raised the bar AMD must compete with.
Disclosure: Anshel’s firm, Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including AMD, NVIDIA and others. He does not hold any equity positions with any of the companies cited.
At Oculus Connect 3, the Facebook-owned VR company revealed a new minimum specification for VR-ready PC rigs that would take advantage of its Asynchronous Spacewarp (ASW) technology. In doing so, the company has also partnered with CyberPowerPC to debut an AMD-fueled VR-ready rig that will come in at the low price of $499 when bundled with an Oculus Rift headset at Best Buy or Amazon. However, it’s worth noting that the bundle price as a whole is set at $1,099.98.
Even though the rig itself was teased back in October, we didn’t know much about it at that time. Now, thanks to AMD and CyberPowerPC, we’ve got more details. For starters, the $499 price on the PC is only available if you buy a bundle from one of those two exclusive retail partners and the bundle is only available with a Rift.
And if you were to buy the PC separately, it would cost you approximately $649, meaning the bundle shaves about $150 off the cost of the PC. It’s designed for people new to VR who want to jump in with a full rig and headset all at once.
The cost of VR-ready hardware has continued to go down in the year since consumer headsets launched, making now a better time than ever to get into the market. Just last week at CES, Asus debuted the VivoPC X, a compact, lightweight desktop built for portability without sacrificing power. Last August, CyberPowerPC offered a VR-ready desktop for $720, and this new rig pushes that price even lower.
AMD has dubbed the new minimum-spec bundled computer the ‘Gamer Ultra VR Desktop PC’, and it boasts relatively impressive specs:
CPU: AMD FX 4350 @ 4.2GHz
GPU: Radeon RX 470 4GB
System RAM: 8GB DDR3
Hard Drive Capacity: 1TB 7200RPM
Optical Drive: DVD+RW, DVD-RW
Keyboard + Mouse Included: Yes
USB Connectivity: 3x USB 3.0, 7x USB 2.0
Model #: GUAVR3000B1
With bundles like this, the argument for building your own PC becomes more and more difficult to make. You’d be hard-pressed to make something of this quality at this price point, including a keyboard and mouse, without sacrificing something else.
The battles between Intel, AMD, and NVIDIA are a constant clash of giants for the hearts and minds of the computing world and, more specifically, gamers. Intel has edged out AMD in recent years in terms of processing, and the two will collectively continue to dominate that field for years to come, but AMD may have something up its sleeve to tip the scales. Today, the company is announcing its new high-end Ryzen processor.
AMD has detailed 16 AM4 motherboards from five manufacturers while also exhibiting third-party CPU coolers and extreme-performance PC designs based on the Ryzen architecture at CES 2017.
AMD had a large collection of companies exhibiting PCs powered by Ryzen, from Cybertron PC and Mindfactory to Origin PC and many more. The motherboards in these PCs were based on the X370 chipset for more robust builds with the option to overclock, and the X300 chipset for more compact PC builds. Both feature dual-channel DDR4 memory, NVMe and M.2 SATA devices, USB 3.1 Gen 1 and Gen 2, and PCIe 3.0 capability.
These reveals will hopefully inspire manufacturers to start shaping builds from the ground up with Ryzen in mind and create more competition with Intel’s i7 within the extreme-performance community. In the provided press release, Jim Anderson, SVP of AMD’s computing and graphics group, declared 2017 would be an “unforgettable year for AMD”. Depending on the reception of this showing at CES, it may be one of the company’s most important to date.
While of course not ignoring the extreme performance community, AMD is also aiming to reel in VR pioneers with the Ryzen. As virtual creations continue to become more and more sophisticated and complex, there will continue to be a power struggle as AMD and Intel attempt to position themselves as the definitive processor solution for developers worldwide.
Be sure to stay tuned to UploadVR for more CES coverage over the next few days.
The continued growth of virtual reality requires that AMD, NVIDIA, and other major forces in computing come to the table with significant updates and upgrades centered around the technology. VR isn’t commonly consumed yet on a massive scale, but enthusiasts and trailblazers will want to be able to handle the new experiences as they come. AMD shared details with us on the latest Radeon update, 16.12.1, which has gone live and comes packed with a stout set of features including a few focused on VR.
The update, called Crimson ReLive Edition, was shaped with three pillars in mind: features, professional-grade stability, and performance. The team is coming off a high point, with Crimson Edition receiving a 90% approval rating from users, the highest it has ever received. The ReLive feature is the most prominent, as its place in the update’s title suggests. It provides AMD users with a robust, lightweight on-board tool for creating PC gameplay highlights, a social experience that is growing in importance, and it already integrates with multiple platforms such as YouTube, Twitch, Panda.TV, and Huya. Another significant addition is the dynamic Chill feature, which adjusts frame rates based on the intensity of the action taking place on screen.
Underneath all of this are a couple of enhancements to LiquidVR, AMD’s collection of features for seamless compatibility and optimization of VR-specific experiences. Multi-view and multi-res rendering work together to reduce excessive and redundant processing and improve performance. TrueAudio Next provides immersive sound based on real-time physics, and Affinity scales across multiple GPUs by having each GPU render a separate lens in VR.
You can find release notes for the Crimson ReLive Edition here and read up on a few additional updates surrounding HDR gaming, supported cards, fixed/known issues in the update, and more.
In mid-October, we learned that a VR experience was in development to accompany the theatrical release of the Assassin’s Creed feature film starring Michael Fassbender (X-Men: Days of Future Past, Inglourious Basterds). As it turns out, the project is a large-scale collaboration between AMD, Alienware, Practical Magic, 20th Century Fox, New Regency, and Ubisoft. Here’s how they made it happen.
The 360-degree experience was filmed by Practical Magic in cooperation with 20th Century Fox and New Regency. The theatrical activation will feature kiosks with Oculus Rift headsets powered by Alienware Aurora PCs using AMD Radeon RX 480 graphics cards.
The five-minute experience promotes the upcoming Ubisoft Films and 20th Century Fox movie starring Michael Fassbender, which opens Dec. 21. Matthew Lewis, CEO and founder of Practical Magic, told UploadVR that from the outset, the goal was to raise the visual-quality bar for VR videos.
Michael Fassbender as Aguilar de Nerha in the Assassin’s Creed film.
“The film Justin Kurzel and Adam Arkapaw shot is beautiful, and we wanted to make sure the VR experience kept up,” Lewis said. “This meant we were going to be building a lot of new production and post-production technology, which is what ended up happening at Practical Magic.”
While the Spanish Inquisition scene in the big budget film was shot on location in Malta, the VR experience was shot in Los Angeles. Lewis said his team went out with drones and scanning equipment to painstakingly scan the set, props, and other elements from the film production in Malta and London.
“Over the course of a few days, we scanned the world of the movie, and took it back with us to Los Angeles,” Lewis said. “We were then able to recreate the set both physically in the art department, and in the computer at extremely high resolution.”
A cast of 50 to 60 people assembled in Los Angeles to bring the Spanish Inquisition to life. Because of the 360-degree nature of the experience, Lewis said a lot of the background talent ended up featured very prominently in the sequence.
The Animus from the original Assassin’s Creed video game.
“There’s action happening all around you,” Lewis said. “If you watch it more than once, you definitely see things you missed the first time that add to the experience. Right at the very beginning, you get a full true 360 view of the Animus — every last inch of it — so you can study it in great detail and see things you might have missed in the movie. It’s a gorgeous set full of props and eye candy, and it’s the same exact set you see in the film.”
Practical Magic produced the show in segments over the course of 2016 in London, Malta, and Practical Magic’s VR studio in Burbank, and it took a laundry list of new technology to pull it off.
“We weren’t happy with any 360 camera rig at all, so almost all of the action was captured using motion control rigs, including some of our own invention,” Lewis said. “We used mostly RED Dragon cameras and shot multiple passes of everything, with a baseline resolution of 6K. The hallway fight scene is actually 26 passes of 6K images composited together covering different angles of the scene. When you see it at full resolution, it feels cinematic — it’s rich, sharp and detailed. The dynamic range is there — it doesn’t feel muddy or overly-compressed. That was hugely important to us. We also really need to call out Litegear, our lighting supplier, who provided literally hundreds of individually controllable LED light fixtures that allowed us to perfectly manage the world light during motion control. We couldn’t have done it without them.”
Post-production was done in-house at Practical Magic, using Nuke and CaraVR for compositing, Maya for 3D and V-Ray for rendering, and After Effects for a few key tasks, along with some custom software, plugins, and tools of their own. Lewis said the studio has a solid on-premises render farm built specifically for VR, so every frame seen on screen was rendered there.
Michael Fassbender fighting as Aguilar de Nerha in the Assassin’s Creed film.
“Editing itself was only a fraction of the post-production work,” Lewis said. “The visual effects component was very complex, and took months of work. The post-production pipeline for VR industry-wide is very immature and the software is alpha quality at best. We were also pushing our hardware to the absolute limit — imagine trying to work with 26 video streams of 6K footage at the same time in the same shot. We needed the best hardware you can get your hands on, and that’s what it took to get the job done. Otherwise, we’d still be sitting here watching progress bars.”
Fassbender plays Aguilar de Nerha in the film, an original character that’s part of a new story that ties into the universe of Ubisoft’s bestselling video game franchise. While filming last year at Pinewood Studios, Practical Magic shot Fassbender for this exclusive VR experience.
“We shot him on stage in London and he was a great sport,” Lewis said.
While Fassbender is the central character in the big screen adventure, the VR experience allows users to step into the boots of an original character.
A screenshot from the Assassin’s Creed VR Experience trailer embedded above.
“The viewer is not playing Aguilar — that’s a job best left to Michael Fassbender’s talents,” Lewis said. “I don’t want to give too much away, but yes the viewer is an Assassin.”
Gamers will also recognize Easter Eggs in the VR piece, according to Lewis. These occur mostly in the first scene, which was shot in the Animus set from the movie. There are other elements inspired by the video game franchise, as well.
“We move the camera a lot, which means there are a number of major parkour-type moves the viewer will experience,” Lewis said. “There are plenty of classic, tried-and-true Assassin’s Creed moves. There’s one part that always makes people scream a bit, which is exciting to watch.”
Lewis said the team knew the games, and immediately everyone said, “We have to do the Leap of Faith in VR!” So naturally, Lewis jokes, “I don’t want to give it away by saying we did a Leap of Faith in VR, but I mean, we did a Leap of Faith in VR, obviously.”
Lewis said the last few years of experience working on complex cinematic projects like Capture for The CW have been invaluable.
“We like to move VR cameras while we’re shooting, which is traditionally considered very difficult — so moving VR cameras has kind of become our thing at Practical Magic,” Lewis said. “A couple of years ago we built a cinematic VR camera rig for Google that Justin Lin used to produce Help!, which won the Gold Lion for VR at Cannes this year. We’ve continued to build all manner of cinematic VR rigs since. If we didn’t have engineering and rapid prototyping in-house to build our own VR gear, and a lot of really experienced technical people, we couldn’t have pulled any of this off.”
Gamers will get a first look at the Assassin’s Creed VR Experience today for free through the Oculus Video app on both Oculus Rift and Samsung Gear VR, as well as a 360-degree video on Facebook. Additionally, moviegoers will see a national theatrical roll-out of the experience at AMC theaters in San Francisco, Los Angeles, Austin, and New York City between Dec. 2 and Jan. 1.