Take a Trip Aboard the ISS in Latest VR Film From Lauded Immersive Filmmakers Felix & Paul

Felix & Paul, the studio known for its pioneering work in cinematic immersive film, is set to release the first installment in a new VR trilogy shot aboard the International Space Station (ISS), slated to offer stunning views of Earth from low orbit.

Called Space Explorers – Blue Marble, the first in the series is set to launch on Earth Day, April 22nd. The trilogy will be available for free on Meta Quest and Quest 2 headsets.

The immersive film series aims to provide a deeper understanding of our planet’s place in the universe and the importance of protecting it for future generations.

The first episode offers up an unobstructed, 360-degree view, filmed at the nadir of the ISS, which points directly at the Earth below.

The studio’s goal is to offer the viewer a sort of virtual ‘Overview Effect’, the phenomenon that occurs when astronauts see Earth from space and experience a profound shift in their perspective on and understanding of the planet.

Founded by Félix Lajeunesse and Paul Raphaël in 2013, the studio has created a number of original immersive film productions, including The Space Explorers series, Traveling While Black, and Strangers with Patrick Watson. The studio has also created productions with existing franchises, including Jurassic World, Cirque du Soleil and Fox Searchlight’s Wild and Isle of Dogs.

Felix & Paul has won a host of awards over the years, including five Canadian Screen Awards, two Primetime Emmy Awards, and a Daytime Emmy, along with numerous other awards and nominations.

You can catch Space Explorers – Blue Marble for free exclusively on Quest devices, with the first episode launching Saturday, April 22nd.

This Pioneering Fractal Artist is Returning to VR with a New Album Soon

It’s been a while since we last wrote about Julius Horsthuis, a visual effects artist who authored a series of fractal 360 videos for VR which are simply mind-blowing—both now and back in the good ol’ days of the Oculus Rift DK2 when we first experienced them. Now the Dutch artist announced a new album is coming to VR headsets sometime this year that’s slated to throw us head-first back into his contemplative fractal worlds.

Called ‘Recombination’, the VR album features fractal visuals created in Mandelbulb3D—par for the course with Horsthuis’ pareidolia-inducing creations that recall alien worlds, life, the universe, and everything.

The experience isn’t real-time rendered—that simply wouldn’t be possible given the level of detail. Instead, it’s been pre-rendered and captured in 180-degree, stereoscopic 3D video at 4,096 × 4,096 pixels per eye and 60fps.
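For a sense of how demanding that format is, here's a quick back-of-the-envelope estimate of the raw, uncompressed pixel throughput (the 8-bit RGB assumption is ours, not a published spec):

```python
# Rough uncompressed data-rate estimate for 4096x4096 per-eye,
# stereoscopic video at 60 fps. Illustrative only; delivered files
# are heavily compressed (e.g. H.265) and far smaller.
width, height, eyes, fps = 4096, 4096, 2, 60
bytes_per_pixel = 3  # assumed 8-bit RGB, ignoring chroma subsampling

pixels_per_second = width * height * eyes * fps
raw_bytes_per_second = pixels_per_second * bytes_per_pixel

print(f"{pixels_per_second / 1e9:.2f} gigapixels/s")
print(f"{raw_bytes_per_second / 1e9:.2f} GB/s uncompressed")
```

That works out to roughly 2 gigapixels (about 6 GB of raw image data) per second, which helps explain why delivering such video depends on aggressive compression.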

Back when Horsthuis announced the project in December, he said it was taking “a hell of a long time to render,” which is no surprise considering the level of detail and overall length. Recombination is set to be his longest VR experience to date, clocking in at “well over 30 minutes.”

But what does it all mean? Horsthuis says Recombination explores “various themes in Physics, Math and Biology.” We’ll just have to wait to find out more than that, it seems.

Horsthuis published his first fractal video back in December 2013, though we’re more familiar with Foreign Nature from 2015, his first 360 fractal video made especially for VR headsets. Since then, Horsthuis has published nine VR shorts, ranging from five to ten minutes in length.

There’s no firm release date for Recombination yet—just “2022”. In the meantime we’ll be keeping our eyes glued to the artist’s YouTube channel and Twitter.

The post This Pioneering Fractal Artist is Returning to VR with a New Album Soon appeared first on Road to VR.

NBA League Pass Games Return to Quest in ‘Horizon Venues’ This Month

The NBA officially kicked off its 2021-22 regular season in mid-October, after last year’s season was derailed by the COVID-19 pandemic. Starting this week, Meta (formerly Facebook) is welcoming Quest users back to Horizon Venues for some more courtside action.

You’ll be able to catch the first game on November 14th, which features the Golden State Warriors vs. the Charlotte Hornets. Check out November’s full NBA VR lineup below:

To watch, users need the paid NBA League Pass, which comes with a few other caveats. Meta says in a blog post that League Pass games will only be open to users based in the US, and only to those outside a 50-mile radius of the two teams in a given game. That’s the wonderful world of broadcast syndication for you.

The newly rebranded Horizon Venues (previously just Venues) offers multi-user spaces for viewing large events, meaning you can watch games courtside with friends and tune into exclusive play-by-play commentary from NBA champion Richard Jefferson, sportscaster Adam Amin, and more.

Meta says it will announce more participating games in the future; in the meantime, here’s November’s upcoming schedule. Click the links below to subscribe for an event reminder.

You can check out the full line-up of other Venues events here.

Canon Introduces 180° Stereoscopic Lens to Support a “bright future for VR content creation”

Canon, one of the world’s leading camera makers, today introduced a new dual-optic lens which captures 180° stereoscopic views through a single sensor on the company’s high-end EOS R5 camera.

Canon today announced what it calls the EOS VR System which includes its new dual-optic camera lens, new firmware for its EOS R5 camera to support immersive capture, and new software for handling post-processing.

The new RF5.2mm F2.8 L Dual Fisheye lens is interesting because it captures both views on the single image sensor of the Canon EOS R5 camera. Although this divides the sensor’s resolution between the two views (because both are captured in the same frame), it also stands to simplify 180° capture, since both views will necessarily match in time sync, alignment, color, calibration, and focus. If any of these factors don’t match, the viewing experience suffers, because it’s uncomfortable for the eyes to reconcile discrepancies between the two views. Capturing this way also means the output is a single file for both eyes, which can streamline post-production compared to cameras that capture each eye’s view in a separate file (or many views which need to be stitched together).

Image courtesy Canon

The lens has an aperture of f/2.8 to f/16 and can be focused as close as 8-inches. The distance between the lenses is fixed at 60mm to be close to the typical human IPD. The company plans to update its Canon Connect and EOS Utility programs to offer a remote live-view through the lens for monitoring and shooting at a distance. Canon says the lens will be available in late December and priced at $2,000.

Around that time the company will also release two pieces of subscription-based software, an EOS VR Utility and EOS VR plug-in for Adobe Premiere Pro.

The EOS VR Utility will be able to convert the captured files from dual-fisheye to an equirectangular projection (which is supported by most immersive video players), as well as make “quick edits” and choose the resolution and file format before exporting.
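Canon hasn’t published the internals of the EOS VR Utility, but the core of any dual-fisheye-to-equirectangular conversion is a per-pixel remapping between the two projections. Here’s a minimal sketch assuming an ideal equidistant fisheye model (a real lens would need a calibrated distortion profile):

```python
import math

def equirect_to_fisheye(u, v, fisheye_fov=math.pi):
    """Map normalized equirectangular coords (u, v in [0, 1]) of a
    180-degree frame to normalized fisheye image coords, assuming an
    ideal equidistant fisheye (image radius proportional to angle)."""
    # Longitude/latitude spanned by a 180-degree equirect frame
    lon = (u - 0.5) * math.pi   # -90 to +90 degrees
    lat = (v - 0.5) * math.pi   # -90 to +90 degrees

    # Direction vector for this lon/lat (camera looks down +z)
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)

    # Angle from the optical axis, and azimuth around it
    theta = math.acos(max(-1.0, min(1.0, z)))
    phi = math.atan2(y, x)

    # Equidistant projection: radius grows linearly with theta
    r = theta / (fisheye_fov / 2)
    return 0.5 + 0.5 * r * math.cos(phi), 0.5 + 0.5 * r * math.sin(phi)
```

In practice this mapping is evaluated in reverse for every output pixel with interpolation (e.g. via OpenCV’s `remap`), once per eye using the appropriate half of the sensor frame.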

The EOS VR plug-in for Premiere Pro will enable equirectangular conversion right inside of Premiere and allow the footage to be easily managed within other Adobe Creative Cloud apps.

The company has yet to announce pricing for either utility.

Canon calls the new lens “an important milestone in our company’s rich history as a lens manufacturer,” and says it “welcomes a bright future for VR content creation.”

“This new RF lens produces a stunning 8K virtual reality image and sets itself apart through its simplified workflow. Our goal is to make immersive storytelling more accessible for all,” says Tatsuro “Tony” Kano, EVP and GM of Canon Imaging Technologies & Communications Group.

Live-action immersive video was thought by many to be the next generation of filmmaking in the early days of modern VR, but it hasn’t seen nearly as much traction as pre-rendered CGI or real-time rendered content. Complicated immersive camera systems surely didn’t help, and to that end, Canon hopes its new lens and software tools can make a difference.

However, most live-action immersive video also lacks volumetric capture, meaning the view can rotate (3DOF) but can’t also move through 3D space (6DOF), which tends to be less comfortable and immersive than content that can. Several companies have worked toward volumetric live-action capture, but key players like Lytro and NextVR ultimately didn’t survive and were sold off before finding a market fit.

Whether or not simplified capture and production pipelines are enough to reboot 3DOF live-action immersive content remains to be seen.

In addition to its new lens, Canon has also experimented with XR headsets, most recently the MREAL S1 which it showed off earlier this year.

Baobab Studios Nabs 9th Emmy Award Thanks To Baba Yaga

Baba Yaga won an Emmy for Outstanding Interactive Media for a Daytime Program at the 48th Annual Emmy Awards, bringing Baobab Studios’ total Emmy awards wins to nine.

Baba Yaga also previously won two other Emmy awards — Outstanding Directing Team for Animated Program (Eric Darnell, director, and Mathias Chelebourg, co-director) and Outstanding Individual Achievement in Animation for Character Design (Karl Athannossov).

“It was already such an honor and thrill for Baba Yaga to be awarded Emmys for both character design and direction,” said Baba Yaga director and writer Eric Darnell. “And now, to be honored with the award for best interactive is truly spectacular and a reflection of the team’s monumental efforts to give the audience the opportunity to be the hero of their own story.”

The immersive VR movie released in January exclusively for Oculus Quest after premiering at various film festivals in 2020, including the Venice Film Festival almost exactly a year ago. In our review at the time, we found it entertaining but a little safe; it certainly wasn’t the ambitious step up we were hoping to see from Baobab:

This is a likable, sweet but ultimately safe adventure, highlighted by some great VR novelties but also equally limited by them. I want to see something from this studio that feels truly dynamic — like my presence has genuine impact in the stories it creates. Baobab remains one of VR’s most promising storytellers but, four years on from its debut, I feel like I’m still waiting for it to really hit its stride.

You can read our full review from last year here.

Baba Yaga is available now on Oculus Quest for $5.99.

Stunning View Synthesis Algorithm Could Have Huge Implications for VR Capture

As far as live-action VR video is concerned, volumetric video is the gold standard for immersion. And for static scene capture, the same holds true for photogrammetry. But both methods have limitations that detract from realism, especially when it comes to ‘view-dependent’ effects like specular highlights and lensing through translucent objects. Research from Thailand’s Vidyasirimedhi Institute of Science and Technology shows a stunning view synthesis algorithm that significantly boosts realism by handling such lighting effects accurately.

Researchers from the Vidyasirimedhi Institute of Science and Technology in Rayong, Thailand published work earlier this year on a real-time view synthesis algorithm called NeX. Its goal is to use just a handful of input images of a scene to synthesize new frames that realistically portray the scene from arbitrary points between the real images.

Researchers Suttisak Wizadwongsa, Pakkapon Phongthawee, Jiraphon Yenphraphai, and Supasorn Suwajanakorn write that the work builds on top of a technique called multiplane image (MPI). Compared to prior methods, they say their approach better models view-dependent effects (like specular highlights) and creates sharper synthesized imagery.
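For context, an MPI represents a scene as a stack of fronto-parallel RGBA layers placed at fixed depths; a novel view is produced by warping the layers toward the target viewpoint and alpha-compositing them back-to-front. Here’s a toy version of just the compositing step (NeX’s actual contribution, a learned view-dependent neural basis for each pixel’s color, is omitted):

```python
import numpy as np

def composite_mpi(layers):
    """Back-to-front 'over' compositing of an MPI layer stack.

    layers: list of (H, W, 4) float arrays with straight (not
    premultiplied) alpha, ordered from farthest to nearest.
    Returns an (H, W, 3) RGB image."""
    h, w, _ = layers[0].shape
    out = np.zeros((h, w, 3))
    for layer in layers:  # far to near
        rgb, alpha = layer[..., :3], layer[..., 3:4]
        # Each nearer layer covers the accumulated result by its alpha
        out = rgb * alpha + out * (1.0 - alpha)
    return out
```

With a fully opaque near layer, for example, the far layers are completely hidden; semi-transparent alphas blend depth edges smoothly, which is what gives MPIs their soft, hole-free parallax.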

On top of those improvements, the team has highly optimized the system, allowing it to run easily at 60Hz—a claimed 1000x improvement over the previous state of the art. And I have to say, the results are stunning.

Though not yet highly optimized for the use-case, the researchers have already tested the system using a VR headset with stereo-depth and full 6DOF movement.

The researchers conclude:

Our representation is effective in capturing and reproducing complex view-dependent effects and efficient to compute on standard graphics hardware, thus allowing real-time rendering. Extensive studies on public datasets and our more challenging dataset demonstrate state-of-art quality of our approach. We believe neural basis expansion can be applied to the general problem of light-field factorization and enable efficient rendering for other scene representations not limited to MPI. Our insight that some reflectance parameters and high-frequency texture can be optimized explicitly can also help recovering fine detail, a challenge faced by existing implicit neural representations.

You can find the full paper at the NeX project website, which includes demos you can try for yourself right in the browser. There are also WebVR-based demos that work with PC VR headsets if you’re using Firefox, but unfortunately they don’t work with Quest’s browser.

Notice the reflections in the wood and the complex highlights in the pitcher’s handle! View-dependent details like these are very difficult for existing volumetric and photogrammetric capture methods.

Volumetric video capture that I’ve seen in VR usually gets very confused by these sorts of view-dependent effects, often struggling to determine the appropriate stereo depth for specular highlights.

Photogrammetry, or ‘scene scanning’ approaches, typically ‘bake’ the scene’s lighting into textures, which often makes translucent objects look like cardboard (since the lighting highlights don’t move correctly as you view the object at different angles).

The NeX view synthesis research could significantly improve the realism of volumetric capture and playback in VR going forward.

Arcturus Raises $5 Million to Expand Volumetric Video Toolset & Streaming

Arcturus, a company building tools for editing and distributing volumetric video, today announced it has raised a $5 million seed investment.

Distinct from stereoscopic video, volumetric video is fully three-dimensional and can be viewed from all angles, which makes it potentially well suited for use in augmented and virtual reality. Volumetric video isn’t yet widespread, owing to challenges with capture, storage, editing, and distribution.

With its ‘Holosuite’—HoloEdit, HoloCompute, and HoloStream—Arcturus hopes to streamline the use of volumetric video, by making it easy to edit, manage, and stream.

The company today announced a $5 million seed investment led by BITKRAFT Ventures with participation from HBSE Ventures, NTT Docomo Ventures, Build Ventures, Marc Merrill, and Craig Kallman.

Arcturus says the funds will be used to “scale the software development team, focus efforts on sales growth, and expand the product line with an emphasis on live-streaming features.”

“Arcturus’ mission is to create a future where digital human holograms are captured from reality, customized and even interact with the viewer in real time. This can take the form of digital customer service agents, human avatars, virtual 3D concerts and fashion runways, or giving access to the perspectives of professional athletes in broadcast sports,” says Arcturus CEO, Kamal Mistry. “With the backing of BITKRAFT Ventures, true leaders in games and XR investments, we are confident Arcturus will serve as a catalyst to enable widespread accessibility to volumetric video creation, enabling millions of users to create a new form of interactive content.”

Capturing live-action volumetric video remains a complex process, often requiring dedicated light-stages with tens if not hundreds of cameras surrounding the subject. The resulting datasets are also massive compared to traditional or even stereoscopic video.

Microsoft’s Mixed Reality capture stage | Image courtesy Microsoft

But that could well change in the future thanks to developments in both hardware and software.

Researchers in recent years have shown compelling results using machine learning approaches to reconstruct volumetric video from traditional video footage. Hardware built specifically for capturing volumetric data—like Microsoft’s Azure Kinect or Apple’s LiDAR-equipped phones & tablets—could streamline the capture process and expand the use-cases of volumetric video from dedicated capture stages to less complex productions.

Arcturus doesn’t deal in the actual capture of volumetric video, but it’s counting on growing demand and wants to be ready with its suite of tools for creators to store, edit, and stream the content. Given how new the tech is, though, it won’t be something individual users pick up for some time. That much is clear from Arcturus’ Holosuite pricing: a cool $7,500 per year, per user.

Baobab Studios’ Baba Yaga Releases January 14, Exclusively For Oculus Quest

Baobab Studios announced that its next animated VR film Baba Yaga will release on January 14 exclusively for Oculus Quest and Oculus Quest 2.

The VR film is about two sisters who journey into a haunted forest to seek a cure for their sickly mother, where they will have to confront the mysterious witch, Baba Yaga. The film is a first-person experience, where you embody one of the two sisters and participate in some minor interactive moments. A new trailer was released ahead of launch, which you can watch below:

Baba Yaga has a star-studded cast with Daisy Ridley playing your sister, Magda, with Glenn Close as the mother, Jennifer Hudson as the forest and Kate Winslet as Baba Yaga herself. The film premiered at several film festivals last year and Jamie Feltham and I both got a chance to watch the film ahead of release. We came to the same conclusion — it’s a charming, if safe, experience from the Baobab team that left us craving just a little bit more. Here are the closing remarks from Jamie’s review:

I want to see something from this studio that feels truly dynamic — like my presence has genuine impact in the stories it creates. Baobab remains one of VR’s most promising storytellers but, four years on from its debut, I feel like I’m still waiting for it to really hit its stride.

You can read the full review here.

Baba Yaga runs for approximately half an hour and will be available on the Oculus Store for Quest and Quest 2 from January 14 for $5.99.

Virtual Reality Relaxation Program from Telekom and Magic Horizons (sponsored post)

Telekom invites you to an exclusive virtual reality relaxation program. It’s available in Magenta VR, the telecom provider’s interactive VR app, and includes several 360-degree videos that transport you to stunning places with distinctive soundscapes.

A Virtual Reality Relaxation Program with Binaural Music for Deep Recovery

With the new Relax VR videos, Telekom, in cooperation with the software company Magic Horizons, shows how relaxing even a few minutes in a virtual environment can be. Many people face stressful situations both at work and in their private lives. Over the long term, this strain can reduce quality of life and harm overall health. It’s important, then, to leave the daily grind behind now and again to reduce stress, and Magic Horizons has produced videos for exactly that purpose. A study by the Humboldt University of Berlin confirms the positive effect of virtual reality applications:

“The studies show that VR is currently the best available technology for achieving the degree of felt presence that is decisive for a strong relaxation effect. The visual intensity is reinforced by stereoscopic content, which Magic Horizons also offers. The high render quality of the Magic Horizons experiences likewise plays a decisive role and strengthens the sense of presence,” confirms Dr. Christian Stein, lecturer for the base project Virtual and Real Architecture of Knowledge, IBI at HU Berlin. “Magic Horizons’ approach of creating binaural, orchestral musical experiences combined with stereoscopic imagery in virtual reality thus aims at relaxation and the reduction of stress and anxiety.”

Here’s a bit more about the released videos:

Day at the River

Find peace along the Isar river in the heart of Bavaria’s Karwendel mountains. Simply let go and watch the river’s turquoise water flow through this unspoiled landscape. The cool, clear water lets everyday thoughts fade into the background. According to the Humboldt study for Magic Horizons, the turquoise-green color has a particularly calming effect on the human psyche.

Dolphins’ Dream

Have you always dreamed of swimming and diving with dolphins? Now you can fulfill that wish with Magic Horizons in virtual reality. Dive into a pod of dolphins and watch their magical dance.

Gorge Walk

Recharge in the unspoiled landscape of the Alps. Unwind on a guided mountain hike and let your mind drift beside clear alpine streams. All that green has a soothing, relaxing, and liberating effect on the human psyche.

Golden Autumn

Breathe deeply on a beautiful autumn day in Lithuania. Golden leaves, secluded forests, and romantic lakes invite you to linger. The beauty of this landscape, paired with binaural music, quickly induces a relaxed state and leaves everyday worries behind.

All relaxation videos are now available for free in the Magenta Virtual Reality app. The app is available for iOS, Android, Oculus Go, and Cardboard-style headsets such as Samsung Gear VR.

The post Virtual Reality Relaxation Program from Telekom and Magic Horizons (sponsored post) appeared first on VR∙Nerds.

Dragon Ball Movies And 30+ Paramount Titles Coming To Bigscreen

The Bigscreen team has shared some updates on new content and features coming to the platform in the next month, including the addition of three Dragon Ball movies.

On Sunday, September 12, the three latest Dragon Ball anime movies will premiere in Bigscreen:

  • Dragon Ball Z: Battle of Gods — 10am PT
  • Dragon Ball Z: Resurrection ‘F’ — 3pm PT
  • Dragon Ball Super: Broly — 7pm PT

In addition to the premiere screenings, the movies will be available to rent on-demand from Bigscreen’s movie rental catalog. The Dragon Ball films will be available for users in the US, UK, Canada, Ireland, Australia, and New Zealand for $3.99 each.

There’s also a tie-in Dragon Ball contest with tickets to the screenings up for grabs. To enter, users simply need to find all seven Dragon Balls in a mini game posted to the Bigscreen Twitter and Facebook pages at 8am PT on September 2. The first person to complete the game and the person who completes it fastest (on their first attempt only) will each win a free ticket to a screening.

However, the Dragon Ball movies aren’t the only titles being added to the rental catalog — Bigscreen is expanding its partnership with Paramount Pictures and adding over 30 new movies as on-demand rentals. This includes the legendary Godfather trilogy and the Mission: Impossible series, as well as Jackass 3D, which will be the first 3D movie to premiere on the platform in the United Kingdom.

The team will also launch an accessibility update, adding English Closed Captions for the deaf and hard of hearing community. Bigscreen is also currently developing support for hand tracking on the Oculus Quest, which the team hopes will allow users to communicate in ASL (American Sign Language) while using Bigscreen.

Just last week, Bigscreen added DLNA support, allowing users to remotely stream video and audio content from their media server into the app.