Exclusive: Lytro Reveals Immerge 2.0 Light-field Camera with Improved Quality, Faster Captures

Lytro’s Immerge light-field camera is meant for professional high-end VR productions. It may be a beast of a rig, but it’s capable of capturing some of the best-looking volumetric video I’ve laid eyes on yet. The company has revealed a major update to the camera, the Immerge 2.0, which, through a few smart tweaks, makes for much more efficient production and higher quality output.

Light-field specialist Lytro, which picked up a $60 million Series D investment earlier this year, is making impressive strides in its light-field capture and playback technology. The company is approaching light-field from both live-action and synthetic ends; last month Lytro announced Volume Tracer, software that generates light-fields from pre-rendered CG content, enabling ultra-high fidelity VR imagery that retains immersive 6DOF viewing.

Immerge 2.0

Immerge 2.0 | Image courtesy Lytro

On the live-action end, the company has been building a high-end light-field camera which they call Immerge. Designed for high-end productions, the camera is actually a huge array of individual lenses which all work in unison to capture light-fields of the real world.

At a recent visit to the company’s Silicon Valley office, Lytro exclusively revealed to Road to VR the latest iteration of the camera, which they’re calling Immerge 2.0. The form-factor is largely the same as before—an array of lenses all working together to capture the scene from many simultaneous viewpoints—but you’ll note an important difference if you look closely: the Immerge 2.0 has alternating rows of cameras pointed off-axis in opposite directions.

With the change to the camera angles, and tweaks to the underlying software, the lenses on Immerge 2.0 effectively act as one giant camera that has a wider field of view than any of the individual lenses, now 120 degrees (compared to 90 degrees on the Immerge 1.0).

Image courtesy Lytro

In practice, this can make a big difference to the camera’s bottom line: a wider field of view allows the camera to capture more of the scene at once, which means it requires fewer rotations of the camera to capture a complete 360 degree shot (now with as few as three spins, instead of five), and provides larger areas for actors to perform. A new automatic calibration process further speeds things up. All of this means increased production efficiency, faster iteration time, and more creative flexibility—all the right notes to hit if the goal is to one day make live-action light-field capture easy enough to achieve widespread traction in professional VR content production.
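As a rough sketch of that spin arithmetic (the overlap figure below is an assumption for illustration, not a number Lytro has published), the required rotation count falls out of each capture’s effective angular coverage:

```python
import math

def spins_needed(fov_deg: float, overlap_deg: float) -> int:
    """Rotations needed to cover a full 360 degrees when each capture
    spans fov_deg but must overlap its neighbors by overlap_deg."""
    effective = fov_deg - overlap_deg
    return math.ceil(360 / effective)

# Assuming ~18 degrees of overlap between adjacent captures for blending:
print(spins_needed(90, 18))   # Immerge 1.0: 5 spins
print(spins_needed(120, 0))   # Immerge 2.0, assuming no overlap needed: 3 spins
```

With those assumed overlaps, the numbers line up with the five-spin and three-spin figures quoted above; the real rig’s overlap requirements may differ.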

Ever Increasing Quality

Lytro has also been refining their software stack which allows them to pull increasingly higher quality imagery derived from the light-field data. I saw a remastered version of the Hallelujah experience which I had seen earlier this year, this time outputting 5K per-eye (up from 3.5K) and employing a new anti-aliasing-like technique. Looking at the old and new version side-by-side revealed a much cleaner outline around the main character, sharper textures, and distant details with greater stereoscopy (especially in thin objects like ropes and bars) that were previously muddled.

What’s more, Lytro says they’re ready to bump the quality up to 10K per-eye, but are waiting for headsets that can take advantage of such pixel density. One interesting aspect of all of this is that many of the quality-enhancing changes that Lytro has made to their software can be applied to light-field data captured prior to the changes, which suggests a certain amount of future-proofing available to the company’s light-field captures.

– – — – –

Lytro appears to be making steady progress on both live action and synthetic light-field capture & playback technologies, but one thing that’s continuously irked those following their story is that none of their light-field content has been available to the public—at least not in a proper volumetric video format. On that front, the company promises that’s soon to be remedied, and has teased that a new piece of content is in the works and slated for a public Q1 release across all classes of immersive headsets. With a bit of luck, it shouldn’t be too much longer until you can check out what the Immerge 2.0 camera can do through your own VR headset.

The post Exclusive: Lytro Reveals Immerge 2.0 Light-field Camera with Improved Quality, Faster Captures appeared first on Road to VR.

Now Available – Light-field Captured ‘Hallelujah’ is a Stunning Mix of Volumetric Film and Audio

Hallelujah is a new experience by VR film studio Within, shot with Lytro’s latest Immerge light-field camera, whose volumetric footage makes for a much more immersive experience than traditional 360 video. Hallelujah is a performance of Leonard Cohen’s 1984 song of the same name, and mixes the latest in VR film capture technology with superb spatial audio to form a stunning experience.

Update (9/22/17, 3:49PM PT): A spokesperson for the project has confirmed that the version of the experience just released is mastered from the original light-field capture, but unfortunately takes the form of a 360 video rather than true volumetric video, even on the desktop VR headsets that support positional tracking. We’ve inquired if and when the volumetric version will be made available.

Update (9/22/17): The light-field captured piece, Hallelujah, is finally available to the public through the Within app on just about every mobile and desktop VR platform, for free. Head to the Within website to be directed to the app for your platform of choice.

Photo courtesy Lytro

Original Article (4/23/17): Lytro’s Immerge camera is unlike any 360 camera you’ve seen before. Instead of shooting individual ‘flat’ frames, the Immerge camera has a huge array of cameras which gather many views of the same scene, data which is crunched by special software to recreate the actual shape of the environment around the camera. The big benefit is that playback puts the viewer in a virtual capture of the space, allowing for a limited amount of movement within the scene, whereas traditional 360 video only captures a static viewpoint which is essentially stuck to your head. The Immerge camera also provides true stereo and outputs a much higher playback quality. The result is a much richer and more immersive VR film experience than what you’ve seen with traditional 360 video shoots.

After a recent visit to Lytro to check out the latest version of the Immerge camera, I concluded, “All-in-all, seeing Lytro’s latest work with Immerge has further convinced me that today’s de facto 360-degree film capture is a stopgap. When it comes to cinematic VR film production, volumetric capture is the future, and Lytro is on the bleeding edge.”

In that article I talked a lot about the tech, but I couldn’t say much about the experience that I saw which left me so impressed. Hallelujah was that experience, and now I can talk about it.

Created by VR film studio Within, Hallelujah pairs Immerge’s volumetric capture capabilities with finely mixed spatial audio that forms the foundation for a stunning performance of Leonard Cohen’s 1984 song Hallelujah (perhaps most famously performed by Jeff Buckley). Lytro has provided a great behind-the-scenes look at the production here:

In Hallelujah, singer Bobby Halvorson starts as the solo lead of the song directly in front of the viewer against a pitch black background. As the only object in the scene, it’s easy to feel the volume and shape of the singer thanks to the volumetric capture. As you lean left and right, you see the sides of Halvorson’s face in a way that would be utterly impossible with traditional 360 capture techniques. With that sense of depth and parallax comes a feeling that the singer is really there right in front of you.

The entire song is sung a cappella with no instrumentation, and copies of Halvorson are duplicated to the left, right, and behind the viewer, singing accompanying tracks.

Photo courtesy Lytro

There you are, in a void of blackness, with Halvorsons at arm’s length on all sides, singing a stirring version of the uplifting song. You can look all around you to see each version of Halvorson singing a different track as he creates all the notes of the song. As you turn your head, the careful audio mixing becomes apparent; each track accurately sounds like it’s coming from its respective singer. This excellently mixed spatial audio significantly enhanced the sense of Presence for me; Halvorson doesn’t just look and feel like he’s there in front of you, he also sounds like it.

As the song progresses, you may notice a distinct and fitting reverb in Halvorson’s voice. It isn’t a digitally applied effect though; the black void around him eventually fades away and you find yourself in the midst of a beautifully adorned church. Behind the singer is a full church choir that joins in the song all at once, adding a swell of new voices to Halvorson’s self-accompanied solo.

Photo courtesy Lytro

Turning around to explore this newly unveiled environment, you can see the ceiling, walls, and windows in detail; behind you are rows of empty pews, flanked by huge columns supporting arches that run from the back of the church to the front. Returning your gaze forward, the song reaches its climax and eventual conclusion.

Lytro’s Immerge camera | Photo by Road to VR

The whole thing lasts only the length of the song (about 5 minutes), but the stirring performance feels like it was done for you alone, one which feels uniquely immersive thanks to some special technology and a carefully planned, well-executed production.

Hallelujah is the kind of experience that is likely to be used for a long while as a demo not only to show the benefits of volumetric capture, but also to stir the imaginations of other creators working in the medium of VR film. Sadly, there’s no word yet as to when (or on what platforms) the experience will launch to the public.

The post Now Available – Light-field Captured ‘Hallelujah’ is a Stunning Mix of Volumetric Film and Audio appeared first on Road to VR.

Lytro’s Latest VR Light-field Camera is Huge, and Hugely Improved

In the last few years, Lytro has made a major pivot away from consumer-facing digital camera products and toward high-end production cameras and tools, with a major part of the company’s focus on the ‘Immerge’ light-field camera for VR. In February, Lytro announced it had raised another $60 million to continue developing the tech. I recently stopped by the company’s offices to see the latest version of the camera and the major improvements in capture quality that come with it.

The first piece of content captured with an Immerge prototype was the ‘Moon’ experience which Lytro revealed back in August of 2016. This was a benchmark moment for the company, a test of what the Immerge camera could do:

Now, to quickly familiarize yourself with what makes a light-field camera special for VR, the important thing to understand is that light-field cameras shoot volumetric video. So while the basic cameras of a 360-degree video rig output flat frames of the scene, a light-field camera essentially captures enough data to recreate the scene as complete 3D geometry as seen from within a certain volume. The major advantage is the ability to play the scene back through a VR headset with truly accurate stereo and allow the viewer to have proper positional tracking inside the video, both of which result in a much more immersive experience, or what we recently called “the future of VR video.” There are also more advantages of light-field capture that will come later down the road when we start seeing headsets equipped with light-field displays… but that’s for another day.

Lytro’s older Immerge prototype, note that many of the optical elements have been removed | Photo by Road to VR

So, the Moon experience captured with Lytro’s early Immerge prototype did achieve all those great things that light-field is known for, but it wasn’t good enough just yet. It’s hard to tell unless you’re seeing it through a VR headset, but the Moon capture had two notable issues: 1) it had a very limited capture volume (meaning the space around which your head can freely move while keeping the image intact), and 2) the fidelity wasn’t there yet; static objects looked great, but moving actors and objects in the scene exhibited grainy outlines.

So Lytro took what they learned from the Moon shoot, went back to the drawing board, and created a totally new Immerge prototype, which solved those problems so effectively that the company now proudly says their camera is “production ready” (no joke: scroll to the bottom of this page on their website and you can submit a request to shoot with the camera).

Photo courtesy Lytro

The new, physically larger Immerge prototype brings a larger capture volume, which means the viewer has more freedom of movement inside the capture. And the higher quality cameras provide more data, allowing for greater capture and playback fidelity. The latest Immerge camera is significantly larger than the prototype that captured the Moon experience, by about four times. It features a whopping 95-element planar light-field array with a 90-degree field of view. Those 95 elements are larger than on the precursor too, capturing higher quality data.

I got to see a brand new production captured with the latest Immerge camera, and while I can’t talk much about the content (or unfortunately show any of it), I can talk about the leap in quality.

Photo by Road to VR

The move from Moon to this new production is substantial. Not only does the apparent resolution feel higher (leading to sharper ‘textures’), but the depth information is more precise which has largely eliminated the grainy outlines around non-static scene elements. That improved depth data has something of a double-bonus on visual quality, because sharper captures enhance the stereoscopic effect by creating better edge contrast.

Do you recall early renders of a spherical Immerge camera? Purportedly based on feedback from early productions using the spherical approach, the company decided to switch to a flat (planar) capture design. With this approach, capturing a 360 degree view requires the camera to be rotated to individually shoot each side of an eventual pentagonal capture volume. This sounds harder than capturing the scene all at once in 360, but Lytro says it’s easier for the production process.

The size of the capture volume has been increased significantly over Moon, though it can still feel limiting at this size. While you’re well covered for any reasonable movements you’d make while your butt is planted in a chair, if you take a large step in any direction, you’ll still leave the capture volume (causing the scene to fade to black until you step back inside).

And, although this has little to do with the camera, the experience I saw featured incredibly well-mixed spatial audio which sold the depth and directionality of the light-field capture in which I was standing. I was left very impressed with what Immerge is now capable of capturing.

The new camera is impressive, but the magic is not just in the hardware, it’s also in the software. Lytro is developing custom tools to fuse all the captured information into a coherent form for dynamic playback, and to aid production and post-production staff along the way. The company doesn’t succeed just by making a great light-field camera; they’re responsible for creating a complete and practical pipeline that actually delivers value to those who want to shoot VR content. Light-field capture provides a great many benefits, but needs to be easy to use at production scale, something that Lytro is focusing on just as heavily as the hardware itself.

All-in-all, seeing Lytro’s latest work with Immerge has further convinced me that today’s de facto 360-degree film capture is a stopgap. When it comes to cinematic VR film production, volumetric capture is the future, and Lytro is on the bleeding edge.

The post Lytro’s Latest VR Light-field Camera is Huge, and Hugely Improved appeared first on Road to VR.

Lytro Gets $60M More for Light Field VR Capture, First Content Coming in Q2

Lytro have announced another hefty new wedge of funding, with a $60 million series D round led by Blue Pool. What’s more, they’ve partnered with Within, and the first 360 degree 3D light field content is now set to arrive from that partnership in Q2 of this year.

There are so many technologies that the advent of accessible virtual reality has encouraged to evolve. But few excite me as much as the potential for light field ‘video’. As a movie enthusiast, the idea that motion pictures can now be captured both in 3D and allow the viewer to ‘peek’ in, out and around upon viewing blows my mind. Lytro promise to deliver VR film with six degrees of freedom and parallax at a potential resolution “greater than 6K per eye”.

SEE ALSO
Lytro's 'Immerge' 360 3D Light-field Pipeline is Poised to Redefine VR Video

We wrote in 2015 about Lytro‘s potentially groundbreaking Immerge system, then a gargantuan domed array of light field sensor slices that capture absurd amounts of data about the light it sees, all in 360 degrees. And, as the angle and source of the light is captured, the data recorded can be used to recreate the camera’s surroundings in three dimensions too. Clearly the potential for immersive movie making with Lytro’s new kit is immense and a perfect fit for virtual reality viewing.

Now, Lytro have announced that in addition to the $50M in funding they netted in 2015 to develop Immerge, they’ve just received a further $60M in series D funding, in a round led by Blue Pool Capital to continue refining Immerge and, perhaps as importantly, producing content with it.

“We believe that Asia in general and China in particular represent hugely important markets for VR and cinematic content over the next five years,” said Jason Rosenthal, CEO of Lytro. “A key goal of this capital raise was to assemble a group of trusted capital partners to help us best understand and navigate this new market.”

On the content front, Lytro are also announcing today that they’ve formed a partnership with creative house and content platform Within (formerly Vrse), co-founded by one of the few directors out there to have already made a name for themselves in the embryonic medium of VR film, Chris Milk. The first production from the new partnership has already wrapped, and is currently in post-production. According to a press release from Lytro, they’re planning to launch this new content at some point in Q2 2017 – that’s not long at all.

SEE ALSO
Within and Fox Partner to Create VR Content, Spike Jonze to Co-Produce Original VR Film

So what else has the Lytro team been up to since we last heard from them? Well, there’s been a fairly major change to the form factor and nature of the Immerge camera. Instead of the incredibly ambitious 360 ‘capture all angles at once’ system featured previously, the company have instead pivoted to a ‘planar’ (in other words, front-facing only) camera system. Lytro claim this change was made in response to feedback from their creative partners, allowing for more traditional ‘behind the camera’ director / talent collaboration (not an option with 360 filming) and tighter control over the filmed volume. To be clear though, the system will still offer 360 capture, but instead of capturing all at once, the system can be rotated, filming those different angles one at a time.

Whilst this does sound like an almighty pain, because the Immerge is dealing with light fields, it should be much easier to seamlessly blend each of those views than with a conventional spherical camera array. So, whilst it’s not quite as neat and impressive as the company’s original vision, we still get high resolution light field films which can be adjusted for different IPDs, allow for parallax, and offer some freedom of movement within the captured volume. In short, it’s still pretty bloody cool!

There is still a question however, and quite an important one. Even with ‘downscaled’ versions of the assembled films, there’s a lot of data required to deliver these experiences to a VR headset in the home. Lytro previously spoke about proprietary streaming software which downloaded data only for the portion of the movie you were looking at, but there were no further details on this kind of viewer in their latest press release. We’ll be following up on this.
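A toy model of that idea, viewport-dependent streaming, might look like the sketch below. The tiling scheme and function are purely illustrative assumptions; Lytro has not detailed its actual protocol. The idea is simply that only the slices of the 360-degree capture that intersect the current view frustum need to be fetched:

```python
def visible_tiles(yaw_deg: float, fov_deg: float, n_tiles: int = 12) -> list:
    """Return indices of equal-width yaw tiles of a 360-degree capture
    that intersect the current view frustum, so only those are streamed."""
    tile_width = 360 / n_tiles
    half_fov = fov_deg / 2
    tiles = []
    for t in range(n_tiles):
        centre = t * tile_width + tile_width / 2
        # smallest angular distance between the tile centre and the view yaw
        diff = abs((centre - yaw_deg + 180) % 360 - 180)
        if diff <= half_fov + tile_width / 2:
            tiles.append(t)
    return tiles

# Looking straight ahead with a 90-degree FOV needs only 4 of 12 tiles:
print(visible_tiles(0, 90))   # [0, 1, 10, 11]
```

In a real system the tiles would also be cut by pitch and by distance, and prefetched ahead of head motion, but the bandwidth win comes from the same principle: most of the sphere is never downloaded.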

The long wait for Lytro’s potentially groundbreaking form of VR video capture seems almost to be over, and although there are still questions about how they’ll get it all to our faces, I’m more excited than ever to see the results in action for myself.

Introduction to Light-fields

Light-field photography differs from traditional photography in that it captures much more information about the light passing through its volume (i.e. the lens or sensor). Whereas a standard digital camera will capture light as it hits the sensor, statically and entirely in two dimensions, a light-field camera captures data about which direction the light emanated from and at what distance.

The practical upshot of this is that, as a light-field camera captures information on all light passing into its volume (the size of the camera sensor itself), once captured you can refocus to any level within that scene (within certain limits). Make the camera’s volume large enough, and you have enough information about that scene to allow for positional tracking in the view; that is, you can shift your view within the captured scene left or right, up or down, allowing you to ‘peek’ behind objects in the scene.
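That refocus-after-capture ability is commonly implemented with the textbook ‘shift-and-add’ technique: each sub-aperture view is shifted in proportion to its offset from the array centre, then all views are averaged, with the shift amount selecting the focal plane. The sketch below is a generic illustration of that technique, not Lytro’s actual pipeline:

```python
import numpy as np

def refocus(subviews: np.ndarray, alpha: float) -> np.ndarray:
    """Synthetic refocus over a (U, V, H, W) grid of sub-aperture images.
    alpha picks the focal plane: 0 keeps the nominal focus, while larger
    magnitudes shear the light-field to focus nearer or farther."""
    U, V, H, W = subviews.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # shift each view in proportion to its offset from the centre
            du = int(round(alpha * (u - cu)))
            dv = int(round(alpha * (v - cv)))
            out += np.roll(subviews[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)
```

Scene points on the chosen focal plane line up across the shifted views and stay sharp; everything off that plane is averaged at mismatched positions and blurs, which is exactly the depth-of-field control described above.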

 

The post Lytro Gets $60M More for Light Field VR Capture, First Content Coming in Q2 appeared first on Road to VR.

Light Field Capture Company Lytro Raises $60 Million, Working With Within

Though still experimental in nature, light field capture technology presents one of the most promising solutions for delivering photorealistic VR experiences, and one company working with the tech just raised a lot of money to help push it forward.

Lytro, a company working on a light field solution for cinematic VR, today announced that it raised $60 million in a series D round of funding, led by Blue Pool Capital.

The company’s solution, dubbed Lytro Immerge, was announced last year, and offers an end-to-end means of delivering light field captured VR experiences. That starts with its proprietary camera rig that captures information from light rays in a scene including colors and where the rays bounce. With this information, the system is able to capture much more than a 360-degree video. The data captured by Lytro would let you move your head around inside photorealistic environments. That’s something that you’re not able to do in current live-action VR experiences.

Captured content can be edited with existing tools and then viewed via streaming, removing the need for huge downloads of data on a PC or mobile headset. The result would be high-end VR content like the company’s Moon video, which we saw last year.

Since revealing Immerge, Lytro has been working with creative teams to showcase what it can do. Today, it’s announcing a partnership with Within, one of the better known 360 degree video platforms. The pair have completed production on one of the first full experiences to use the technology, which will be debuting in Q2 of this year.

Lytro isn’t alone in this field as a number of companies attempt to figure out how to efficiently capture, render and display light fields, including OTOY.

Though it is still a ways off from mainstream adoption, light field capture could be one of the most important technologies in VR’s future. Combining this type of content with the standalone headsets that are on the horizon could create incredibly immersive experiences that aren’t connected to PCs.

Lytro Shows First Light Field Footage Captured with Immerge VR Camera

Back toward the end of 2015, light field camera company Lytro announced a major turn toward the VR market with the introduction of ‘Immerge’, a light field camera made for capturing data which can be played back as VR video with positional tracking. Now the company is showing the first footage shot with the camera.

Lytro has made point-and-shoot consumer light field cameras since 2012. And while the company has had some success in the static photo market, the potential market for the application of light field capture has pulled the company into VR in a big way.

See Also: Lytro’s ‘Immerge’ 360 3D Light-field Pipeline is Poised to Redefine VR Video

Immerge, a 360 degree light field camera in the works by Lytro, captures incoming light from all directions. With not only the color of the light but also its direction, the camera is capable of capturing data representing a stitch-free snippet of the real world, and (uniquely compared to other 360 degree cameras) the data which is captured allows for positional tracking of the user’s head (the ability to move your head through 3D space, i.e. ‘parallax’, and have the scene react accurately).

 

This ability is one of the major advantages over standard film capture, and is seen as critical for immersion and comfort in VR experiences. Now, Lytro is showing off the first light field footage shot by their Immerge camera; they say it’s the “first piece of 6DOF 360 live action VR content ever produced.”

Light field captures from Lytro’s camera also have a few other tricks, like the ability to change the IPD (distance between the stereo images, to align with each user’s eyes) and focus as needed in post-production.

The company says that Immerge’s light field data captures scenes not only with parallax, but also with view-dependent lighting (reflections that move correctly based on your head position), and truly correct stereo which works no matter the orientation of your head. Traditional 360 degree camera systems have issues showing stereoscopic content when the viewer tilts their head in certain directions, while Immerge’s light field captures retain proper stereo no matter the orientation of the head, Lytro says.

See Also: 8i are Generating Light Fields from Standard Video Cameras for VR Video

According to Lytro’s VP of Engineering, Tim Milliron, Immerge can render up to 8k per eye resolution, synthesizing the view from hundreds of constituent sub-cameras. Milliron says the company expects content creators to use Immerge’s light field captures like a high quality master file, from which a high-end 6DOF-capable experience could be distributed in app-form to desktop VR headsets, or other more basic 360 video files could be rendered for uploading and playback through traditional means.

Last year, Lytro raised a $50 million investment to pursue their VR interests. While the company initially expected to have Immerge ready in the first half of 2016, it’s just now in Q3 that we’re seeing the first test footage shot with the device. Felix & Paul Studios, Within (formerly ‘Vrse’), and Wevr were initially said to be among the first companies outside of Lytro to get access to the camera to begin prototyping content. The company is also accepting applications for access to the prototype camera on the official Immerge website.

The post Lytro Shows First Light Field Footage Captured with Immerge VR Camera appeared first on Road to VR.