Building Empathy: ‘Under the Net’ Malaria Documentary Shows the Art and Power of VR


One of the surprising results of accessible virtual reality has been the rise of documentaries that use 360-video to transport you to a place and situation unlike your own, and that in turn makes empathy for the people there more accessible.

“In America, and the developed world, we take for granted being able to go to bed at night safely, without the fear of a deadly disease attacking us while we sleep. Making this film really opened my eyes,” Justin Perkinson, writer and director of Under the Net, told Upload VR in an interview.

At a 10th anniversary event, the Nothing But Nets campaign from the UN Foundation premiered the VR documentary Under the Net. The campaign is focused on providing protective bed nets to families in Africa, where mosquitoes transmit malaria at night. The disease can be lethal to children.

The origin of the charity is not what you would expect. Sports journalist Rick Reilly, who has a long history of writing stories for Sports Illustrated and ESPN, learned about the threat of malaria to the children of Africa and realized he could make a difference.

“There was a documentary on malaria. It said that every 30 seconds a child was dying of malaria. And it said there was such a simple fix: hang a net over them. The mosquitoes only come out from midnight to four. And if you can get a kid to six or seven years old, they are probably not going to die of malaria. I was like, ‘That’s all we need? A net? Nothing has to be cured?’ This is a no-brainer charitable act,” said Reilly.

He wrote a column about it in SI and worked with the UN to set up $10 donations, each of which would buy a net for a family. The campaign raised $100,000 in the first week alone. Soon he was working with others, and Nothing But Nets became an ongoing campaign.

Basketball star Stephen Curry became involved, and he and Reilly went to Tanzania together to meet those who needed the nets. Beyond donating himself, the star helped get the word out about the charity.

Reilly explained, “Steph gives three nets every time he makes a three pointer. That’s kind of like your 13-year-old daughter giving three nets every time she checks her phone. He makes that many threes. I made the mistake this year of giving $10 for every three he makes and he’s killing me! He took the three-point record from 286 to 402. That’s insane.”

In the 10 years since NBN began taking donations, along with the work of UNICEF, Malaria No More, the Bill & Melinda Gates Foundation, and others, deaths from malaria have dropped by 60%. NBN has raised over $60 million, providing nearly 10 million nets to families. But even after all of that progress, Under the Net reminds the audience that a child still dies every two minutes from the disease.

If star power worked to spread the word about malaria and the work of Nothing But Nets, how much could tech power help? Working with Samsung and the Discovery Channel, the team at NBN returned to Tanzania last summer with Perkinson to capture the reality there in a way that couldn’t be done when the charity began.

Crafting a documentary in virtual reality can be a powerful thing. Anyone who has seen Clouds Over Sidra or Notes on Blindness can attest to that. Even with a brief 8-minute running time, making such a VR doc takes a lot of work, a lot of planning, and a fair amount of luck.

Perkinson and the team from production company Secret Location scouted locations and spoke with the people, all within the constraints presented by daily life there. After all, they were in Nyarugusu Refugee Camp, the largest refugee camp in Tanzania; with more than 130,000 people living there, it is the fifth largest in the world.

“We were in Tanzania for a week and a half, but we only had five days of access to the refugee camp. And the first day we spent location scouting,” said Perkinson. “All the planning you put in informs what you are doing, but then when it’s a living and breathing organism, which is a refugee camp, things change minute by minute.”

The result is a visually striking film. It starts with a swamp buzzing with insects. You soon see the camp and the people taking refuge there. You see the harsh conditions of living there. You visit an under-staffed hospital. And then you see how the nets make a difference. It is all filmed in a way that puts you there, living in the huts, riding a truck to the hospital, waiting in line to get the life-saving net.

Under the Net has a sharp and vibrant picture. The assembly of the 360-degree image is nearly flawless, with no sign of the seams where the footage was stitched together in post-production. Years of work by others on 360-video have helped mature this kind of production, both artistically and technically. The result is not only interesting and captivating to look at, but easy to watch and absorb.

“I want to go back. Something about VR, it makes me feel like I’m right there again, you know? I want to see the kids again. For a situation that is so terrible, they were so fun and hopeful,” said Reilly.

To make it a powerful story and not just another VR tech demo, Perkinson and his team had to find the visuals they needed, the places and situations to document, and even the person whose story would make the plight of malaria approachable.

“We knew we wanted a story that was driven by a child, that could be the voice to carry a Western audience. We spoke to several different kids, all of whom were special, but for various reasons they weren’t the right person,” said Perkinson. “We got to the end of the day, and then we met Amisa. She was the last one we saw. She had presence and just a special spirit. She had just had malaria a couple of months prior, her two siblings were exhibiting symptoms, and she wanted to be a nurse. You couldn’t write a story better than that.”

So the documentary follows Amisa, as we see her life there at camp, the ups and downs. We are there with her, witnessing what she goes through. But halfway through, Perkinson shifts the story’s tone from grim to joyous. Amisa and her family have the nets. They are in a camp. They are safe. She is going back to school. She shows all the resilience that children somehow find in the worst situations. You are left feeling that these people can be helped, that this work does make a difference.

And that is what Under the Net is ultimately about. Like other documentaries, it aims to inform, to entertain, and maybe to inspire action of some kind. When the short is released early next year, if you watch it on the Gear VR for its technical and cinematic qualities, it may inspire you to donate to NBN, or to do even more than that.

Reilly said, “I hope this video puts us out of business; we never have to fight malaria again. I hope VR gets that big. ‘Oh! Who can forget the groundbreaking Justin Perkinson video?!’ ”


You can find more information about Nothing But Nets and the Under The Net documentary at the official websites.


NextVR Partners With Live Nation To Stream Thievery Corporation Concert In VR


Sporting and musical events are huge, engaging, and exciting draws for audiences, and the best seats at these events command premium prices. Now, though, front-row seats to those events may have legitimate competition by way of virtual reality.

NextVR is a well-known name in live broadcasts of sporting events in virtual reality, having recently established a consistent VR streaming schedule for the NBA. Now the company is moving to a different type of live event: NextVR is partnering with Live Nation to broadcast a live performance by the artist and DJ collective Thievery Corporation as they perform in Atlanta, Georgia.

Live Nation is a live-event company based out of California. The company merged with Ticketmaster back in 2010 and signs artists to different deals as more of a promoter than an actual music label, while also being a hub for ticket purchases for a vast collection of venues. Coming during Thievery Corporation’s 20th-anniversary tour, NextVR will be providing arguably definitive views of the show from the front row, behind the scenes, and even on stage.

Thievery Corporation produces a sound with a wealth of influences, from acid jazz to electronica, and comprises a DJ duo and collection of supporting artists. The live performance takes place in Atlanta, GA at the Tabernacle venue on Dec. 7th at 9pm EST, and it can be streamed in VR through the NextVR application available on Gear VR.

Depending on how well this partnership goes, Live Nation may have to introduce another section on their website catering to 360-degree media and virtual interaction with the concerts. The NBA and NFL have welcomed VR into their programs often by way of their cable subscription services, so it looks like there’s enough of an impact to continue including it in future plans. Maybe we’ll start to see VR ticket packages for the definitive view of these events.


Upload and Nokia OZO Team Up For 360 Filmmaking & Live Streaming Masterclasses


360 videos are hot commodities right now. Everyone from Facebook to YouTube to Snapchat is tripping over themselves to integrate spherical video into their platforms. There’s a boom happening in demand for 360 content and the hardware to make it happen, but what about the techniques?

360 filmmaking is an infant in the world of cinematography. There are so many questions that need to be answered before a 360 video can become a good 360 video. Where should your subjects stand? Where should the action take place? How can you direct a scene for an audience with the power to look wherever they wish?

Another rapidly emerging branch of 360 filmmaking is live streaming events – specifically music festivals in VR. The Nokia OZO camera was built with live-streaming in mind. With their new OZO Live offering, the OZO team has taken it to events around the world, including recently at the Austin City Limits Music Festival. What’s involved in a 360 live streaming production? How does one tackle technical problems that arise in the heat of the action?

These are all hurdles that burgeoning 360 artists need to overcome in order to reach their goal of true immersive excellence. That’s where we come in.

Upload is partnering with Nokia OZO to put on two masterclasses:
“360 Filmmaking Best Practices” and “How to Live Stream a Concert in VR”. This December 13th and 15th at 7PM PST we will stream two live educational sessions via the Nokia OZO on YouTube 360. The classes will be taught by Alex Henning and Juan Santillan — two highly accomplished, cutting-edge content creators — and will be distributed by UploadVR.

December 13th @ 7PM PST – OZO Masterclass #1, 360 Filmmaking Best Practices, by Alex Henning, Executive Producer of The Argos File

December 15th @ 7PM PST – OZO Masterclass #2, How to Live Stream a Concert in VR, by Juan Santillan, CEO & Founder of Vantage.tv

To watch, you must register online. Links to YouTube 360 and viewing instructions will be sent to those who register.

These events are your first step toward becoming a professional 360 video creator. See you in class!

About your educators:

Alex Henning 

Alex Henning is the Co-Founder of Magnopus. He’s an Oscar-winning visual effects supervisor who won for Best Achievement in Visual Effects in 2012 on Martin Scorsese’s Hugo. His work has also earned awards from the Visual Effects Society and the International Press Academy, as well as a BAFTA nomination.

A hands-on expert in the art and science of synthetic images, Alex has created and supervised visual effects on a variety of projects including feature films, television, commercials, music videos, and theme park rides. His film credits include Star Trek Into Darkness, After Earth, Alice in Wonderland, Shutter Island, The Golden Compass, Superman Returns, Sin City, and The Day After Tomorrow.

A native Californian, Henning was born and raised in Berkeley, and holds a degree in Art from the University of California at Santa Cruz. His resume includes engagements at many prestigious visual effects companies, including Digital Domain, Pixomondo, The Syndicate and The Orphanage. Alex’s work has taken him around the globe for filming and post-production, in addition to being a guest and keynote speaker at several industry events. Henning has lived and worked in Europe, Asia and South America.

Juan Santillan 

Juan Santillan is the CEO and Co-Founder of Vantage.tv, a platform for virtual reality live events and fans. Vantage.tv empowers artists and events to expand their reach, using its technology to create virtual reality events with engaging social VR features for fans, plus turnkey VR distribution and monetization tools. Prior to Vantage.tv, Juan accumulated over eight years of experience creating software and content for immersive experiences in sports and entertainment, including serving as executive producer of immersive projects for NASCAR, Wimbledon, the PGA Tour, The Black Eyed Peas, and AT&T events.

Founded in 2015, Vantage.tv has already recorded and created in-venue live VR broadcasts for more than 120 performances, including such iconic music festivals as Outside Lands, Coachella, Lollapalooza, and Austin City Limits, providing fans with the best seat in the house.

This post is sponsored by Nokia.


Dizzying ‘Assassin’s Creed’ VR experience drops you into the Spanish Inquisition

20th Century Fox is bringing the Assassin’s Creed experience full circle. With the new Michael Fassbender film debuting Dec. 21, Practical Magic has created an original story for Oculus Rift users that borrows from many of the elements that made the franchise a global hit.

The post Dizzying ‘Assassin’s Creed’ VR experience drops you into the Spanish Inquisition appeared first on Digital Trends.

Visbit Raises $3.2 Million To Build Out 4K VR Platform


Visbit just raised $3.2 million in seed funding with participation from Presence Capital, ZhenFund, Colopl Next, Amino Capital and Eversunny Limited. Changyin Zhou, CEO and co-founder of Visbit, told UploadVR the money will be invested to complete the closed beta of its patented Visbit View-Optimized Streaming (VVOS) technology.

Zhou said this technology is the first to stream and play near-zero latency 360-degree VR videos in 4K to 8K resolution over regular Wi-Fi and LTE for mobile – and eventually tethered VR headsets.

“As a fan of the movie The Matrix, I am a deep believer in the future of VR and have spent a decade studying and researching VR-related areas for my PhD in computational photography and computer graphics,” Zhou said. For VR streaming, “one fundamental problem is the huge data size, which poses tremendous challenges to processing, display and transmission. There was no practical, consumer-level solution to these problems even a few years ago due to hardware limitations.”
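Visbit has not published how its patented VVOS technology works, but viewport-adaptive streaming approaches in general attack the data-size problem Zhou describes by spending bandwidth only where the viewer is currently looking. Here is a minimal sketch of that general idea; the tile layout, quality tiers, and field-of-view threshold are all illustrative assumptions, not Visbit's design:

```python
# Sketch of generic viewport-adaptive 360-video streaming: the sphere is
# split into tiles, and only tiles near the viewer's gaze direction are
# fetched at high quality. (Illustrative only; not Visbit's VVOS.)

def tile_qualities(yaw_deg, tile_centers_deg, fov_deg=100):
    """Assign a quality tier to each tile based on its angular distance
    from the viewer's current yaw. Tiles inside the field of view get
    the highest tier; everything else gets a low-resolution fallback."""
    qualities = []
    for center in tile_centers_deg:
        # shortest angular distance around the 360-degree circle
        diff = abs((yaw_deg - center + 180) % 360 - 180)
        if diff <= fov_deg / 2:
            qualities.append("high")    # visible: stream at the 4K tier
        elif diff <= fov_deg:
            qualities.append("medium")  # near the edge: pre-fetch
        else:
            qualities.append("low")     # behind the viewer
    return qualities

# eight tiles spaced 45 degrees apart around the equator, viewer facing 0
centers = [i * 45 for i in range(8)]
print(tile_qualities(0, centers))
# → ['high', 'high', 'medium', 'low', 'low', 'low', 'medium', 'high']
```

The payoff is that only two or three of the eight tiles are ever streamed at full resolution at once, which is how a 4K-to-8K experience can fit through ordinary Wi-Fi or LTE bandwidth.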

With the proliferation of mobile VR platforms like Google Cardboard and the new Daydream View, as well as Samsung Gear VR and others, Zhou believes that now is the right time for VR to go mainstream. The former Google senior research scientist partnered with former TangoMe marketing manager Elaine Lu to co-found Visbit in 2015.

“Increased mobility is a powerful driver for society’s advancement and VR transportation is the next level of mobility,” Zhou said. “We have set out to revolutionize visual technology. Specifically, we now focus on developing the most efficient way to deliver huge VR data.”

Lu told UploadVR that Visbit’s main goal is to solve a huge roadblock that’s preventing VR from becoming mainstream.

“With most consumers’ initial experiences of VR being 360-degree video content, many are walking away unimpressed due to the inability for today’s networks to deliver high-quality streaming experiences,” Lu said. “The lack of engagement from the end-user side could potentially discourage content creators. This problem is critical to the whole industry. With no comprehensive solutions heading to the market, we have set out to solve this problem before more devices enter the market and more consumers experience a negative first impression.”

To put things in perspective, Zhou uses a standard vision chart of the kind found in any optometrist’s office. Today, on existing mobile VR platforms, most streamed videos are delivered at 1080p, which works out to about 6 pixels per degree, or 20/200 vision: very blurry.

“If empowered by our view-optimized streaming technology, with nothing else changed, users can immediately view the same videos at a 4K to 8K level, which is equal to 15 pixels to 30 pixels per degree, comparable to a vision of 20/80 to 20/40,” Zhou said.
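As a rough sanity check, pixels per degree for a monoscopic equirectangular 360 frame can be approximated by dividing the frame's horizontal resolution by the 360 degrees it has to cover. This is only an illustration of the magnitudes Zhou is describing, not Visbit's actual pipeline math; stereo layouts, cube maps, and tiling all change the numbers:

```python
# Back-of-envelope pixels-per-degree for an equirectangular 360 frame,
# where the full frame width spans 360 degrees of horizontal view.

def pixels_per_degree(horizontal_pixels, degrees=360):
    return horizontal_pixels / degrees

for label, width in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{label}: {pixels_per_degree(width):.1f} px/deg")
# → 1080p: 5.3 px/deg, 4K: 10.7 px/deg, 8K: 21.3 px/deg
```

These simple figures land in the same ballpark as the quoted 6 to 30 pixels per degree; the exact values depend on the projection and on how the stream allocates pixels across the sphere.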

Zhou also compared viewing a 360-degree video in VR at 1080p to watching a 480p resolution VCD from the old days, while viewing 360 in 4K is like watching it at 720p.

“If viewing 360 in 6K to 8K, users can get viewing experiences comparable to watching 1080p to 4K in regular video,” Zhou concluded.

“4K is the minimum for a quality 360 VR experience,” Lu said. “Today, the majority of consumers are already used to 720p Netflix-grade level of video quality when it comes to video entertainment. If we can’t stream 360 VR video in a minimum 4K level of quality, VR videos as a type of entertainment won’t gain mainstream acceptance.”

Besides VR filmmaking, Lu said 4K is also critical to VR applications. Many of the business sectors that Visbit’s streaming service supports involve consumer decisions based on their viewing experience. Being able to stream at 4K is critical to their decision-making process.

Lu pointed to one of the company’s closed beta partners, Variable Labs, which is working with AAUW (American Association of University Women) to develop a virtual reality application to help train people on salary negotiation. How real the user feels in the simulation impacts the training effectiveness, and streaming also becomes necessary with more and more training videos being added to the app.

“Another beta partner, Cloudwave, is building a travel-related app that includes VR videos to help generate travel leads,” Lu said. “How appealing the 360-degree destination videos are can highly impact the consumer’s purchasing decision. Similar situations happen to many other business sectors such as healthcare, education, real estate, shopping, etc.”

Lu referenced another unnamed closed beta company, which previously had a 300MB app that required users to download the videos to watch. This severely impacted their user acquisition effectiveness. With Visbit’s streaming service in place, their app size was reduced to that of a typical mobile app. Lu hopes to see this change impact the company’s user acquisition once the updated app is released to the market.

That’s the type of difference Zhou believes consumers will experience next year when this platform launches. Currently, great VR content has to go through a long pipeline before it arrives in consumers’ hands.

“It’s a barrel effect: the final user experience is determined by the worst part of the VR pipeline, which we see as the data transmission part,” Zhou said. “The launch of our service will greatly improve the 360 VR video watching experience for consumers. With higher resolution, less buffering and smoother rendering, you will finally be able to enjoy a sports game or a concert in VR and feel really present.”

Visbit’s technology is designed to be device-agnostic. Zhou said porting the solution from mobile to desktop VR, console VR or standalone VR shouldn’t be hard. The company has already solved the issues for the most challenging platform, mobile, which has limited computation power and battery.

“At the end of the day, we would like to make our service available across all major platforms, no matter if it is owned by Facebook, Google, Valve or Apple, and also across all devices, whether they are mobile, standalone, desktop or console,” Zhou said. “We believe this will offer great value to the industry: Publishers won’t have to keep different copies on the cloud, and consumers can enjoy the same great experience no matter where they go.”


How Nokia Broke Into Virtual Reality With Its Ozo Camera


Nokia has been searching for new businesses to break into ever since it retreated from the smartphone business. And after a few years of research, the Finnish company decided to move into virtual reality 360-degree capture cameras.

The company launched its groundbreaking Ozo in March for $60,000, and then it cut the price to $45,000 in August. It is now shipping the devices in a number of markets, and it is rolling out software and services to stoke the fledgling market for VR cameras.

We talked with Guido Voltolina, head of presence capture for Ozo at Nokia Technologies, at the company’s research facility in Sunnyvale, California. Voltolina talked about the advantage the Ozo has in capturing and processing a lot of data at once, and he talked about the company’s plans for expansion in VR.

Here’s an edited transcript of our interview.

VentureBeat: Tell me why you moved into making the Ozo VR cameras.

Guido Voltolina: The whole project and division is called Presence Capture. The idea is that, as soon as we identified VR was coming — this was before the Oculus acquisition by Facebook — it was clear that one part of VR would be computer-generated experiences, games being the major example. But as we looked at it, we said, “Wait a minute. If this is a new medium, there will be more than just computer-generated experiences. People will want to capture something — themselves, their life, things happening in the world.”

We had to look at what would be the device that could capture as much data as possible in order to reproduce the sense of presence that VR allows you to have when you’re fully immersed. As a subset of VR, you also have 2D 360 images. That’s still happening. But that’s almost a side effect of solving the major problem we have to solve, these full three-dimensional audiovisual experiences that reproduce the sense of “being there.”

The team started thinking about a device purpose-built for that task. Instead of duct-taping different existing cameras into a rig — many people have done that — we designed a device specifically for the job. The Ozo is not a good 2D camera, but it’s an excellent VR camera. The shape ended up being the same as a skull, very similar dimensions, with the same interocular distance as a human being. It has eight cameras, and the distance is very close, with a huge overlap in the lenses’ fields of view. We’re capturing two layers of pixels to feed the right and left eye with the exact interocular distance you’d have yourself. Many rigs have a much wider distance. That creates a problem with objects that are very close to you in VR. The disparity is too great.

With this solution, we then integrated eight microphones, so the spatial audio is captured as the scene is happening. When I’m talking to you here, I have no reason to turn around. In most cases, the only reason we’d turn around is if we heard a loud sound, say from over in that corner. We’re very good at turning exactly at the angle that we thought the sound was coming from, even though we don’t have eyes in the back of our heads. Our ears are very good at perceiving the direction of sound. We integrated both 3D audio and 3D video because the full immersive experience needs both. We’re rarely moved to look around by an object moving around us. The best cue is always sound.

The way 2D movies tell you a story, they know you’re looking at the screen, and they can cut to a different image on the screen as they go, or zoom in and out as a conversation goes back and forth. In VR the audio is the part that has to make you turn to look at someone else or something else.

The concept is capturing live events. People can go to a place that’s normally not accessible to them for whatever reasons — financial reasons, distance, or maybe it doesn’t exist anymore. If something goes crazy and the pyramids in Egypt are destroyed, we’ll never see them again. But if there’s a VR experience of the pyramids, it would be like walking around and seeing the real thing. You can think of it like a time machine aimed at the past. You capture events and then you can go back and revisit them. In 20 years your son or daughter could revisit today’s Thanksgiving dinner, exactly as you did.

VB: Why is this a good move?

Voltolina: It’s very similar to what happened with pictures and video. The first black and white photographs were only accessible to a few. Wealthy people would have family pictures once a year. Now we all have a high-resolution camera in our phones. Video came along and people would hire someone to film a wedding, maybe. Then VHS and digital cameras arrived. But the one doesn’t replace the other. Pictures didn’t replace words and video didn’t replace pictures. We still text. We still share pictures. We still post YouTube videos. Different media for different things.

VR is just another medium. Being a new medium, we focus on how to capture real life in VR. With that, we also have to consider the technology related to carrying and distributing data for playback. After the Ozo we created Ozo Live and Ozo Player. These are software packages we license to companies in order for them to build their own VR players with a higher quality, or to live stream the signal that’s captured by multiple Ozo cameras.

We were at the Austin City Limits concert, for example. A production company there had, I believe, eight Ozos distributed in various positions around the stage. It’s not just one camera. That’s what we were trying at the beginning — the front-row experience, which is great — but I want to go to places I can’t normally access, right? I want to be on stage up there next to Mick Jagger or whoever. I can squeeze thousands of people up there next to him now. In real life, you just couldn’t do that, no matter how much you pay.

VB: How does it differ from the other 360 cameras out there? Facebook showed off a design for one as well.

Voltolina: The majority of the solutions you see announced are a combination of multiple camera modules. Either they have SSD cards or cables. But there’s one SSD card or one cable per camera. If a camera has 25 modules you’ll have 25 SSD cards. When you shoot, you don’t really see what you’re shooting through the camera. Then you have to export all the data, stitch it together, and see what comes out.

One of the big differences with Ozo is that, yes, there are eight cameras synchronized together, but we created a brain that takes all this data and combines it in real time. Ozo’s output is one single cable going into either your storage or a head-mounted display. You can visualize what the camera is seeing and direct from its point of view in real time. It’s like a normal viewfinder. For VR cameras, being able to see what the camera is shooting in real time is a key differentiator.

The other key characteristic is that it can operate as a self-contained device with a battery and just one internal SSD card. You can mount it on a drone, on a car, in different situations where you need flexibility and the size has to be compact. It’s about the size of a human head. The unobtrusive design is a big advantage. Some of these rigs with 16 or 25 cameras become quite invasive.

If you want to capture multiple points of view — let’s say you have a rig with 16 cameras, even small ones like GoPros. What if you need seven of those? What if you need to assemble a hundred and some cameras? One of them might malfunction or fail to synchronize or something. Once you start demanding large numbers of cameras, the delta becomes significant.

VB: How much does each one cost?

Voltolina: It’s $45,000. The development of components that didn’t already exist is what pushes up the price. Cameras, since day one, have been thought of as one lens and one sensor. All the components related to the electronics around it assume you’ll have very high resolution, but only one sensor. When you combine eight together, the SOC that has to coordinate all the sensors — that component didn’t exist. We were forced to create an FPGA to do something that hadn’t been done before. You’re synchronizing all those 2K by 2K sensors at 30 frames per second. The data rate is significant. There were no components that could encode, in real time, all eight streams in one SOC produced at affordable volumes.
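A back-of-envelope calculation shows why that data rate is significant. Assuming a 10-bit readout per pixel (Nokia has not published the exact bit depth, so this is an assumption), eight 2K-by-2K sensors at 30 frames per second produce roughly:

```python
# Rough raw data rate for the sensor array Voltolina describes:
# eight synchronized 2048x2048 sensors at 30 fps.
# The 10-bit-per-pixel readout is an assumed figure, not a published spec.

def raw_data_rate_gbps(sensors=8, width=2048, height=2048,
                       fps=30, bits_per_pixel=10):
    bits_per_second = sensors * width * height * fps * bits_per_pixel
    return bits_per_second / 1e9  # gigabits per second

print(f"{raw_data_rate_gbps():.1f} Gbit/s")
# → 10.1 Gbit/s
```

On the order of 10 gigabits per second of raw sensor data before any encoding, which makes it clearer why off-the-shelf single-sensor silicon couldn't do the job and a custom FPGA was needed to combine the streams in real time.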

Also, the sensor itself — we use a square sensor, which is the best geometry for capturing the fisheye lens. Most sensors are rectangular. That leaves a lot of sensor cells that won’t be used. We also needed all the images to be fully synchronized. I don’t know if you’re familiar with a rolling shutter versus a global shutter, but if you have multiple rolling shutters, the exposure to light is never 100 percent synchronized for all of them. It creates eight images that all have slight differences, and when you stitch them together they don’t match. We had to introduce a global shutter, which has a smaller market and costs more.

The lenses are custom made, because the geometry of the camera didn’t exist before. All the components are pretty much designed for this new purpose. The fact that we don’t already have millions of cameras on the market using them makes those components more expensive.

VB: If it comes out at that price, what kind of users are attracted to it at this point?

Voltolina: We started selling the Ozo in February of this year, in North America. At this point we’ve reached most of the rest of the world, including Europe and China. Our main customers are studios, big and small, that are already producing VR content. People who’ve already tried to make professional VR content or 360 video. When they see the Ozo, they understand the benefit immediately. It’s more expensive, but the savings in time during production — particularly in the stitching and post-production — is so significant that it pays off rapidly.

You can imagine that when you set up with actors on location, you’re spending the most money per hour that you ever will during a video production. With one camera you can see what’s happening and coordinate the actors accordingly — you move here, you move there — but if you have to shoot with another camera, you can’t review what’s happening until later. If you have to go back and shoot again, that’s a huge amount of money.

Currently VR experiences are mainly additional marketing for movies, things like that. They’re not movies in their own right. It’s the VR experience of Pete’s Dragon or Jungle Book, features like that. Or a lot of commercials, of course. Live events are big. We’re seeing people experiment with live events almost every week. We just finished up in China for the Strawberry Festival, a music festival, streaming that to several countries.

VB: What’s proving the most popular so far? Is it the live streams, or produced recorded content?

Voltolina: It always depends on who’s in the movie. [laughs] It’s not so much about whether it’s live or not. It’s about who’s the star. There was a movie opening for Disney that we streamed, Alice Through the Looking Glass. Pink, the singer, did a concert live for the premiere. Of course that drove a lot of viewers. But we also did a music video with OneRepublic for a new single called “Kids.” They released a 2D music video and then a VR experience. That wasn’t live, but on social media all the fans were really intrigued by the fact that they could watch the video, and then see it from a different direction in VR. Fans would watch it again and again as they discovered new things in the experience. The flow, the story of the whole thing is much more than just the band gathered around the camera.

Another one that was very popular was Pete’s Dragon. In the VR experience you really get to fly on a dragon. You can look around at the wings and the tail. The video itself is like an airplane ride. You’re flying over New Zealand. But the fact that you’re on top of a dragon — for a lot of fans that was a big attraction. It’s always a combination of subject and story. And of course, it helps if you have a major star.

VB: What’s the next step for you? Do you have a road map going forward?

Voltolina: The next steps go in two directions. One is toward a more complete solution. If we’re capturing more and more data, we need to efficiently carry that data to viewing devices, which are getting better and better. Last year was the year of Cardboard. We started seeing the first Oculus and HTC headsets this year. Now we have PlayStation VR, Daydream. Many more devices will be released with higher resolution and better performance. The highest level of immersion keeps going up.

Also, the number of people who are at least familiar with 2D 360 video is going up. That gives more of an incentive to go for immersive VR. The technology to enable Ozo Live or Ozo Player for better immersive playback, that’s currently our next step.

VB: So that’s increasing resolution?

Voltolina: Definitely resolution, but it’s quality in general. Resolution is always mentioned because it’s an easy way to describe better quality, but at some point resolution gets to where you can’t even distinguish. Better visual quality in general is coming, including the quality of stitching. That’s already improved tremendously. With the camera, you get the Ozo Creator software, which does 3D stitching. We’ve released three new versions with significant improvements each time.

Another area is production with multiple Ozos for live streaming. We’ll support a way of producing a VR experience that doesn’t just use one camera. It’ll incorporate commentary, different locations, and so on.

VB: Will you be able to bring it down in price?

Voltolina: We started at $60,000 when we first announced it. By summertime we adjusted to $45,000. The reason is, the first few months we were producing the earliest units and we weren’t sure we could scale manufacturing up to serve the whole world. We started with one region, North America, to see if the product would be well-received, and then if we could scale manufacturing. That happened in August. That’s how we were able to bring the price down. Since it’s professional equipment, a lot of rental houses are carrying it now, too, just like high-end cameras from Sony or Panasonic.

VB: Are the components going to move to an application-specific integrated circuit (ASIC) at some point, possibly? Do you think you’ll be able to reach economies of scale?

Voltolina: At some point, yes. The trade-off is always volume and time. As you know, for an ASIC to be efficient you need hundreds of thousands of units. You also need a product that doesn’t evolve very fast. If you look at digital cameras, yes, they’re evolving, but they’re not completely changing every generation. Maybe it was like that in the very beginning, but now the technology is stable, and the volume — hundreds of millions of phones with cameras are being produced every year now. The number of VR cameras in the marketplace today — we’re still at the very beginning. As soon as the economics justify the migration to SoCs or ASICs, that will happen.

VB: What’s a good way to measure growth? Can you measure how many hours of VR content are out there?

Voltolina: We keep track of three or four major areas. One is the installed base of the head-mounted displays. We include Cardboard in that, but we’re counting them in a separate category. Cardboard, you never know if someone’s using it. Maybe you give it to your kids and it ends up in the trash. But something like the Samsung Gear VR, we don’t know how much it gets used, but at least you have the capability. And when people start spending $500 or $700, those devices probably get used.

The installed base of high-performance head-mounted displays is important, then. Also, the amount of money going into VR productions. If it’s true that the majority of those experiences are for marketing purposes — marketing movies or products — marketing works if you have an audience. The bigger the audience, the more marketing is interested in addressing that audience. That’s another driving factor.

We also monitor the major VR content hubs, like the Oculus store, Little Star, Disney VR, and so on. How much is out there for you to watch? If you compare to a year ago the difference is astronomical. It’s gone from tens to hundreds, and pretty soon we’ll be hitting the thousands. A good chunk of it isn’t necessarily the most fantastic stuff, but you can see the quality level going up.

The quality of the top VR experiences is getting significantly better. I don’t know if you remember one of the first pieces of content that was popular, but it was a guy playing the piano with his dog. Everybody thought that was great at the time, but if you watch it now, you’ll be bored in less than a minute. So what? The new ones are real stories. It’s not just, “Oh, a 360 video.” Something is happening for you to follow.

A studio called Magnopus — they won an Oscar for Hugo — did a small experience called The Argos File. They won an award for it. That was shot with Ozo. It’s very much an action story, a crime drama. You see things through the eyes of the victim. It’s very fast. Watching that, you realize that this can be really intense. If you like the genre, it’s a fantastic experience.

VB: Does this look like it’s going to be a good business soon, or is it still more experimental?

Voltolina: We believe it’s going to be a good business. I don’t think we’re at a stage yet where it’s mature, by far. The best comparison I have is the brick phone, which a lot of people in the industry use. The first brick phones were fundamentally phones without wires. Comparing that to the iPhone, it’s night and day. The best you could do with that brick phone was dial a number and make a call, and the battery would last a couple of hours. Right now we’re at that stage.

Of course we’ll move rapidly to whatever will be the iPhone of VR, as far as integration and features and so on. But it’s hard to even imagine, to a certain extent. The concept that is fundamentally true is that — I grew up in a world where every picture, every video, was a rectangle. IMAX is still just a really big rectangle. For the new generation, with Minecraft and other games, but also now with VR video, the rectangle is just laid over 360. All the kids growing up now won’t understand why we were so limited before. Why did we never move that rectangle around?

For me, that concept is what will improve VR. Of course, we can’t stick with this form factor. Comparing a brick phone to an iPhone, the form factor is amazing. God knows what will transform this box, this mask on your head — the steps you need to take to go into VR right now are significant. But PlayStation VR is already a huge step forward. The setup is compatible with what’s already in your living room. It works immediately. A lot of the steps are getting smoother. That’s why we truly believe the industry is growing.

If we’re able to participate and remain a leader like we are now, well, that’s all up to us. We need to keep innovating and try things like Ozo. We don’t know everything yet. There’s some level of risk you have to take when you’re the first.

VB: The Sony Pictures deal is hopefully going to create more fresh content.

Voltolina: Absolutely, yeah. We did a deal with the Disney group, including all the studios owned by Disney — Marvel, Lucas, ABC, Disney Nature, there are 13 or 14 of them. That was already a huge step. And Sony Pictures also includes Sony Music. It’s going from movie features to TV shows to music videos. When we negotiate those agreements, we always look at a broader range of entertainment.

We also did a production with Warner Bros. for Major Crimes, the TV show. They used the Ozo. But that was for a specific show. When the deal is done with a group, it’s much better, because we can go for many different kinds of entertainment.

VB: Are you seeing anybody start to do longer video productions?

Voltolina: We’re seeing things stretch over multiple episodes. The guy who directed Grease, Randal Kleiser, he created a series called Defrost. It’s 10 episodes in VR. The story is that you’re living the point of view of a person who’s been hibernating, and then he’s thawed out. Your family meets you again, but you don’t remember them. All the acting is around you. You’re in a wheelchair and things happen as they push you through the hospital. Each episode is probably 15 minutes? I can see things going in that direction before they go for 60 or 90 minute features. Live events are already at 60 minutes, though.

VB: What are some of the other ideas people have had for it? I’ve seen views from the top of the arena in a basketball game.

Voltolina: We had a guy climbing Mount Everest with an Ozo. We didn’t even sponsor it. This guy just bought an Ozo and climbed Everest. He re-created the experience of being at camp one and camp two and so on. Of course you have sports, putting the Ozo in the front row, behind the basket, on a race car. People have gone to fantastic locations, like inside a volcano in Guatemala.

The difference compared to a normal documentary is that you have that freedom to look around. Of course, a story still has to be told. The experience has to be entertaining. If it’s just silent, it’s pretty boring. If you’re alone in the jungle, just walking around, after a while you lose interest. But if there’s somebody talking about things that you can look at, while you still have the freedom to look where you want, that’s much more entertaining.

For the news, you can imagine — if you’re in the middle of an event where even the journalist doesn’t know exactly what’s happening, having a document where you can look around without the limitation of a cameraman deciding where you can look, that’s huge. You can revisit an event and find so many things you never saw before. You have the whole set of data.

Red Bull is doing a lot of VR, of course, extreme sports and so on. Locations, news, live events, music — you name it. The VR experience isn’t a substitute for video, though, which is interesting. It’s complementary. Say you come to my house to watch a game. We’re watching the screen together, and then someone on social media says, “Check out the home team’s bench.” Then you can go in VR and see what’s happening there while we still watch the game on TV. It doesn’t always have to be a substitute.

VB: I like the way VR audio works, how audio draws you to a particular point of view.

Voltolina: Definitely. We believe that’s half of the feeling of being present, that it’s driven by audio. If you just use stereo audio, or a mix that’s not accurate, it’s not the same.

VB: I do wonder where the technology is going to settle. Some of the early cameras had something like 36 modules. Why not use that many? Is that necessarily better?

Voltolina: The compromise is always the amount of data versus the benefit you get from it. We have eight cameras with a lot of overlap to create two layers of pixels. But we stop at a certain amount of data, because we wanted the live monitoring. We want a workflow that’s doable in real time. It’s a bit like those cameras that can capture a really high number of megapixels, but then you need to transfer the data in a certain way before you can see the picture.

Also, it’s the number of seams. If you increase the number of cameras, it’s true that you’re capturing at a much higher resolution, but then the number of seams you have to stitch is much higher. If you look at the wall over there, at that panel with the seams, if I just had one big piece of glass it would look better. But of course the cost is a tradeoff, having a big piece of glass that’s hard to transport compared to a bunch of little ones. More seams add much heavier computation. And if a seam cuts through an interesting part of the video, my brain will pick up on it right away. I’ll look right at it, and after that I can’t ignore it.

We compromise with our number of cameras because we want some flexibility in where the seams go. If you have a very high overlap, I can choose to position the seam at different degrees. I can dynamically change it so that if a person’s face is right on the seam, I can shift it to the left or right to avoid that problem. But if there are too many seams, as soon as I shift it to the right it’ll impact the other ones nearby. This gives you more space for that flexibility.
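The seam-shifting idea Voltolina describes can be sketched in a few lines. This is a hypothetical illustration, not Nokia’s actual stitching algorithm: assuming generous overlap between adjacent lenses, the seam can sit anywhere inside an angular window, so we simply search outward from the nominal position for an angle that avoids a detected face.

```python
# A minimal sketch of flexible seam placement. Hypothetical names and
# logic: with enough overlap, a stitch seam can be nudged within an
# angular window to keep it off "interesting" regions such as faces.

def place_seam(nominal_deg, window_deg, avoid_ranges):
    """Pick a seam angle near nominal_deg, within +/- window_deg,
    that falls outside every (start, end) range in avoid_ranges."""
    def clear(angle):
        return all(not (start <= angle <= end) for start, end in avoid_ranges)

    if clear(nominal_deg):
        return nominal_deg
    # Search outward from the nominal position in 1-degree steps.
    for offset in range(1, int(window_deg) + 1):
        for candidate in (nominal_deg + offset, nominal_deg - offset):
            if clear(candidate):
                return candidate
    return nominal_deg  # no clear spot in the window; keep the default


# A face detected spanning 43-49 degrees pushes the 45-degree seam aside.
print(place_seam(45, 10, [(43, 49)]))  # → 42
```

As the interview notes, this only works with few, widely overlapping cameras: with many narrow seams, shifting one seam immediately collides with its neighbors’ windows.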

VB: What about the difference between applications? You have Hollywood cinematographers, consumers, GoPro enthusiasts. It seems like different cameras will be useful for different people.

Voltolina: If you start from the top of the market, that’s where money and time is available right now. If I’m at the top, I can shoot and perfect my product very well. But it also means the data I’m capturing has to be the best data possible. I can extract every single drop out of that data because I have time and money.

As soon as you become more limited in your budget — by which I mean both time and money — it’s not that a smaller crew doesn’t follow the same steps, but each person winds up with two or three roles. In a big production you have a cameraman, a lighting person, a sound person, assistants, and so on. In a smaller team, maybe five people, one person will be the director and DP, another will do sound and light. That means the product has to be able to do more things for more roles, all integrated.

Then you go all the way down to the one-man band. A news freelancer out there in a war zone, a guy who films weddings, or a guy who does educational videos for corporations. These might be $5,000 to $15,000 productions. Certainly not millions of dollars. But they need to work fast, because they need to make that money in a week, not over six months. The setup time becomes very important. Fast stitching and turnaround is very important, because they need to be able to show it to the customer, get an approval, and get paid.

Ozo, right now, is reaching the independent production stage. But for the one-man kind of production — it’s practically usable by one person, but the price is still significant. If I do weddings, I might rent an Ozo for one job. But most likely I wouldn’t own one yet. At some point I might line up enough jobs to make the investment and have that as a differentiator for my customers. Even if you’re not shopping for a VR experience, you might go to the guy who also does VR experiences because that shows that he’s the most technically advanced. It becomes a marketing hook for professionals.

The market for 2D 360 video and VR experiences is rapidly expanding. But they’re still nowhere close to regular video. Like I say, we’re still in the brick phone era.

VB: It sounds like a fun territory to be in right now.

Voltolina: It’s very interesting, yes. The part that’s most intriguing to me is this area where we can watch the same video and have a different experience. I can share something with you that, even if we both watched the video, you haven’t seen. A third person could come in with something we both haven’t seen. From a social exchange point of view, it’s fantastic. If we’ve all seen it once, that doesn’t mean any of us has seen the same thing. Maybe the second time I’ll try to watch it the way you did. It becomes a very interesting mechanism.

VB: Do you have any news coming up at CES?

Voltolina: We’ll keep you posted. [laughs] We’ll have some news. In general, we’ll have ongoing updates all year long, because we’re working on so many different fronts. We have the camera, the Ozo Live software, the Ozo Player, other technology to enable better viewing and stitching and live streaming. The last announcement on Ozo Live, for example, added support for multiple cameras. That’s a huge step.

VB: How big a part of Nokia is this? How many people are working on it?

Voltolina: A few hundred. Nokia Technologies overall is 800, 900 people. That includes digital health, digital media, and the licensing team. But we’re definitely expanding. If you visit our website, we’re hiring talent.

VB: Where is most of the work done? Is it Finland?

Voltolina: The majority of the R&D is in Finland. That’s where the project started. Now it’s maybe 65 percent Finland, 35 percent Sunnyvale. Sunnyvale is expanding. But it’s so competitive here. There’s a lot of VR expertise and a lot of VR investment. The expertise becomes a scarce resource. It’s like any wave of technology in Silicon Valley. As soon as everyone identifies a new wave, the highest concentration of investment is here and there’s a fight over the rock stars.

VB: What about augmented reality? Are you looking into that?

Voltolina: Definitely. AR is another area, though. AR has two meanings now. One is AR on your real surroundings, but there’s also AR video capture. As you can imagine, I can capture a video of a certain area and do AR not necessarily on what’s around me, but what’s been captured. That area is extremely interesting. It’s not just like subtitles or overlays, additional data that’s embedded in a video and it’s the same every time you watch it. With AR you can do it dynamically.

Again, you watch the same video, but depending on how you look at it or how you control it, different information can be overlaid. You can do that in a more interactive way. Every time you watch it you discover something new, extract different information. Up to a point of augmenting something that wasn’t there. What if something else was happening? But in an interactive way. What if I watched a recording of a meeting, but with different people there? Or the room was different somehow. All kinds of things.

You can see a convergence happening. There’s computer-generated VR and recorded VR. But you can easily imagine mixing those two together, in particular when the playback platform is the same.

This post by Dean Takahashi originally appeared on VentureBeat.


American Heroes Channel Releases Civil War 360-Degree Video Tie-In For Upcoming TV Series


The American Heroes Channel recently released a new 360-degree video titled Civil War: Letter From the Trenches. The video is planned to serve as a companion piece for Blood and Fury: America’s Civil War, a TV series scheduled to debut on Dec. 14th on AHC. The series will show the war from the perspective of soldiers during iconic battles throughout the campaign.

Letter From the Trenches, on the other hand, will focus specifically on a Confederate soldier who’s under heavy enemy fire in the thick of combat. Instead of taking the more measured, slow-paced, and passive approach to immersive video favored by most content creators, this video appears designed to shock and rattle viewers directly. The sounds and sights of bullets whizzing by amid cannonball eruptions aren’t exactly a cheery way to spend an afternoon — especially when you factor in how powerful 360-degree audio and visuals can be at tricking you into thinking you’re actually somewhere dangerous.

The experience uses a combination of GoPro cameras and fisheye lenses to bring viewers closer to the battlefield than ever before. The reenactment furthers the discussion not only of mature and serious content in virtual reality, but also of how VR could revolutionize historical documentary-style films. Reliving moments from history, such as the Civil War or the Wright brothers’ flight, is an excellent use case for educational VR.

Cream Productions’ co-founder and CEO, David Brady, told The Hollywood Reporter that, “above you there’s cannonball contrails, and those are spatially directed so you can follow with your eyes as your ears tell you where they are going. The effect is to hopefully make the viewer turn their heads to see all of this.”

The video itself, Letter From the Trenches, is available now for free at DiscoveryVR. There are various ways to watch the video, as it can be viewed within your browser, on Google Cardboard viewing devices, as well as on the Rift, Gear VR, and HTC Vive.


Disfellowshipped Is Investigative Journalism Adapted To Immersive Media


News outlets like USA Today and The Huffington Post have turned to 360-degree media in recent months to add a new layer to their reporting. Though interesting, those works are just brief looks at various topics. Investigative journalism is a deeper dive into a specific subject, and The Center for Investigative Reporting, also known as Reveal, is using immersive media to pull viewers right into the story.

Disfellowshipped follows a journalist’s investigation into the Jehovah’s Witnesses organization.

The process that led to Disfellowshipped started back in May of 2015, when the VR start-up Vragments began work on a tool called Fader. With Fader, the team wanted to give journalists an easy way to integrate virtual reality into their workflows. Fader lets journalists create specific story points that viewers can explore in virtual spaces, with additional elements added in post-production. Early in development, the team turned to Trey Bundy’s investigation of the Jehovah’s Witnesses, and he broke the story he wanted to tell into three parts.

The story focuses on Debbie McDaniel, a young woman who was exiled from the Witnesses by the very person who had abused her for five years. The 360-degree visuals take you through her story and her hometown of McAlester. The experience isn’t just one long 360-degree recording of the interview with McDaniel; it mixes stills, cuts of her hometown, and music with Debbie’s interview and narration, much like television shows that focus on investigative journalism. The resolution isn’t the best (Bundy used an inexpensive 360-degree camera), but the animated portions of the feature are significant additions.

“The answer is to give the viewer a more intimate understanding of a character and her experience,” Bundy wrote about the project. “The technology allows us to put you in the reporter’s shoes, to feel what it’s like to sit with people as they look you in the eye and tell you their story, to visit their towns and the places that affected their lives. In some instances, it becomes a window into a person’s emotional memory.”

Disfellowshipped is a solid example of how to keep viewers engaged with a longer form of reporting: the experience is broken into two parts, with an additional “Inside the Investigation” feature. You can watch part 2 of the feature on YouTube here, or view all three clips in the same virtual space in your browser here, clicking and dragging as you would a YouTube video. It has a VR option in the bottom right corner.


Leap Of Faith: Behind The Making Of The ‘Assassin’s Creed’ Movie VR Experience


In mid-October we learned that an upcoming VR experience was in development to accompany the theatrical release of the Assassin’s Creed feature film starring Michael Fassbender (X-Men: Days of Future Past, Inglourious Basterds). As it turns out, the project is a large-scale collaboration between AMD, Alienware, Practical Magic, 20th Century Fox, New Regency, and Ubisoft. Here’s how they made it happen.

The 360-degree experience was filmed by Practical Magic in cooperation with 20th Century Fox and New Regency. The theatrical activation will feature kiosks with Oculus Rift headsets powered by Alienware Aurora PCs using AMD Radeon RX 480 graphics cards.

The five-minute experience promotes the upcoming Ubisoft Films and 20th Century Fox movie starring Michael Fassbender, which opens Dec. 21. Matthew Lewis, CEO and founder of Practical Magic, told UploadVR that from the outset the goal was raising the visual quality bar for VR videos.

Michael Fassbender as Aguilar de Nerha in the Assassin’s Creed film.

“The film Justin Kurzel and Adam Arkapaw shot is beautiful, and we wanted to make sure the VR experience kept up,” Lewis said. “This meant we were going to be building a lot of new production and post-production technology, which is what ended up happening at Practical Magic.”

While the Spanish Inquisition scene in the big budget film was shot on location in Malta, the VR experience was shot in Los Angeles. Lewis said his team went out with drones and scanning equipment to painstakingly scan the set, props, and other elements from the film production in Malta and London.

“Over the course of a few days, we scanned the world of the movie, and took it back with us to Los Angeles,” Lewis said. “We were then able to recreate the set both physically in the art department, and in the computer at extremely high resolution.”

A cast of 50 to 60 people assembled in Los Angeles to bring the Spanish Inquisition to life. Because of the 360-degree nature of the experience, Lewis said a lot of the background talent ended up featured very prominently in the sequence.

The Animus from the original Assassin’s Creed video game.

“There’s action happening all around you,” Lewis said. “If you watch it more than once, you definitely see things you missed the first time that add to the experience. Right at the very beginning, you get a full true 360 view of the Animus — every last inch of it — so you can study it in great detail and see things you might have missed in the movie. It’s a gorgeous set full of props and eye candy, and it’s the same exact set you see in the film.”

Practical Magic produced the show in segments over the course of 2016 in London, Malta, and Practical Magic’s VR studio in Burbank, and it took a laundry list of new technology to pull it off.

“We weren’t happy with any 360 camera rig at all, so almost all of the action was captured using motion control rigs, including some of our own invention,” Lewis said. “We used mostly RED Dragon cameras and shot multiple passes of everything, with a baseline resolution of 6K. The hallway fight scene is actually 26 passes of 6K images composited together covering different angles of the scene. When you see it at full resolution, it feels cinematic — it’s rich, sharp and detailed. The dynamic range is there — it doesn’t feel muddy or overly-compressed. That was hugely important to us. We also really need to call out Litegear, our lighting supplier, who provided literally hundreds of individually controllable LED light fixtures that allowed us to perfectly manage the world light during motion control. We couldn’t have done it without them.”

Post-production was done in-house at Practical Magic, using Nuke and CaraVR for compositing, Maya for 3D and V-Ray for rendering, and After Effects for a few key tasks, along with some custom software, plugins, and tools of their own. Lewis said his studio has a solid on-premises render farm built specifically for VR, so every frame seen on screen was generated there.

Michael Fassbender fighting as Aguilar de Nerha in the Assassin’s Creed film.

“Editing itself was only a fraction of the post-production work,” Lewis said. “The visual effects component was very complex, and took months of work. The post-production pipeline for VR industry-wide is very immature and the software is alpha quality at best. We were also pushing our hardware to the absolute limit — imagine trying to work with 26 video streams of 6K footage at the same time in the same shot. We needed the best hardware you can get your hands on, and that’s what it took to get the job done. Otherwise, we’d still be sitting here watching progress bars.”

Fassbender plays Aguilar de Nerha in the film, an original character that’s part of a new story that ties into the universe of Ubisoft’s bestselling video game franchise. While filming last year at Pinewood Studios, Practical Magic shot Fassbender for this exclusive VR experience.

“We shot him on stage in London and he was a great sport,” Lewis said.

While Fassbender is the central character in the big screen adventure, the VR experience allows users to step into the boots of an original character.

A screenshot from the Assassin’s Creed VR Experience trailer embedded above.

“The viewer is not playing Aguilar — that’s a job best left to Michael Fassbender’s talents,” Lewis said. “I don’t want to give too much away, but yes the viewer is an Assassin.”

Gamers will also recognize Easter Eggs in the VR piece, according to Lewis. These occur mostly in the first scene, which was shot in the Animus set from the movie. There are other elements inspired by the video game franchise, as well.

“We move the camera a lot, which means there are a number of major parkour-type moves the viewer will experience,” Lewis said. “There are plenty of classic, tried-and-true Assassin’s Creed moves. There’s one part that always makes people scream a bit, which is exciting to watch.”

Lewis said the team knew the games, and immediately everyone’s reaction was, “We have to do the Leap of Faith in VR!” So naturally, Lewis jokes, “I don’t want to give it away by saying we did a Leap of Faith in VR, but I mean, we did a Leap of Faith in VR, obviously.”

Lewis said the last few years of experience working on complex cinematic projects like Capture for The CW have been invaluable.

“We like to move VR cameras while we’re shooting, which is traditionally considered very difficult — so moving VR cameras has kind of become our thing at Practical Magic,” Lewis said. “A couple of years ago we built a cinematic VR camera rig for Google that Justin Lin used to produce Help!, which won the Gold Lion for VR at Cannes this year. We’ve continued to build all manner of cinematic VR rigs since. If we didn’t have engineering and rapid prototyping in-house to build our own VR gear, and a lot of really experienced technical people, we couldn’t have pulled any of this off.”


Gamers will get a first look at the Assassin’s Creed VR Experience today for free through the Oculus Video app on both Oculus Rift and Samsung Gear VR, as well as a 360-degree video on Facebook. Additionally, moviegoers will see a national theatrical roll-out of the experience at AMC theaters in San Francisco, Los Angeles, Austin, and New York City between Dec. 2 and Jan. 1.


Insta360 Crowdfunds 360-Degree Camera Accessory For Android Phones


Social 360-degree media is a growing force, with new applications, peripherals, and even phones built around the medium becoming more readily available to consumers. The recent mobile 360-degree options we covered were geared toward Apple and devices from outside the US, but there’s a new development: Insta360, a company built around creating 360 media through mobile accessories, is planning a 360-degree clip-on camera for Android phones via Indiegogo.

The Insta360 Air is a lightweight camera meant to live-stream or capture 360-degree video for sharing on social media. It has a sleek design, comes in a few colors, and fits the aesthetic of popular smartphones. It is said to use dual fisheye lenses, capture 3K stills and 2K video with stabilization tech, and pair with the Insta360 Air mobile app for a gallery, capture, and four viewing modes (flat, VR, sphere, and planet). The app also includes editing tools for 360-degree media.

The Insta360 Air comes in various packages on Indiegogo, with a single device starting at $99 and a two-pack for $169. The other bundles combine the device with different accessories, such as an Android Cardboard VR viewer and a tripod. There are a couple of accessory-only collections as well. The team asked for $20,000 to complete the project, and fundraising stands at $23,856 with a month left.

The Air has moved from concept to funding in a short time, with the first prototype defined just this September. The second prototype is planned for December 10th, with samples ready by December 30th. Though much after that will likely depend on how well the samples come out, the team plans to preview the accessory at CES in January and start shipping to backers in February. It’s an aggressive timeline, and hardware crowdfunding projects often see unanticipated delays or costs.
