If you love oceanic videography or just want to experience deep sea diving without getting wet, you’re going to love Hydrus VR, a submersible 8K virtual reality video system designed for professional filmmakers. The unit uses a total of 10 cameras — eight in a horizontal circle plus two vertical — to capture 8K, 4K, or stereoscopic 4K imagery, notably with impressive low-light capabilities.
Developed by Marine Imaging Technology (MI Tech), the system looks like a large metal can ringed by lens bumps. Weighing 75 pounds and neutrally buoyant in salt water, it’s depth-rated to 300 meters, and it can be mounted on a metal control arm or on underwater robots, depending on the filmmaker’s needs.
Inside the can are Sony ultra-high sensitivity UMC-S3CA cameras equipped with SLR Magic lenses, plus enough storage capacity and battery life to record continuously for two hours. A subsea control module enables the recording time to be expanded to eight hours. Users can remotely monitor nine of the cameras in real time during shooting.
Sony’s camera sensors enable the system to capture video at a minimum illumination level of 0.004 lux (ISO 409,600), which is especially important when recording in places without their own lighting sources — a challenge that increases video noise and grain. While the sensors have a normal ISO range of 100–102,400, the four-times-higher “expandable” ISO gives filmmakers the option to accept additional noise where necessary to capture an underwater scene with particularly poor illumination.
You can see the camera’s performance for yourself in this Cayman Islands panoramic YouTube video, which can be boosted up to 4K resolution using the Settings button and turned 360 degrees horizontally using the controller. Hydrus VR creates the videos by stitching together multiple cameras’ output using a 60 percent image overlap, resulting in a composite so low in distortion that seams aren’t visible. Users can choose from 8,192 x 4,096 spherical output at 30 frames per second, 4,096 x 2,160 output at 120 frames per second, or stereo 4K for VR viewing purposes.
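The three output modes trade resolution against frame rate. A quick back-of-envelope comparison (illustrative arithmetic only, not vendor figures; the stereo mode’s frame rate isn’t stated in the article and is assumed here to be 30) shows their raw pixel throughputs are in the same ballpark:

```python
# Illustrative comparison of Hydrus VR's stated output modes.
# Format: (width, height, frames per second, simultaneous views).
modes = {
    "8K spherical @ 30 fps": (8192, 4096, 30, 1),
    "4K @ 120 fps": (4096, 2160, 120, 1),
    "stereo 4K @ 30 fps (frame rate assumed)": (4096, 2160, 30, 2),
}

for name, (w, h, fps, views) in modes.items():
    gpix = w * h * fps * views / 1e9  # gigapixels per second
    print(f"{name}: {gpix:.2f} Gpix/s")
```

All three modes push roughly half a gigapixel to a gigapixel per second of raw imagery, which helps explain why onboard storage tops out at two hours of continuous recording without the external control module.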
The result of all this technology is going to be spectacular underwater videography of the sort that will likely appear in movies and upcoming VR applications. MI Tech’s raw recordings are high enough in resolution to exceed the capabilities of virtually every current VR headset out there, but between downscaling and the march of technology, that won’t be a problem for filmmakers.
“We are very excited to help tell interesting stories and work with our partners to create an unforgettable experience,” said MI Tech founder Evan Kovacs, “an experience that will make audiences feel inspired to cherish, save and protect — and even one day visit — these underwater environments.”
YouTube is emerging as one of Google’s highest priorities in VR, and now the service is coming to competitor Facebook’s Oculus Store for Gear VR.
YouTube VR is already available on Google’s own Daydream headsets, on Sony’s store for PlayStation VR, and on Steam with support listed for HTC Vive and Oculus Rift. The YouTube VR app delivers the vast and growing library of YouTube content on a big screen, plus hundreds of thousands of videos made in 360- and 180-degree formats. Among these immersive videos are some high-quality productions commissioned specifically for YouTube by Google. The app is expected to be available on the Oculus Store for Gear VR this week.
A Google spokesperson declined to say anything specific about Oculus Go support for YouTube VR, but gave the following statement: “We want everyone with a VR headset to be able to experience YouTube VR, and we’re working to bring it to more VR platforms in the future.” Google previously brought to the Oculus store Google Earth for exploration and both Tilt Brush and Blocks for creation. Last year, the ad giant acquired Owlchemy Labs, which also sells Job Simulator and Rick & Morty: Virtual Rick-ality for Rift on the Oculus Store.
While many of the 360-degree and 180-degree videos available on YouTube might be poorly captured, and your Internet connection might stream them at low resolutions, the type of immersive content that works well on the service is also getting easier to make. Google’s VR180 format and the new cameras that use it could enable a new generation of higher-quality content. And once YouTube is available on so many platforms, Google might be able to push the boundaries of reactivity by funding new projects built to respond to your gaze.
Google is also taking this opportunity to begin rolling out its first true VR social features for YouTube — if you click the Watch Together icon “beneath the play controls from your Daydream View or Gear VR,” you’ll be able to “watch and discuss videos with others in a communal, virtual space.”
A new partnership between RED Digital Cinema and Facebook aims to deliver a revolutionary new kind of camera system that can capture reality.
RED’s digital cameras (like the one pictured above) helped change the way movies were made by playing a key part in the switch-over from film to completely digital production. Used on blockbusters like Peter Jackson’s The Hobbit, RED’s cameras lowered production costs while capturing the detail required by some of the world’s most discerning directors.
Over the last few years Facebook started exploring more immersive camera technologies, and last year revealed an effort to create a system that can capture reality with six degrees of freedom — that is, the ability to move your head around or even lean to see a scene from any angle.
The team-up aims to take RED’s image quality “with over 16-stops of dynamic range and high spatial resolution,” combine it with “Facebook depth estimation technology,” and then put together a workflow that makes it easier for directors to capture immersive content.
“The imagery is clean — it makes the depth reconstruction work better,” said Brian Cabral, Director of Engineering at Facebook, in explaining why they teamed up with RED.
RED is notably also developing a holographic phone, but we don’t have any idea how this camera system might be used in conjunction with that device.
Some of the world’s largest tech companies, including Google and Microsoft, have been building up reality capture teams and exploring a variety of partnerships over the last few years to make higher quality cameras and easier-to-use production pipelines. Many of those early efforts have struggled to find a robust market for either the cameras or the imperfect content most of them produce. Could this RED and Facebook partnership be the one that finally succeeds?
The fourth production from VR studio Penrose, Arden’s Wake: Tide’s Fall, is debuting at Tribeca, and it touches on well-worn territory for this team. Namely, it’s a story about the relationship between parent and child, coping with loss and, ultimately, moving on. Heavy topics for the film’s deceptively charming visuals, true, but the studio’s growing competence in this emerging field of narrative steers the viewer through some difficult drama with aplomb.
Arden’s Wake is the story of a young girl living in a flooded world. In the first installment, Prologue, we met Meena — now voiced by Alicia Vikander of Lara Croft fame — a young girl living in an unlikely paradise with her father, who carries the weight of the old world on his shoulders. When Dad is lost to the ocean, Meena breaks his golden rule and dives in after him in a rushed rescue mission that seemingly does more harm than good.
In a press release announcing the second episode, Penrose noted it wanted to bring ‘important issues to the surface’ with its storytelling. A cute pun, perhaps, but Tide’s Fall wastes little time delivering on that statement. Any preconceived notions drawn from the film’s whimsical art style are quickly tossed aside here as some harsh truths are exposed. The first scene is crucial, taking Meena back through her childhood memories and digging up some aspects she’d have rather left under the ocean.
In an email, CEO Eugene Chung said Penrose is focused on telling “stories in a relatable way and make the viewers think about societal issues that impact many, if not all of us, in some way.”
What makes Arden’s Wake work so well is the freedom it gives the viewer. It’s a concept many still struggle to grasp in VR filmmaking, but Penrose holds onto it tight. Rather than constantly wrestling for your attention, Tide’s Fall plays out over a handful of long shots that don’t just allow the viewer to explore and investigate, but practically invite it.
It’s unconcerned with the camera colliding with the environment, for example, simply making walls fade from view as you approach them. The diorama-style sets, meanwhile, essentially give you a continuous wide-angle shot, making it hard to miss any action and, at the same time, bringing intimate moments closer to you, sometimes even casting you as the growing barrier between father and daughter. Spatial storytelling like this can really help VR shine, and this is a great example of that.
Likewise, scale plays just as important a role as space. At times, it’s hard to resist the urge to scoop the puppet-sized characters up and shield them from the dangers surrounding them (I actually tried to do this early on in the Prologue despite the film being entirely non-interactive). Chung is keenly aware of this direction. “Many of Penrose’s characters are miniatures, like toys, and we dreamed as children of our toys coming alive as our companions,” Chung said. “This is the promise that interactive characters can bring, and we’re excited to develop further on this.”
Just as Tide’s Fall demonstrates the talented artistry on display at Penrose, it’s also a great showcase for the company’s “proprietary” review tool Maestro, which helps the creators “anticipate the variety of ways the film will be absorbed and create the story accordingly.”
As for a release of Tide’s Fall on consumer headsets, Penrose isn’t handing out any specifics yet. Keep your eyes peeled, though; this is one to watch.
The Tribeca Film Festival starts next month, an annual screening of everything from indie documentaries to family-friendly films. Coming to the festival’s Immersive program, which showcases works by artists who are pushing boundaries and using cutting-edge technology, are six Oculus-funded projects.
Many of the projects funded by Oculus came out of the company’s VR For Good program, which sees funding and expertise in VR filmmaking go to causes for social change.
A total of 21 AR/VR projects are coming to the film festival (check out the full list here). Here are all of the Oculus-funded projects heading to Tribeca next month:
Meeting a Monster
image courtesy Oculus
Gabriela Arp + Life After Hate: Meeting a Monster examines the memories and motivations of former white supremacist Angela King. Through audio recordings, dramatic re-enactments, and present-day footage, the film invites us to experience both the stereotypes and bigotry that lured Angela into the white power movement as well as the encounters that led her back out. While the monsters of Angela’s past and imagination define much of the eight years she spent mired in bias and hate, she finds the path to redemption only after encountering and acknowledging the ultimate monster—herself.
The Hidden
image courtesy Oculus
Lindsay Branham + International Justice Mission: In southern India, debt bondage enslaves entire families in a vicious cycle of deception and violence. The Hidden follows the developing case of a family of nine that has been enslaved in a rock quarry for 10 years—over the ludicrous sum of just $70 USD. Indian government representatives and human rights activists plot a raid to attempt to apprehend the creditor and free the family. The Hidden takes you to sites of active slavery and inside the rescue mission itself, bringing you face-to-face with two families as they endure the unspeakable.
Authentically Us: She Flies By Her Own Wings
image courtesy Oculus
Jesse Ayala + Pride Foundation: Even as transgender visibility in pop culture continues to break glass ceilings, direct violence and discriminatory legislation against the transgender community continues to rise. Shannon Scott stands up at a time when her communities—proud transgender service-people and veterans of the US Armed Forces—are vulnerable and under attack. Shannon has dedicated her entire adult life to defending and safeguarding American citizens at home and abroad. Driven by the military tenet of “Leave No One Behind,” she seeks freedom and justice for all from the marbled halls of Washington, DC, to the hallowed ground of those who championed equality before her.
Campfire Creepers: Midnight March
image courtesy Oculus
Directed by Alexandre Aja (High Tension, The Hills Have Eyes) and starring ’80s horror icon Robert Englund of Freddy Krueger fame, this episodic narrative from Future Lighthouse and Dark Corner leverages the unique affordances of VR storytelling to chill and thrill audiences like never before.
Untitled OK Go & WITHIN Project
image courtesy Oculus
WITHIN Founder and CEO Chris Milk joins forces with OK Go’s Damian Kulash to let you and a friend experience the joy of music creation. Enter an environment surrounded by magical music-making contraptions, involving animals and robots wondrously working together with your help to create an original song.
SPHERES: Pale Blue Dot
image courtesy Oculus
The Big Bang was silent. Then came sound. Journey through the history of sound in the Universe and uncover the strangest song of all. Following the premiere of the first episode of SPHERES at Sundance, Eliza McNitt returns to debut the second installment at Tribeca. And with an unprecedented deal signed as a result of the project’s first public showing, we can’t wait to see what the encore has in store.
From deep introspective explorations, to far-out journeys into the unknowns of the universe, explorers are the heroes that guide us forward. They are the ones whose relentless curiosity uncovers possibilities.
When it comes to storytelling in VR, Félix Lajeunesse and Paul Raphaël are among the most celebrated explorers, uncovering artistic and technical tricks that help immersive content creators progress further. And there is no more extreme example of exploration than venturing beyond the Earth’s atmosphere.
Felix & Paul’s newest experience, Space Explorers: A New Dawn, premieres at the Sundance Film Festival New Frontier Exhibition this week. After the premiere, it will be available for free alongside its second chapter, Space Explorers: Taking Flight, with the launch of Oculus Go this year.
A space exploration vehicle takes a break on a test run in the desert.
I had the opportunity to preview Space Explorers: A New Dawn, and then uncover what Felix & Paul have learned from creating the experience. Plus, NASA’s Principal Virtual Reality Engineer and several astronauts shared with me how they are using VR to prepare for future explorations throughout our solar system.
Space Explorers: A New Dawn introduces visitors to NASA’s new generation of astronauts (Jessica Meir, Jeanette Epps and Victor Glover), to the depths of what space exploration entails for them now, and to the agency’s ambitious plans for the near future. I enjoyed being educated directly by NASA astronauts while joining them in inspirational and breathtaking scenes. We covered a lot of ground, from training underwater to speeding through the air in a T-38 jet.
Félix Lajeunesse of VR studio Felix & Paul working on Space Explorers.
Lajeunesse shares that the second chapter, Space Explorers: Taking Flight, takes visitors on an exploration of “…the collaboration between NASA and private space companies such as SpaceX and Boeing, as well as the spirit of collaboration between the world’s national space programs.” Viewers visit Cape Canaveral, Russia and Kazakhstan, and also experience two up-close rocket launches.
Michael Gernhardt takes the wheel in a space exploration vehicle.
How Felix & Paul Create Entertaining, Educational Content in 360
Storytelling in VR not only takes you to new places, or allows you to be someone or something else; it has the ability to give you experiences that enhance your life in a way other mediums cannot.
Felix & Paul have continued to innovate on their proprietary technology as they capture new experiences for VR.
“We had quite a few firsts on this production, and solutions to find for many extreme situations,” explains Raphaël.
They went from the depths of shooting underwater at the Neutral Buoyancy Lab at NASA during actual astronaut training, to shooting from the co-pilot seat of an airborne T-38 jet which, Raphaël said, “subjected our camera to monumental vibrations that not only put our hardware at risk but made getting a clean and comfortable shot a challenge requiring the collaboration of NASA’s own engineers.”
“In all these cases we had to devise new ways of shooting, securing and processing our images, but the challenges were far from only being technical,” Raphaël said. “Our goal was never simply to get a 3D 360-degree shot, but to really immerse the viewer in a way that they felt they were truly there. Navigating the technical and logistical challenges and unpredictability of these extreme scenarios while getting that right shot pushed us to our limits.”
In October, I ran into Raphaël at an industry dinner the evening of the SpaceX Falcon 9 rocket launch. He excitedly shared that he was able to watch the live stream of the footage being captured for the second chapter of Space Explorers, with their camera just a few meters away from the rocket. So close in fact, that they subjected their camera to the flames of the rocket engine. It may have cost them a camera, “…but in every case, it was more than worth it.”
The moon is our Earth’s only natural, permanent satellite.
How NASA Is Educating Astronauts with Custom VR Experiences
This is not NASA’s first adventure with VR.
Evelyn Miralles, Principal Virtual Reality Engineer of the NASA Virtual Reality Laboratory/Astronaut Training Facility, explains that “NASA has been involved with VR research and development for space and military purposes since the ’60s.” But, since 1992, they “have been using VR officially for astronaut training at the Virtual Reality Laboratory.”
Their lab uses a VR system that was designed in-house for astronaut training. It’s used to “train astronauts for Spacewalks and Robotics operations as well as for [zero gravity] mass handling techniques,” simulating handling objects with large masses in case a repair or replacement is required outside of the International Space Station.
Astronaut Victor Glover says this VR simulation includes the use of gloves and a headset that allows them to see the simulation, and there are also physical “handling aids and equipment that we can hold and manipulate to simulate how heavy equipment behaves in microgravity. The hardware is connected to a series of cables, pulleys, and motors that really create a convincing simulation.”
Astronaut Jessica Meir told me that they also learn “how to use and operate the Simplified Aid for EVA Rescue (SAFER), a backpack-like system that we wear on our spacesuits in the remote chance that we became untethered from the space station and had to maneuver our way back.” The VR lab is essentially their “only means of training with this system to learn how it might feel and react if we were ever to have to use it on our mission.”
Miralles explains that NASA has also used VR to develop experiences that simulate life on Mars, and for collaborations. Reflecting on other opportunities for VR that are being investigated, beyond its ability to support activities like design and development, she highlights that it could also “aid in communication between astronauts and ground team work.”
Pilots in the Air Force conducting high-speed flight research.
An Elevated VR Experience for Out-of-Home Viewing
Felix & Paul chose to do a special cut of Space Explorers: A New Dawn for its premiere this week, with synchronized Voyager chairs by Positron. Raphaël explains that they were drawn to the chair’s ability to “elevate a VR experience in more than just a visceral way…Having your body react to the rumble of riding shotgun in the Mars Rover, the thrust of a T-38 jet or the feeling of zero-G outside the ISS is incredibly satisfying, but being able to orient the viewer opened up new ways to frame a shot and tell the story.”
The full-motion experience may pop up next at a NASA visitor center.
Laura Mingail is a Marketing & Business Development executive in the entertainment industry, focused primarily on driving engagement with film and VR properties, as well as developing monetization strategies for VR content creators, publishers and out-of-home entertainment centers. She is also a contributor to UploadVR.
San Francisco startup Rylo is launching a 360-degree video camera today that emphasizes smooth videos that are easy to share.
Created by a team of former Instagram and Apple engineers, Rylo has stabilization software and a smartphone app that helps eliminate some of the pains of traditional shooting, editing, and sharing of videos, said Rylo CEO Alex Karpenko, in an email.
“For most people, creating and sharing beautiful video is a lot of work. It requires planning, and most of the time, videos turn out shaky, or you miss the moment entirely,” said Karpenko. “The combination of Rylo’s hardware and software gives anyone the confidence and creative freedom to get the perfect shot every time.”
The company says that Rylo lets you shoot the video and make it perfect after the fact, meaning you don’t have to worry about framing your shot or holding the camera steady to capture a video.
The camera has dual 208-degree wide-angle lenses that can capture everything around you. Videos shot on Rylo can be shared in two formats. You can create a regular high-definition video by selecting a traditionally framed view within the 360-degree footage, or you can share a fully immersive video in 4K 360 degrees. Rylo’s software automatically corrects any distortion typically expected with fisheye lenses.
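As a quick sanity check on the optics (simple arithmetic from the stated field of view, not Rylo’s published specs), two 208-degree lenses more than cover a full sphere, and the surplus is the overlap region that stitching software blends across:

```python
# Two back-to-back fisheye lenses, each with a 208-degree field of view.
lens_fov = 208
combined = 2 * lens_fov   # 416 degrees of total angular coverage
overlap = combined - 360  # 56 degrees shared between the two lenses
print(f"Overlap available for stitching: {overlap} degrees")
```

That shared band is also where fisheye distortion is worst, which is why the software-side distortion correction the company describes matters.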
“Historically, camera innovation has been dependent on upgrading hardware, but the future of innovation for cameras is in the software,” said Chris Cunningham, chief operating officer, in a statement. “The magical thing about camera software is how it closes the gap between what professionals and everyday people can do. That’s why we built software first and designed the camera’s hardware around it.”
With stabilization, Rylo’s software eliminates unwanted camera movement and shakiness, producing smooth videos that have historically only been achieved using expensive, professional-grade stabilization rigs and gimbals, the company said.
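In principle, this kind of software stabilization works because a 360-degree capture records every direction at once, so each frame can be re-pointed toward a smoothed orientation track after the fact without cropping the subject out. The sketch below is a minimal illustration of that idea using an exponential moving average on a per-frame yaw track; it is not Rylo’s actual algorithm, and real stabilizers operate on full 3D rotations with angle wraparound handled.

```python
def smooth_yaw(track, alpha=0.8):
    """Low-pass filter a per-frame yaw track (degrees) with an
    exponential moving average. Assumes the track stays within
    +/-180 degrees so no wraparound handling is needed."""
    smoothed = [track[0]]
    for yaw in track[1:]:
        smoothed.append(alpha * smoothed[-1] + (1 - alpha) * yaw)
    return smoothed

# Shaky handheld yaw readings, one per frame:
shaky = [0.0, 6.0, -5.0, 7.0, -4.0, 5.0]
stable = smooth_yaw(shaky)
# Re-rendering each frame rotated by (raw - smoothed) yields a steady
# view; because the full sphere was captured, nothing is lost at the
# edges the way it would be when stabilizing a conventional crop.
```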
The Rylo app reduces editing time. After shooting a video, you plug the camera directly into your phone, and the app automatically opens. You can trim and crop the video and put yourself into the action with a picture-in-picture feature that shows your reaction to the main scene being filmed. Rylo also automatically follows the action if you want it to, as it adjusts the camera’s orientation to keep the action in the frame. You can share video to Instagram, Facebook, or directly with friends and family.
Rivals include GoPro, Vuze, Samsung, and others. Rylo is available for $500 today, and its iOS app is free in the App Store. An Android version is coming soon. The camera comes with a battery, a 16GB microSD card, a protective pouch, and sync and charge cables.
Rylo was founded in 2015 and is backed by Accel, Sequoia, SV Angel, and others. The company has raised $15 million and has 21 employees.
“Videos are useless if they’re stuck on your camera or computer never to be seen,” said Sameer Gandhi, partner at Accel, in a statement. “Rylo’s cofounders learned firsthand how complex technology, in the form of simple tools, helps people create pictures worth sharing. Now, they’re bringing this concept to video, and I’m excited to see what people do with it.”
Startup Live Planet plans to deliver its first cameras in December as part of an end-to-end stereoscopic 360-degree video platform aimed at making it easier for creators to stream live high-quality content.
Live Planet is led by serial entrepreneur Halsey Minor, who largely self-funded the company after a long track record founding businesses like CNET, Salesforce and GrandCentral (which later became Google Voice). He’s intimately aware of the failures of other 360-degree video products on the market, including Nokia’s OZO, and aims to succeed through a combination of high-quality hardware and easy-to-use software.
“As long as the industry doesn’t have standards, somebody has to build an end-to-end platform that delivers the quality of service necessary to bring consumers on board. The quality of the video is defined by the worst step in your workflow,” said Minor. “It’s not about being first, it’s about being best.”
Minor is certainly on point with his strategy here. There is a long list of things that can undermine 360-degree videos. If you put the camera too far from the action you can’t see the detail on people’s faces. If the stitch lines between the different lenses don’t line up you’ll constantly be distracted by distortions as people walk around the scene. If the streaming doesn’t deliver every pixel of detail exactly where the viewer is looking you’ll still be seeing a low-quality video.
Minor’s system, which represents years of development, aims to address all of these problems while offering a Web-based interface to creators so they can manage multiple live-streaming cameras from anywhere in the world.
The Live Planet system starts with a tiny 360-degree camera equipped with 16 lenses capturing stereoscopic video, with NVIDIA silicon on board powering it. The system is aimed squarely at the professional market just vacated by Nokia’s decision to end development of the OZO camera. The Live Planet camera costs around $10,000, which is a fraction of OZO’s price. And because the Live Planet camera is so small — a little larger than a coffee cup — Minor believes it will be able to be placed in tighter spots closer to the action. “The more unobtrusive you are,” Minor said, the more places you can put the camera. If a band is performing a concert, ideally there would be a camera next to each musician, according to the founder.
The camera stitches footage on-board and streams a finished stereoscopic file over Wi-Fi, ethernet or USB. It can be operated locally or through the Web, allowing for its settings to be easily adjusted through a Web page.
Live Planet’s cloud tools include adaptive streaming techniques designed to deliver the most detail to the viewer wherever they are looking at any given moment. Enabling the feature reduces the bandwidth required for streaming to a VR headset from 35 Mbps to 5 Mbps, which the creators say puts it in the realm of working over a cellular connection without any noticeable loss in quality.
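The claimed savings are easy to sanity-check (illustrative arithmetic only, not Live Planet’s encoder math): view-adaptive streaming sends full detail only in the direction the viewer is looking, and the figures the company gives work out to a sevenfold reduction:

```python
full_mbps = 35      # full-detail stereoscopic 360 stream
adaptive_mbps = 5   # view-adaptive stream, per Live Planet

reduction = full_mbps / adaptive_mbps
print(f"{reduction:.0f}x bandwidth reduction")

# At 5 Mbps the stream fits within typical 4G cellular throughput,
# which is the basis for the "works over a cellular connection" claim.
```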
The camera shoots at 30 frames per second and is said to provide a 4K stereoscopic view with USB to support accessories, an SD card slot, Toslink and HDMI.
“It stitches perfectly without seam lines in real time and encodes on the device,” Minor said. “No encoding box needed.”
The Web tools are designed so any number of cameras can be managed simultaneously and additional features can be added later by creators. The company is also offering tools to build apps so creators can stream inside their own apps.
According to Minor, the company plans to deliver its first batch of 100 cameras in December to pre-order customers who have already signed up for the system. Live Planet’s delivery platform also works with cameras from other companies — but those don’t work with live streaming. The cloud platform is currently being offered in a testing release.
Live Planet is entering a space that’s already seen a number of entrants, from Nokia to Google to Facebook to Jaunt, all offering different pieces of the 360-degree capture and streaming pipeline. But these efforts have all met with limited adoption and success. Live Planet, however, aims to provide a robust, complete and easy-to-use toolbox for creators.
“The only way to be successful is to put these tools in people’s hands and do it so they can very quickly start getting value,” Minor said.
Earlier this year Facebook debuted a technique for capturing volumetric content from a 360-degree camera — meaning you can move freely around inside the footage to see the action from different angles. This week Adobe debuted a project that seems to produce a very similar effect.
The concept is called Project Sidewinder, and it was presented on stage during Adobe’s MAX conference in Las Vegas. The project may or may not become a part of the company’s products. It was presented as one of 11 concepts showcased this week as part of Adobe’s future-facing efforts to help creators work more efficiently. Another VR-related concept Adobe showed, called SonicScape, looks like it could easily fit into the workflow of people using Adobe’s Premiere video editing software.
The techniques from both Adobe and Facebook look extremely limited, with stretched and distorted artifacts becoming more and more visible the further you move away from the camera’s actual position. Nonetheless, for small movements in VR the techniques can powerfully enhance a person’s sense of presence in captured content.
Check it out in the video from MAX as Silicon Valley’s Kumail Nanjiani tests out the feature.
The virtual reality industry is fragmented. Secret Location, a VR tech company, wants to address that with Vusr, a white-label distribution platform for VR apps. It’s getting an update that makes it easier to publish, distribute, and monetize 360-degree video and other VR content on any VR headset.
Vusr can now support real-time rendered content, including room-scale VR (like on the HTC Vive VR headset) and augmented reality content. Vusr is already being used by big publishers to distribute content to the masses. Much of VR and AR technology is applied to gaming, but software developers can push beyond games to offer applications spanning entertainment, news, social issues and business-level operations.
“Allowing publishers to incorporate real-time rendered VR and AR content, and go beyond just 360 video, opens up endless possibilities,” said Secret Location president and founder James Milward in a statement. “Removing barriers and simplifying distribution is going to accelerate the amount and quality of content we’re seeing in the industry, which is really what we need in order to grow it.”
Toronto-based Secret Location is addressing the problem of fragmentation, and it believes that content creators and publishers need a centralized platform to cost-effectively distribute and market their apps. Vusr enables a large number of apps to be accessed and experienced from within a single unified app.
Vusr was incubated by Secret Location and is backed by Entertainment One, a global independent studio that specializes in full-service publishing focusing on the development, acquisition, production, financing and distribution of world-leading entertainment content. Secret Location debuted in 2009 and it was acquired by Entertainment One in 2016.