Inside Zero Latency’s Las Vegas VR Arena At The MGM Grand

Melbourne, Australia-based VR company Zero Latency has partnered with the MGM Grand Hotel & Casino to launch its latest eight-player virtual reality gaming location in the U.S.

The 2,000-square-foot arena, which opened last week, is the latest video game attraction for MGM’s Level Up gaming lounge. And it’s one of 20 Zero Latency arenas in operation around the globe, including locations in Orlando, Boston, Philadelphia, Wisconsin and the Poconos.

Zero Latency co-founders Tim Ruse (CEO) and Scott Vandonkelaar (CTO) told UploadVR that the plan is to add another 30 to 40 arenas by the end of next year with the goal of hitting 100 arenas by the end of 2019.

Zero Latency is building its brand in the location-based VR arcade business through this new partnership with the established MGM brand. Ruse said MGM is putting its full marketing muscle behind this new “Virtual Reality Powered by Zero Latency” arena.

“It’s great to add an arena to another tier one U.S. market, and to partner with a global household name like MGM,” Ruse explained. “Outside of the MGM brand boost and marketing platform, the global destination of Vegas allows us to market Zero Latency to tourists who can play and then pass the word to their friends when they return home.”

In other words, Zero Latency is betting that what happens in VR in Vegas doesn’t stay in Vegas. This arena is the company’s first near the West Coast of the U.S., where rivals like IMAX VR and The Void have opened location-based arenas in New York and Utah, with a Disneyland location featuring The Void’s Star Wars: Secrets of the Empire opening this fall.

Featuring an open space with no obstacles, the arena enables teams to freely explore the virtual environment together. The games have been designed to give the illusion of exploring huge spaceships or alien worlds, while players are steered through the same arena in different ways. While battling through a growing library of games (with three titles available and a fourth launching this month), teammates will see each other as full-motion avatars and stay in constant communication, allowing them to call for backup or ultimately compete for the highest score. There’s also a Zero Latency employee inside the arena with players to set them up for combat and oversee the game.

Players stay in contact with each other through Razer headphones and are outfitted with Razer OSVR HDK2 virtual reality headsets while wearing a military-grade backpack containing a high-performance Alienware laptop computer. There’s also a custom-made, two-handed rifle that offers multiple firing options in-game.

The MGM arena charges $60 per person for a 30-minute gameplay experience. And the VR arena is accessible to younger players through a side entrance that circumvents the casino floor.

All Zero Latency locations currently offer three different multiplayer games, including the tower defense shooter Zombie Survival, the zero gravity shooter Singularity and the puzzle adventure Engineerium. Each game lasts for 30 minutes and allows up to eight players to team up and play together. Vandonkelaar said a new zombie game will launch this fall, which combines elements from the first three games for a more in-depth 30-minute experience.

“Over 100,000 players have gone through our system now, and we’re applying lessons learned from those players and their experiences in our first three games,” Vandonkelaar explained.

“The new zombie game has been designed from the ground up to be an eight-player experience, and we’re adding more intensity and more exploration to this game than our first zombie game has.”

Vandonkelaar said all of these initial games have been created using Unity, but the company is currently working on Unreal Engine 4 games. And this fall they’ll release an API and developer kit for third-party companies to create new games for this platform of VR arenas. Some of these games will enter private trials this fall with 2018 launches planned.

Internally, Vandonkelaar said development is under way for a competitive multiplayer game specifically designed for esports.

“We’re working on new content to allow our players to compete,” Vandonkelaar said. “There are a lot of challenges to make a compelling experience, and we’re limited by the amount of space we have to have in these arenas. But we’re making sure the competition is as fun as possible and is skill-based, so that it has everything else esports needs. We’re coming to the point as we have more sites where teams of people can compete against people who aren’t at the same location.”

“There’s an incredible opportunity for VR esports in our arenas,” Ruse said. “It’s not just about a team’s knowledge of gameplay and skills, but you have to move around the arena and work together to best the opposing team. We see VR esports as part of our near future. We’ve been investigating and working on it and we’re keen to get into that business.”

Zero Latency is also fine-tuning a spectating system for its arenas. This will allow players to stream their play sessions across existing platforms like Twitch and Facebook Live to share with friends. But it will also set up a spectating mode for esports competition.

“Players will be able to stream their VR gameplay to the world quite soon,” Vandonkelaar said. “Everyone will get the chance to see the streams, and players can watch them back afterwards.”

Ruse added that in testing, there’s a real connection that develops through watching players compete in VR because you can hear them communicate and see them move within the game worlds.

“It feels like you’re connecting with an actual person,” Ruse said. “Friends can watch friends play and see and compare high individual and team scores, which should help drive the engagement of people going in to play and stir up the competitive nature of gaming.”

As Zero Latency builds out its esports strategy, Ruse said the company has been talking to brands interested in VR esports.

“There are a lot of companies interested in esports,” Ruse said. “It’s something we can have sorted for next year.”

The esports audience could introduce brand new consumers to the Zero Latency brand. Ruse said the company’s core demographic consists of males 25 to 40 years of age globally, and 30% of Zero Latency customers are female.

“The majority of people up to 45 years old are gamers to various degrees,” Ruse said. “Because people are more receptive to playing games, they’re more comfortable with our VR tech and we’re seeing a broader range of people coming through. We’re also seeing a lot of people who are VR curious and want to experience it first-hand.”

MGM is betting big on gaming, adding this Zero Latency VR arcade experience to a growing assortment of skill-based games designed to lure younger crowds. Now visitors to the Las Vegas Strip will be able to escape the real world, and any gambling losses, by entering these VR worlds.

Flatline VR Brings Near-Death Experience To Life

Jon Schnitzer has been waiting 16 years for virtual reality technology to evolve to this point. The creator, director and producer of Flatline VR has been fascinated by real-life accounts of near-death experiences since first meeting a friend in 2001 who lived to tell the story of leaving his body, only to return to Earth. That was the beginning of years of research, exploring the accounts of hundreds of people who had eerily similar stories.

“I thought it’d be amazing to document this, but if you film this you’d just watch someone else’s experience,” Schnitzer said. “I thought it’d be cool to film something in 3D, but I decided to wait until we could do 3D VR so we could put people into the experience and have that visceral and emotional connection.”

Working with 3D Live Entertainment and Epic Games’ Unreal Engine 4 technology, Schnitzer is now able to give people an accurate recreation of one woman’s near-death experience through the power of virtual reality. The five-minute experience will make its debut at ScareLA on Aug. 5-6 at the Los Angeles Convention Center before being released to consumers at a later date; more information is available at www.flatlinevr.com.

Although the first imagery that comes to mind with “near-death experience” is that of a camera floating above the body and looking back down, that’s not the focus of this first Flatline VR episode.

“There are so many different variations of the stories, but entering a vortex is a common theme for people from all different cultures and locations around the world,” Schnitzer explained. “That’s actually where the phrase ‘seeing the light at the end of the tunnel’ came from.”

The journey the Flatline VR experience follows is that of Gloria, a young woman in the 1950s who suffered a miscarriage in a wing of a large hospital and was left alone while she bled out.

“They put her in a room and forgot about her in a wing that wasn’t busy, and she was screaming for help for days and nobody heard her,” Schnitzer recounted. “She lost consciousness and died and went through this Flatline experience. When she woke up there were doctors at the foot of her bed. Her husband was in the military and he told her not to tell this account to anybody, so it wasn’t until decades later that she wrote about it in a letter to somebody.”

The words written in that letter are the exact words that actress Mella Leigh recorded for this VR experience. In another eerie coincidence, Leigh herself had a direct connection to this project.

“Mella had just had a near-death experience in a car wreck right before we approached her with this, and her story had a lot of similarities to Gloria’s story,” Schnitzer said. “I definitely get the chills when I hear Mella’s voice as Gloria.”

While Schnitzer doesn’t want to spoil the experience by offering a complete play-by-play, he does admit that Gloria spoke of being pulled down into a spinning vortex.

“That’s what hooked me,” Schnitzer said. “We wanted to do a vortex different than the way you’re used to seeing one. The vortex accelerates and closes in, and it has textures and colors to it that come from all the different years of talking to people who have had different near-death experiences, as well as speaking to the scientists that explain why people are seeing these images, and what types of things you’re actually seeing.”

Beyond the expansion and the extraction of the color spectrum, the feeling of falling and spinning and turning, and the 360 audio experience that’s been designed to disorient you, there’s a lot more to the experience, including an element that makes it worthy of a horror festival. But that’s for people to try first-hand.

“She saw something come after her in the vortex, and how you interact with that is interesting and thrilling and controversial,” Schnitzer added.

The set-up at ScareLA will have participants enter the back of an ambulance and lie on a gurney, where an HTC Vive is placed on their head. Although the later home version won’t come with an ambulance, it will offer a built-in replay feature.

“In addition to Gloria’s version of the account, we have three different commentary tracks where experts are explaining why different parts of her story are happening,” Schnitzer said. “Each one is a very different point-of-view that goes into the science and what’s typical for these types of experiences.”

The entire experience was created in four months by a team of 12 people, including the audio team. It marked a collaboration between Schnitzer’s The Brain Factory and 3D Live, the company behind the Mass Effect 4D ride at Great America, built with Electronic Arts, BioWare and Cedar Fair.

And if all goes according to plan, this near-death experience is just the beginning for Flatline VR.

“This was just the pilot episode,” Schnitzer explained. “When I met my friend 16 years ago who had a near-death experience he shared a story that was so epic that I knew I couldn’t pull it off for the pilot episode. We created this gateway episode, which is really emotional and powerful. And hopefully this episode will open the door for us to make the other episodes that I’ve been dreaming about for over a decade.”

Zero Latency Shoots For Multiplayer VR Arcade Network With 24 Locations In 2017

Over the past three years, Melbourne, Australia-based startup Zero Latency has been refining its multiplayer virtual reality arcade platform, which currently has three playable games for up to six players with plans to add eight-player support by the end of this year.

The idea came to co-founders Tim Ruse (who’s now the CEO) and Scott Vandonkelaar (CTO) when the two first saw the Oculus Rift while working together at web agency Roadhouse Digital.

“The Rift had just come out and there was also this craze with IRLShooter’s Patient Zero, which combined laser tag with real actors and a zombie storyline, and the idea was to put these two things together and create a new form of entertainment from scratch,” Ruse explained.

The pair left their secure jobs and turned to crowdfunding to kickstart the idea, raising $30,000 on Pozible in July 2014. Ruse said that while this was hard money to raise, crowdfunding allowed the startup to get the message out to VR enthusiasts who were eager to step into virtual worlds and fight cooperatively. This also opened the door for traditional VR investment.

By August 2014 the company had a 2,200 square foot system operating with two players in a completely tetherless freeform multiplayer environment complete with pistols. A year later, the team had upgraded the platform to six players combating virtual reality zombies with assault rifles in the Unity-developed Outbreak game.

Fast forward to today and the company’s development team has created three new games: Survival, a defend-the-fortress style zombie horde experience featuring a squad of heavily armored soldiers, Singularity, which pits a team of space marines against rogue robots on a massive spaceship, and Engineerium, a puzzle-based exploration of a gravity-defying ancient alien world.

Zero Latency has a team of 10 people plus some external contractors working on new titles as well as downloadable content for existing experiences. Vandonkelaar said it takes between seven and nine months to create a game. A new title will be introduced for the platform this August, which will expand the zombie concept to a longer, more exploratory experience.

“We’re a one-stop shop right now,” Vandonkelaar said. “While we’re not game designers ourselves, we’ve tapped into the local game development community. And we’ve spent a lot of time developing and exploring free roam VR. Our goal is to create 15 or 30-minute experiences that people of all skill levels can enjoy and get people through the system in a timely manner.”

By this June, Zero Latency will have 10 sites across the globe and 24 by the end of 2017, including U.S. locations in Orlando, Boston, Philadelphia, Wisconsin and the Poconos. The company owns several facilities, including its Melbourne location, and licenses out its platform through partnerships with companies like Main Event Entertainment, which hosts the VR arcade inside its Pointe Orlando arcade and entertainment center.

Hands-on With Zero Latency

I was able to go hands-on with the three current Zero Latency games in Orlando, which operates under the name VPlay Reality. The experience begins with meeting a Game Master, who will first walk you through all of the equipment and then serve as your guide inside of the 4,000 square foot arena. It takes about 8 minutes for the Game Master to walk you through the vest and backpack (which houses an Alienware PC gaming laptop), OSVR HDK2 headset and Razer headphone/microphone before running through the two-handed assault rifle (another 7 minutes).

While nowhere near as heavy as a real gun, the assault rifle is a sturdy handful with a button on the bottom for reloading and a button on the side to change the in-game configuration to any of four guns. Two of those guns are automatic, while the other two require individual cocking with every shot. It’s these latter in-game guns (which vary from game to game) that will leave your arm sore the next day, because it’s quite a workout when zombies are swarming you or robots and drones are surrounding you in-game. You get more points for killing with the shotgun-style weapons than you do with the automatics.

Vandonkelaar said these custom-made guns use a mixture of optical and sensor-based tracking, which allows them to operate independently within any game inside the arena. Patent-pending technology using more than 100 cameras and motion capture devices tracks each player in real time as they move.
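Zero Latency hasn’t published how its patent-pending system combines the readings from those 100-plus cameras, but the general idea behind multi-camera tracking can be sketched as a confidence-weighted average of each camera’s position estimate. The function name, data shapes and weights below are purely illustrative, not the company’s actual method:

```python
# Illustrative sketch of multi-camera position fusion: each camera that
# sees a player's marker reports an (x, y) estimate plus a confidence,
# and the estimates are blended into one weighted-average position.
def fuse_positions(estimates):
    """estimates: list of ((x, y), confidence) tuples, one per camera."""
    total = sum(conf for _, conf in estimates)
    if total == 0:
        raise ValueError("no camera currently sees the player")
    x = sum(px * conf for (px, _), conf in estimates) / total
    y = sum(py * conf for (_, py), conf in estimates) / total
    return (x, y)

# Three cameras report slightly different positions for one player;
# the fused result sits between them, pulled toward the most confident view.
print(fuse_positions([((1.0, 2.0), 0.9), ((1.2, 2.1), 0.6), ((0.9, 1.9), 0.5)]))
```

A real system would also filter the fused positions over time to smooth out jitter, but the weighting step is the core of turning many noisy camera views into one stable player location.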

“You can pass them between players; you can wield two guns in the game,” Vandonkelaar added. “We’ve also used dual pistols before as part of our crowdfunding campaign. But we’re focused on the rifles due to the safety aspect. If you give people pistols they’re more likely to wave guns around and hit another player.”

Part of the pre-brief explains the radar system that pops up in-game when any two (or more) players are close together. It offers a top-down view showing where the other players are, so it’s easy to move away from them and not bump into anyone in the heat of battle.
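The article doesn’t describe how the radar decides when to appear, but the underlying check (flag any pair of tracked players closer than some safety distance) is simple to sketch. The 1.5-meter threshold and function name here are assumptions for illustration, not Zero Latency’s actual values:

```python
import math

# Hypothetical proximity check: return the pairs of players who are close
# enough that an in-game radar warning should pop up for both of them.
def close_pairs(positions, threshold=1.5):
    """positions: dict mapping player name -> (x, y) position in meters."""
    names = sorted(positions)
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ax, ay = positions[a]
            bx, by = positions[b]
            if math.hypot(ax - bx, ay - by) < threshold:
                pairs.append((a, b))
    return pairs

print(close_pairs({"ana": (0, 0), "ben": (1, 0), "cho": (5, 5)}))
# → [('ana', 'ben')]
```

With eight players the pairwise check is only 28 distance tests per frame, which is why this kind of collision warning can comfortably run alongside the game itself.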

Once your team of six is all geared up, it’s time to enter the huge arena – which is essentially a pitch black room that the Game Master guides you into with a flashlight and then spaces you out. It’s inside where you place the rifle on the floor in front of you and pull down the VR headset and headphones. The Game Master turns on the game, which allows you to see and then pick up your rifle. And then you select a circle with your screen name above it for the actual game to load.

After playing a lot of Oculus Touch and HTC Vive games, it is a bit jarring to not see your hands inside of VR. This platform focuses on the gun and allows you to walk and run around, but you don’t have any form of gloves to bring your hands into the experience (at least not yet). And for anyone who plays a lot of home games on those platforms, Unity can only go so far inside of VR, so the graphics of these first-generation games are certainly solid, but they’re not raising the bar.

Vandonkelaar did say their platform will support Unreal Engine 4 and other technology beyond Unity. And Zero Latency is working on a Software Development Kit to allow developers to create original games for this platform.

What really separates this multiplayer Zero Latency experience from anything at home, and from some of the other VR arcades popping up around the world, is the role of the Game Master. Ruse said the company fills this position by hiring only passionate, heavily tech-skilled people who are friendly and great with customer service.

“Early on we found they’re integral to the experience because people have a one-to-one connection with them and the Game Master brings the group into the experience,” Ruse said. “We want to give them more control, where they can actually control your destiny like in the old Dungeons and Dragons games. They bond with them and they’ll know how skilled a group is and will be able to choose different options to cater the gameplay experience to that particular group.”

Zero Latency is already testing this concept in Melbourne with its frequent players, so it’s only a matter of time before it rolls out to the rest of the world. I saw first-hand where this customization could make the gameplay better. While there were five other players ready to take on zombies with me in Orlando for 15 minutes, only one other player was interested in taking on robots in the longer 30-minute Singularity game.

Before we jumped in, our Game Master warned us that things would get pretty crazy in the game, which was designed for six players. If she could have tweaked things, the gameplay would have been more fun. While I didn’t die once in the zombie game thanks to teamwork and a full squad, I died 19 times and my teammate Enrique died 15 times when we were left to fend for ourselves in a game designed for six players.

All three of these VR games are different, which opens the door for repeat visits. I personally enjoyed killing zombies the most because you just can’t go wrong there. Zombies are just part of the timed onslaught that will come at you in the burned-out city environment at night. You’re given some time between rounds to bolster your defenses before the monsters tear them down from every angle (including above) to kill you. The action is fast and frenetic and all confined within a fixed area within your fortification. There are a pair of elevators for snipers to take a higher position to eliminate enemies, but I stayed close to the ground throughout the experience (although everyone goes airborne at the end when a military chopper picks up your platform to escape the zombies).

Singularity is the longest (30 minutes instead of 15) and most expensive ($40 instead of $20) of the three games. It’s also likely a game that’s a lot more fun with the full six players engaging the robots, flying drones and AI gone mad. The interesting part of this sci-fi shooter is how easy it is to maneuver throughout a VR world, while also walking in real-life. Through the use of elevators, tight corridors and moving platforms, your mind does believe you’ve traversed a massive spaceship by the end of the boss battle. There’s also a mind-bending zero gravity walk-on-walls sequence as well as a vertigo-inducing battle across a narrow ledge. The game splits the teams in two during the middle portion and then reunites them for the final showdown.

A Puzzler That Isn’t For People Afraid of Heights

Speaking of vertigo, Engineerium, the newest of the games, isn’t for anyone who has problems with heights. Set high above an alien world, the paths and platforms you explore twist and turn in all directions. There are times where you’re upside-down, which takes some getting used to considering you’re walking right-side-up in the real world. It’s truly an otherworldly and mind-tripping experience.

What’s most impressive about this platform is how seamless it is to shoot or explore or do whatever your in-game experience requires. Your mind can completely focus on having fun inside these game worlds without worrying about running into other players. Once Zero Latency ups the ante in the visuals department with these first-generation games (and maybe adds some type of haptics to the vests, something the company is exploring), I can see this evolving into the type of experience you once only read about in sci-fi novels.

The team is expanding the number of players for future games to eight. Ruse said 16 players can interact inside the arena in Melbourne today, but they’re capping the number of players because of safety concerns and to ensure the games have enough room to maneuver so that it’s fun for everyone.

The Future of Play: How Intel And The ESL Are Helping Bring Virtual Reality To ESports

While traditional eSports aren’t going away, many companies are building the foundation for a virtual reality subset. The current eSports numbers are staggering: according to Newzoo, of the 1.3 billion gamers worldwide, 256 million are eSports fans today, a number expected to grow to 385 million this year. ESports generated over $493 million in revenue last year and are expected to jump to $696 million this year.

When you look at the gaming landscape, the only thing as “hot” as eSports is virtual reality. Newzoo forecasts global virtual reality and augmented reality will generate $569 billion by 2025. And gamers will be a big part of that revenue, with projections of $100 billion spent on VR hardware by 2018.

Record-Breaking Numbers

So it should come as no surprise that the world’s largest eSports company, The Electronic Sports League (ESL), and one of the giants in both the tech and gaming markets, Intel, are already laying the groundwork for virtual reality eSports. The eleventh season of the Intel Extreme Masters (IEM) eSports tournament just concluded in Katowice, Poland over two sold-out weekends this month. Over 173,000 people attended and another 40 million people tuned into the livestreams across Twitch, Twitter, and a dozen television networks globally. IEM Season 12 will kick off its year-long tournament tour schedule on May 6 in Sydney, Australia.

ESL and Intel partnered with Sliver.tv to broadcast IEM Katowice in 360-degree video as part of a 2017 contract that will include seven global eSports events across ESL One and IEM. At Katowice, the broadcast included a first-person virtual eSports stadium experience delivering an immersive 360-degree VR space with live stats, replays, and scores in real time. The VR stream saw 200% growth in peak concurrent viewers compared to IEM’s first virtual reality live stream in Oakland, with 340,000 total unique viewers tuning into the VR broadcast.

The tech companies first experimented with 360-degree broadcasting across Counter-Strike: Global Offensive, League of Legends, and Dota 2 at ESL One New York and IEM Oakland last fall. More than 130,000 unique viewers tuned into the IEM Oakland VR streams. Sliver.tv has a separate deal with DreamHack to bring seven of its global 2017 eSports events to fans in 360. That deal kicked off with Dreamhack Vegas last month, which means eSports fans will be able to virtually attend 14 events this year using any VR headset.

Frank Soqui, general manager of the enthusiast desktop group at Intel, told UploadVR that Intel has invested in companies like Voke and Replay because fans of all sports want to look around and enjoy a 360-degree experience.

Redefining How Viewers Enjoy eSports

“We want to bring the audience into the immersive experience from a VR perspective through apps like Sliver.tv,” Soqui said. “Just because existing eSports games like League of Legends and CS:GO aren’t native VR games doesn’t mean we can’t use Sliver.tv to get people inside. We believe eSports will quickly evolve from watching competition from a flat screen perspective and will include virtual reality. I don’t know how many games will start taking existing designs and move to VR, but a lot more games will show up inherently designed with VR in mind.”

Intel actually showcased several of these potential VR eSports titles in Katowice, including a game being developed in Warsaw, Poland by HyperVR called Hyper Arena. While the developer had a 1 vs. 1 version of the TRON-inspired, disc-based HTC Vive game playable at the Intel Showcase at the Katowice International Conference Center, Lukasz Kur, founder and general director at HyperVR, said the studio is creating additional levels that add new locales as well as a variety of weapons to the mix. The ultimate plan, according to Kur, is to release the game in 2018 with 2 vs. 2 gameplay. Beyond that, Kur would like to expand into a 5 vs. 5 experience, which he believes will be perfect for team-based eSports. The game is being designed to allow spectators to sit and watch the physical competition inside the virtual arena.

“Just imagine a full stadium of people who are watching gladiators rumble inside a virtual arena, when not only reflex and concentration matters but also physical muscle strength, agility, balance and creativity to finish off your opponent with style,” Kur said. “Hyper Arena VR is a perfect balance between sport and eSport.”

Intel is also showcasing a number of other eSports titles that could find a place in the tournament this year or beyond. Insomniac’s Oculus Touch spellcasting game, The Unspoken, made its second straight tour stop in Katowice (following its debut at IEM Oakland), alongside Ready at Dawn’s Lone Echo and Croteam’s Serious Sam. These games were featured in eSports tournaments open to the public in Katowice, complete with prizes.

“These games were developed with VR in mind and we’re starting to see the eSports angle emerge,” Soqui said. “Developers are starting to think about what kinds of VR games they should be creating that incorporate eSports fans into the experience. With Sliver.tv we at least have the audience inside of the game, but now we’re starting to see developers create games specifically for VR eSports.”

Intel also hosted VR games throughout the two-weekend event, which featured a VR Festival Day on March 5. Vertigo Games’ zombie shooter Arizona Sunshine, Ubisoft’s Star Trek: Bridge Crew and Survios’ cooperative shooter Raw Data were among the titles playable for visiting eSports fans who attended from across Europe.

According to Ralf Reichert, CEO of ESL, two things will happen in the coming years as eSports evolves. One is that almost every game will have a competitive online aspect to it. And the other thing is there will be growth in the diversity of games.

“There’s a very small number of top games that people play today, but that will grow to include more games,” Reichert said. “And more professional eSports teams will be playing different types of games. Some of those games will be in VR, where you play standing with a controller and other input mechanisms that we haven’t even invented yet. The viewing experience could change as spectators wear VR headsets. It’s going to be fascinating to see how this all develops over the next 20 years. Like everything in gaming, it changes quicker than anything else does.”

Getting Active

Soqui said that to succeed in eSports, VR games have to be really compelling from a viewing perspective, like the giant tournaments that CS:GO, League of Legends and Dota 2 attract.

“I expect to see a lot of experiments and small local eSports things spring up,” Soqui said. “How fast it gets to that depends a lot on the fan base and how immersive it is. But you can see developers already interested. The great thing about VR is that it can bring new players into the market, and introduce a new audience to eSports.”

Lee Machen, director of developer relations at Intel, said one role IEM will play moving forward is ensuring that everyone has a chance to experience VR around the world.

“People who try VR are usually blown away by the experience,” Machen said. “There are a few things that have limited the growth of VR to date, and one of them is how to get more people to have that first ‘Oh my God’ VR experience.”

Intel showed off its Project Alloy wireless VR head-mounted display to eSports fans; that technology debuted at CES 2017. Soqui believes Intel’s WiGig technology, which debuted at Mobile World Congress, will also find a place at IEM moving forward, and that tetherless VR tech could free up more competitive eSports play inside virtual reality in the near future.

As ESL and Intel map out the global stops for the 2017-18 tour schedule, VR will be a mainstay for eSports fans to play games and watch livestreamed eSports from the arenas, and it could potentially become the future of eSports competition, at least on the smaller tournament stages for now.

Nearpod Raises $21 Million To Further Virtual Reality Education

Edtech startup Nearpod has closed a $21 million Series B round led by Insight Venture Partners, Reach Capital, GSV Acceleration, Krillion Ventures and AGP Miami. The company has now raised $30.2 million to further its reach across the education landscape.

Guido Kovalskys, CEO and co-founder of Nearpod, told UploadVR this latest round of funding will support demand from teachers seeking VR and interactive content, ramp up the company’s team and expand its portfolio with more classroom-enriching products like Nearpod ELL and 3D objects.

More than 10,000 U.S. schools in hundreds of districts use Nearpod, as do many schools overseas. That means roughly one in 10 U.S. schools has already adopted virtual reality into its curriculum.

The company has created over 100 pre-made lessons that incorporate VR experiences, including trips to Mars, the Great Pyramids and U.S. historical landmarks. The lessons span subjects from U.S. history and digital citizenship to algebra and science.

“There are more than 4,000 ready-to-teach interactive lessons for all K-12 grades and subjects created by our community of educators and partners and available now in the Nearpod Library,” Kovalskys said. “There are millions of VR images that teachers can use to make their own lessons as well.”

At Miami-Dade public schools, one of the largest English Language Learner (ELL) districts in the country, the company recently launched Nearpod for ELL to provide teachers with more than 500 lessons designed specifically for the fastest growing student demographic: non-native English speakers.

“Because we provide digital content they can edit, teachers can adjust their lessons to easily accommodate the diverse array of languages that are in their classrooms,” Kovalskys said.

Nearpod VR is compatible with any digital device: smartphone, tablet, laptop, Chromebook, etc.

“The reason we’ve had such strong success with Nearpod VR — millions of students have used it in the last year — is because we’ve started with a system that works on anything,” Kovalskys said. “Schools don’t have the budget for an Oculus Rift or Google Explorer Kit, and honestly, the potential of smartphones is only just beginning to be realized. Eighty percent of U.S. teens have a smartphone, but most schools don’t allow them to be used as part of instruction.”

Nearpod employs a multi-tiered business model. Free teacher accounts allow educators to use the app and create polls, interactive quizzes, open-ended questions, slideshows and more. A gold account, priced at $10 per month, adds the ability to assign homework, receive detailed reports and integrate multimedia into the lessons they create.

“We have volume discounts for schools depending on the number of users that allow administrators to manage users school-wide, create shared content libraries and access advanced reports,” Kovalskys said. “And we have district-level editions that provide all of the school-level benefits plus on-site training and school/user management.”

“Nearpod is in the enviable position of having created a product that is both beloved by teachers and students and sustainably monetizable,” Brad Twohig, managing director at Insight Venture Partners, said in a statement. “It’s a rare feat in the education industry to find both traits in a single company, and in that regard, Nearpod is in a class of its own.”

Nearpod’s VR content is produced by partner 360 Cities. Kovalskys said the connection to students is in the tight integration with pedagogically-sound, complete lessons taught by experienced classroom teachers.

“Because of our strategic cross-platform approach, teachers are able to incorporate VR in their classrooms without any additional investment,” Kovalskys said. “If they want to use a headset, sure it’s more immersive, but it’s not necessary.”

Kovalskys said the demand for innovative classroom solutions like Nearpod is exploding thanks to increasingly affordable devices and the realization by educational decision makers that technology increases student engagement and can improve learning outcomes.

“Teachers tell us that VR is an effective way to engage students and literally expand their conception of the world and what it contains,” Kovalskys said.

Nearpod was recently recognized by the Inc. 500 as one of the fastest-growing private companies, with a three-year revenue growth rate of 1,320 percent.

“We’re focused on steady, sustainable growth over the next five years, but aren’t sharing specific forecasts,” Kovalskys said.


Chevy’s augmented reality test drive puts you behind the wheel of your dream car

Epic Games continues to expand the reach of video game technology by partnering with Chevrolet on a new project that uses augmented reality to transform a custom Blackbird vehicle into any car. In addition, that same Unreal Engine 4 technology drives a new Chevy car customizer.


GDC 2017: Epic Games Unreal Engine VR Editor Coming in April With New Features

Epic Games is using the Game Developers Conference (GDC) to give an advanced preview of the latest additions to its Unreal Engine VR Editor, which allows creatives to build worlds in a virtual reality environment using the full capabilities of the editor toolset combined with interaction models designed specifically for VR world building. The goal is to officially launch the new VR Editor by April 17.

Mike Fricker, technical director at Epic Games, told UploadVR that working directly in VR provides the proper sense of scale necessary to create realistic, believable worlds, while the use of motion controllers means artists and other non-programmers can build environments with natural motions and interactions.

Epic’s own Robo Recall team used the VR Editor to build out the free pack-in game for the Oculus Rift with Touch, which also makes its complete debut at GDC this week.

“As soon as they started using it, they realized what the most beneficial use cases were to them,” Fricker said. “Inspecting and tweaking was one of them, but sometimes they just want to throw in things really quickly and see it at scale without having to constantly take the headset off and on.”

The Robo Recall team had a direct impact on the new VR Editor that everyone will have access to in April. Fricker said the team needed small power-user features, like the ability to snap objects to the ground instantly without having to grab them from a menu and move them down manually.

“They asked us to give them the power to use these additional features so that they can stay in VR longer,” Fricker said. “That’s not to say that we’re trying to replace desktop. If they’re going to go and do blueprint scripting or material editing, you can get to that stuff in VR and you can make some progress if you knew you were going to tweak something or make a quick change to something. If you’re going to develop a function library or a new game system, you’re probably not going to do that in VR today. But the fact that you can go and see it and inspect it without having to leave VR, that’s the feedback that we got from the team.”

Developing inside VR not only opens things to all members of a team, it also speeds up the development process.

“It’s much faster to navigate a scene in VR than it is with the desktop, where you’re constantly using the combinations of the mouse and keyboard and modifier keys to orbit around an object and zoom the camera around,” Fricker said. “In VR, it’s one-to-one. I know exactly where I’ll end up at any point. Once you get used to it, it’s super fast.”
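The one-to-one navigation Fricker describes can be thought of as a "grab the world" mapping: the scene translates by the inverse of the tracked hand's motion, so the user always knows exactly where they will end up. Below is an illustrative Python sketch of that idea, not Unreal Engine code; all names are hypothetical.

```python
# Hypothetical sketch of one-to-one "grab the world" VR navigation:
# the world offset moves by the inverse of the hand's displacement.

def grab_navigate(world_offset, hand_start, hand_now):
    """Translate the world by the inverse of the hand's movement."""
    delta = [now - start for now, start in zip(hand_now, hand_start)]
    # Pulling the hand toward you drags the world with it (inverse mapping).
    return [w - d for w, d in zip(world_offset, delta)]

# Moving the hand 0.5 m forward shifts the world 0.5 m back, one-to-one.
offset = grab_navigate([0.0, 0.0, 0.0], [0, 0, 0], [0, 0, 0.5])
```

Because the mapping is direct rather than rate-based (as with orbit cameras), there is no acceleration to estimate, which is why it feels predictable once learned.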

Lauren Ridge, tools programmer at Epic Games, said Epic has put in safeguards to ensure developers don’t get sick working within VR. For example, you can only rotate toward one direction at a time. Not a single Epic user has had motion sickness problems while in the VR Editor at the studio, where high-end PCs ensure a fast framerate.

“We have various levels of safeguard settings that will do things like turn on a grid for my tracking space or dissolve the sky into grayness,” Ridge said. “For example, in real life, I don’t have the ability to grab the world, turn it like a steering wheel and see the sky change. To some people, that’s instantly not good, so we’ve looked at all the different cases people have and added safeguards for them. You also can’t tip yourself over.”

Ultimately, the VR Editor has been designed to allow creatives to do whatever they want. Epic showcased a complicated scene set on a beautiful beach during its GDC Keynote, which includes a surfing mini-game as well as a sea plane flying overhead. Moving the plane to a higher altitude is done in seconds by grabbing the plane and moving its trajectory.

“We’ve been improving things since last year, which was the equivalent to our early access,” Fricker said. “We know that navigating 3D spaces is really fun and fast in VR, so that’s another cool thing that we’re excited about.”

The GDC beach demo also shows how easy it is to access the Unreal editor UI in VR to change settings or change what types of plants you’re painting down for foliage painting. The brush has been improved and makes things like undo and redo more accessible with a quick action.

Simulate mode allows developers to see how objects act when physics are attached. Ridge shows rocks of different sizes accurately falling off a cliff that overlooks the beach.

“This means you can use physics as an art tool,” Ridge said. “When you move the rock around gravity will act on it. You can also trigger gameplay events.”
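Conceptually, simulate mode just steps objects under gravity until they come to rest, letting artists place props "naturally." A minimal Python sketch of that idea (simple Euler integration, not Unreal's physics engine):

```python
# Minimal sketch of "physics as an art tool": drop an object under gravity
# with Euler integration until it reaches the ground plane.

G = -9.81  # gravitational acceleration, m/s^2

def settle(height, ground=0.0, dt=0.01):
    """Drop an object from `height` and return its resting height."""
    z, vz = height, 0.0
    while z > ground:
        vz += G * dt   # accumulate downward velocity
        z += vz * dt   # advance position
    return max(z, ground)  # clamp so the object rests on the ground
```

A real engine would add collision shapes, restitution and friction so rocks bounce and roll, but the placement workflow is the same: run the simulation, then freeze the result as the object's final transform.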

The demo shows accurately built wooden 2x4s being snapped together into a staircase for a wooden hut on the beach.

“We also added more precise snapping tools,” Fricker said. “That’s about having things look organic and natural, but we also wanted a way to have really precise interactions with objects.”

Epic is taking advantage of VR, which offers more degrees of freedom with motion controllers than a traditional mouse and keyboard.

“If I paint using different pressure on the trigger of the motion controllers, it’ll paint different strengths of the rock material down,” Ridge said. “This is cool because the editor already had various painting and fluid creativity features, but then being able to use those with motion control suddenly made them way more accessible. I can instantly get the bird’s eye view and see how it looks all in the scene and then jump down to see the player’s view of it to make any changes.”
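The pressure-sensitive painting Ridge describes amounts to mapping the analog trigger axis (0.0 to 1.0) onto brush strength. A hedged sketch of one plausible mapping; the curve and names are assumptions, not Epic's implementation:

```python
# Hypothetical mapping from analog trigger pressure to paint strength.
# An ease-in curve: light squeezes paint faintly, a full pull paints hard.

def brush_strength(trigger, max_strength=1.0, exponent=2.0):
    """Convert a raw trigger axis value into a brush strength."""
    t = min(max(trigger, 0.0), 1.0)   # clamp noisy controller input
    return max_strength * (t ** exponent)
```

An exponent above 1 gives finer control at light pressure, which matters for painting soft material blends; a linear mapping (exponent of 1) would work too.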

Epic has also partnered with Disney’s Pixar Animation Studio to have Unreal Engine 4 and the VR Editor support Pixar’s Universal Scene Description (USD) 3D graphics pipeline. Epic showed the coral reef from Finding Dory and characters Crush the sea turtle and Mr. Ray the manta ray running in UE4.

“The cool thing here is that we don’t need any other separate tools to go from USD to what you’d see on screen with this demo,” Fricker said. “USD is a pretty big deal to the film industry and other non-gaming uses, but it has some special powers that make it equally awesome for games too.”

Pixar wants to add more plug-ins for creatives beyond Autodesk Maya, so UE4 now opens up new opportunities for companies working in VR.

“As more plug-ins appear, more people will begin using this format,” Fricker said. “USD has a really elegant set-up for just describing a scene in its entirety with all the information you need to uniquely instance specific things along with dealing with complex animation.”
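The instancing idea at the heart of that description can be illustrated in plain Python (this is a toy model, not the actual pxr/USD API): a scene file defines a heavy prototype once, then references it many times with lightweight instances.

```python
# Toy illustration of scene-description instancing: prototypes carry the
# geometry, instances are cheap references with their own transforms.

scene = {
    "prototypes": {"Rock": {"mesh": "rock.obj", "polys": 1_000_000}},
    "instances": [
        {"proto": "Rock", "pos": (x * 2.0, 0.0, 0.0)} for x in range(100)
    ],
}

def total_unique_meshes(scene):
    """Only prototypes hold geometry, however many instances exist."""
    return len(scene["prototypes"])
```

A hundred rocks on screen cost one mesh in memory, which is why formats built this way scale to film-sized environments.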

“We know the film industry will like it,” Ridge added. “We will increasingly use USD here. Hopefully, we will keep working with Pixar to make it awesome for every use case we can imagine. Right now we are working on USD import, but at some point we will probably be able to generate USD files as well.”


GDC 2017: Epic Games Teams With Chevrolet On New AR Car Project

Ever since Epic Games opened its Unreal Engine 4 technology to the world, new use cases have been coming in fast and furious. The latest example of just how far real-time video game engine technology has come will be on display at GDC 2017 this week. Epic Games partnered with visual effects and creative content studio The Mill to shoot a new Chevrolet video that utilizes UE4 technology and an augmented reality Blackbird motion tracking vehicle.

That electric car is the brainchild of The Mill. And before the top-secret vehicle was revealed, Epic Games CTO Kim Libreri got an early look under the hood.

“This high-tech car can run any performance envelope of any car, and it has all the electronic equipment that you need to be able to track it, so you know where it was relative to the camera car, and also generates the light and reflection information that you need to be able to light a computer-generated car,” Libreri said.

Libreri met with Vince Baertsoen, the head of R&D at The Mill, last year in Germany at FMX 2016. At the time, the one challenge Baertsoen had was that the director filming the car was still seeing the Blackbird and the transformation into whatever car they wanted it to become occurred in post production. The Holy Grail was for those shooting the sequence to see the final version of the vehicle in real-time.

At GDC, Epic is showcasing a 2017 Camaro ZL1 in a race against the Chevy FNR concept car, except the vehicles are actually photorealistic pixels running in real-time using UE4 technology. To prove the point, the ZL1 can be swapped out for a classic ’69 Camaro.

“Those cars are built like we would do a video game asset,” Libreri said. “Right now, it’s a specialized version of Unreal because we’ve just put the demo together, but these are features that are going to be available in regular Unreal. The only difference between this and a car that you would put in a video game is the amount of polygons in the car. We actually have a couple levels of detail to the car. The one that you see in the video is comprised of millions of polygons. We also have a low resolution version that would be a more normal game-level asset that would run on a PlayStation 4. The materials and lighting and most of the things you see in the video would run on a console in a more regular video game environment.”
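The levels of detail Libreri mentions are typically chosen by camera distance: the film-quality mesh up close, cheaper versions farther away. A sketch of that selection logic, with illustrative thresholds rather than Epic's actual values:

```python
# Sketch of level-of-detail (LOD) selection by camera distance.
# Thresholds and names are illustrative, not Epic's values.

LODS = [
    (0.0,   "film_quality"),   # millions of polygons, used for the video
    (50.0,  "game_quality"),   # console-level asset (PlayStation 4 class)
    (200.0, "far_proxy"),      # cheap stand-in at long range
]

def pick_lod(distance):
    """Return the coarsest LOD whose threshold the distance has passed."""
    chosen = LODS[0][1]
    for threshold, name in LODS:
        if distance >= threshold:
            chosen = name
    return chosen
```

Engines usually also blend or hysteresis the transition so the swap isn't visible as a pop, but the core decision is this threshold walk.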

These virtual vehicles were superimposed on top of the Blackbird during a live-action shoot on the Angeles Crest Highway in Los Angeles. The Blackbird uses professional-grade AR capture, filming 360-degree video from the center of the vehicle using Red Epic cameras. Everything around the vehicle is captured as panoramic 360-degree photography, while a spinning LiDAR scanner maps the surrounding environment.

“They take the output from these four cameras, stitch it into a panorama, and then beam it to Unreal Engine wirelessly,” Libreri explained. “And then we take that as lighting and reflection information that we can place on top of a car that they’ve tracked with a real-time tracking system developed by partner company Arraiy.”
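Using a stitched panorama as lighting and reflection data relies on the standard lat-long (equirectangular) mapping: any 3D direction a reflection ray travels in can be converted to a pixel coordinate on the panorama. This is the general technique, sketched in Python, not The Mill's actual pipeline:

```python
import math

# Map a 3D direction to (u, v) in [0,1]^2 on an equirectangular panorama,
# the standard lat-long lookup used for image-based lighting/reflections.

def direction_to_uv(x, y, z):
    """Convert a unit direction vector to panorama texture coordinates."""
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)           # longitude
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi   # latitude
    return u, v
```

Sampling the panorama at these coordinates for each reflection direction is what lets a CG car pick up the real road, sky and trees around the Blackbird.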

Before joining Epic in 2014, Libreri spent 20 years working in the Hollywood visual effects industry at companies like Lucasfilm, Industrial Light & Magic, and Digital Domain. These vehicles are inserted into the live action compositing background plates, which are equivalent to the kinds of images ILM would use.

The director of the 60-second video sits inside a customized Mercedes ML fitted with a Russian Arm that can film the Blackbird from any angle. Inside the Mercedes, he can watch the UE4-generated vehicle in real-time and make filming adjustments on the fly. A PC with a high-end consumer NVIDIA graphics card is set up inside the Mercedes to transform the Blackbird into the Camaro vehicles.

“We’re using some pretty beefy hardware for the demo right now, but that hardware capability is going to be available in the cloud very, very shortly, so you’ll be able to run these kinds of graphics-on-demand projects from the cloud,” Libreri said.

In addition to handling the augmented reality, UE4 is also handling a lot of information simultaneously.

“Each of these shots is an individual shot like you would have in Premiere or Avid, where you can cut backwards and forwards, and trim, and add the audio tracks,” Libreri said. “It’s all running just like you were doing normal visual effects photography, but inside a game engine.”

The Mill officially revealed the Blackbird on stage at GDC during Epic’s keynote. And Chevy also used that event to debut the final version of the race, which offers a wow factor when the vehicles enter a tunnel and go all TRON-like to showcase the real-time visual effects UE4 opens up.

“At every GDC we like to do some project that not only blows people away and inspires them, but shows that together with a customer we take some of the best people on the planet using our technology and make our engine better,” Libreri said. “We do something that people thought was impossible, so that’s why we went to this next level.”

