The VR Job Hub: Big Roles At The Big Hitters

So ends another busy week for VRFocus covering everything at this year’s Electronic Entertainment Expo (E3) in Los Angeles. As with every year there was much that was revealed and much that left us with questions. Even beyond videogames, though, there was still plenty of news, so much that we frankly didn’t have time to cover it all. There’s news across the medical industry, education industry, design industry and a lot more besides that we will get into over the course of the week to come. Because as we always say on VRFocus, immersive technology is in use everywhere.

So, unsurprisingly, there are jobs everywhere too. Here’s a selection of roles and titles currently available in the immersive technology sector that you may well be interested in.

Location Company Role Link
Jacksonville, FL, USA Brooksource AR/VR Developer Click Here to Apply
San Francisco, CA, USA Cybercoders Software Engineer – React Dev for Augmented Reality Click Here to Apply
Plantation, FL, USA OSI Engineering, Inc. Technical Sound Designer for Virtual/Augmented Reality Click Here to Apply
Haifa, Israel IBM Computer Vision & Augmented Reality Researcher Click Here to Apply
Brussels, Belgium Epson Account Manager – Professional Display Click Here to Apply
Mountain View, CA, USA Google Software Engineer, Virtual Reality Click Here to Apply
San Francisco, CA, USA HTC VR Intern – Content & Acquisition Click Here to Apply
San Bruno, CA, USA YouTube Software Engineer, Virtual Reality Click Here to Apply
Vancouver, Canada VRChat Inc Online Community Manager Click Here to Apply
San Francisco, CA, USA Unity Technologies Senior Graphics Engineer (XR) Click Here to Apply
London, UK Facebook Technical Program Manager, Social VR Click Here to Apply

If there was nothing in this week’s feature that was a good fit for you, you can always look at the previous edition of The VR Job Hub.

And if you are an employer looking for someone to fill an immersive technology related role – regardless of the industry – don’t forget you can send us the lowdown on the position and we’ll be sure to include it in the following week’s feature. Details should be sent to me (keva@vrfocus.com) and also Peter Graham (pgraham@vrfocus.com).

Check back with VRFocus next Sunday at the usual time of 3PM (UK) for another selection of jobs from around the industry.

The VR Job Hub: Felix & Paul Studios, Hammerhead VR, Oculus & More

We’re now long past the halfway point of the year and, in fact, about to roll over into August. Time moves fast, especially when you’re dealing with a technology such as virtual reality (VR). In but a handful of weeks we’ll be off to Cologne in Germany for another Gamescom, where no doubt we’ll see many updates to various videogames and hear news of new titles in the works. We may also find out more about some of the hardware in development and see some recent additions, such as the Vive Knuckles controller, in action.

But before we even get to that, there’s SIGGRAPH, which takes place next week.

If you’re excited by what you’ve been reading on VRFocus and are interested in taking the plunge into the industry by joining one of the companies working on VR, augmented reality (AR) or mixed reality (MR), or you’re already engaged in any of the three and looking to switch roles, we have, as usual, a selection from around the world below. Why not see if there’s anything that takes your fancy? A new career could be just a few clicks away.

View the new listings below for more information:

Location Company Role Link
New York, NY, USA YouVisit Unity Virtual Reality Developer Click here to apply
Yorktown Heights, NY, USA IBM Research Staff Member Click here to apply
Montreal, Canada Felix & Paul Studios Application Developer Click here to apply
Montreal, Canada Felix & Paul Studios 3D (Graphic) Developer Click here to apply
Montreal, Canada Felix & Paul Studios Computer Vision Developer Click here to apply
Newcastle, UK Hammerhead VR Lead Animator Click here to apply
Newcastle, UK Hammerhead VR Systems Administrator Click here to apply
London, UK Oculus Product Manager, Social VR Click here to apply
Cork, Ireland Oculus LED Research Scientist, Modeling Click here to apply
Seattle, WA, USA Oculus Developer Relations Engineer, Oculus Platform Click here to apply
Menlo Park, CA, USA Oculus Developer Relations Engineer, Rift Click here to apply
Menlo Park, CA, USA Oculus Manager, Display Engineering Click here to apply
Palo Alto, CA, USA Tesla Simulation Engineer Click here to apply

As always, don’t forget that you can also view the roles in last week’s edition of The VR Job Hub. And if you are an employer looking for someone to fill a role in VR, AR or other related areas of the industry, and want that position featured in next week’s edition, please send details to either me (keva@vrfocus.com) or Peter Graham (pgraham@vrfocus.com).

We’ll be back next Sunday, as usual at 3PM BST with more roles in the VR industry as part of The VR Job Hub.

IBM Watson’s Interactive Speech now Integrated into Star Trek: Bridge Crew

Last month Ubisoft released its biggest title yet for virtual reality (VR) platforms with Star Trek: Bridge Crew. VRFocus reported that the studio planned to add voice commands using IBM Watson integration in a future update; that update has now arrived.

Using IBM Watson’s interactive speech and cognitive capabilities (Watson Speech to Text and Watson Conversation), players can now talk to and interact with the virtual Star Trek: Bridge Crew members for a more realistic experience that mimics the multiplayer mode. The integration will run for an experimental beta period.

Whether players are commanding the U.S.S. Aegis or the U.S.S. Enterprise NCC-1701, IBM Watson services can be used to operate crews consisting of only AI characters or a mix of AI characters and human teammates, with Star Trek: Bridge Crew’s full-body avatars including real-time lip-sync.

Developed by Red Storm Entertainment, Star Trek: Bridge Crew is a team-focused experience with players taking on one of four roles: Captain, Engineer, Tactical Officer or Helm Officer. Each has their own part to play in successfully completing missions.

VRFocus reviewed the videogame, giving it 4 stars, saying: “Star Trek: Bridge Crew definitely appeals to the core fan base. The production values are top notch making Star Trek: Bridge Crew one of those rare VR experiences that feels like a AAA title, and likely part of most VR gamers’ collections.”

Aside from Star Trek: Bridge Crew, Ubisoft has released Werewolves Within and Eagle Flight. During the Electronic Entertainment Expo (E3) 2017 last week it was revealed that the studio had formed a VR partnership with film company SpectreVision to create Transference, announced alongside a new adrenaline-fuelled shooter, Space Junkies.

Ubisoft hasn’t stated how long the experimental beta period will last. As further details are revealed, VRFocus will keep you updated on the announcements.

Hands-on: IBM Watson Brings Voice Commands to ‘Star Trek: Bridge Crew’

IBM Watson, the artificial intelligence platform designed to understand natural language, today launched support for Star Trek: Bridge Crew (2017) across PSVR, Oculus Rift and HTC Vive.

Before the service launched today, lone players could control the ship’s other posts (Engineering, Tactical, Helm) by clicking a few boxes to issue orders. Now a sole captain (or one with a mixed crew of humans and AI) can complete whole missions by issuing commands directly to the non-human-controlled characters using natural language.

image courtesy IBM

Voice commands are enabled by IBM’s VR Speech Sandbox program, which is available on GitHub for developers to integrate speech controls into their own VR applications. The Sandbox, released in May, combines IBM’s Watson Unity SDK with two services, Watson Speech to Text and Watson Conversation.
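For developers curious what that combination looks like in practice, below is a minimal sketch of the same two-service pipeline. It uses IBM’s 2017-era Python SDK (watson_developer_cloud) rather than the Unity SDK the Sandbox itself is built on, so treat it as an illustration of the flow rather than the Sandbox’s actual code; the credentials, workspace ID, audio file and intent name are placeholders, and exact method signatures varied between SDK releases.

```python
# Illustrative pipeline: speech audio -> text -> intent, mirroring the two
# Watson services the VR Speech Sandbox combines. Uses the 2017-era
# watson_developer_cloud Python SDK; all credentials and IDs are placeholders.
from watson_developer_cloud import SpeechToTextV1, ConversationV1

speech_to_text = SpeechToTextV1(username='YOUR_STT_USERNAME',
                                password='YOUR_STT_PASSWORD')

conversation = ConversationV1(username='YOUR_CONVERSATION_USERNAME',
                              password='YOUR_CONVERSATION_PASSWORD',
                              version='2017-05-26')

# Step 1: transcribe the player's spoken order.
with open('voice_command.wav', 'rb') as audio_file:
    stt_result = speech_to_text.recognize(audio=audio_file,
                                          content_type='audio/wav')
transcript = stt_result['results'][0]['alternatives'][0]['transcript']

# Step 2: classify the transcript against a Conversation workspace, which
# returns the player's intent rather than matching literal keywords.
response = conversation.message(workspace_id='YOUR_WORKSPACE_ID',
                                input={'text': transcript})
if response['intents']:
    top = response['intents'][0]
    # e.g. a hypothetical 'engage_warp' intent with a confidence score
    print(top['intent'], top['confidence'])
```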

at the Captain’s chair, image captured by Road to VR

We had a chance to go hands-on at E3 2017 with Star Trek: Bridge Crew embedded with the Watson-powered voice recognition, a feature that’s initiated during gameplay with a single button press. While talking directly to your digital crew does provide some of those iconic moments (“Engage!” and “Fire phasers!”), and most orders went through without a hitch, Watson still has trouble parsing some pretty basic things. For example, Watson doesn’t understand when you use the names of ships, so “scan the Polaris” just doesn’t register. Watson also didn’t pick up on a few things that would seem pretty easy at face value. Commands like “fire on the target,” “fire on the enemy,” and “come on, let’s warp already!” fell on deaf digital ears.

IBM says their VR speech controls aren’t “keyword driven exchanges,” but are built around recognition of natural language and the intent behind what’s being said. Watson also has the capacity to improve its understanding over time, so commands like “Let’s get the hell out of here, you stupid robots!” may actually register one day.
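As a rough illustration of that distinction, the Conversation service is trained on example phrasings grouped under an intent, and it then classifies new utterances against those groups instead of matching exact words. The sketch below registers a single hypothetical ‘fire_weapons’ intent through the 2017-era Python SDK; the workspace ID and every phrase are invented for illustration.

```python
# Hedged illustration of intent-based (not keyword-based) recognition:
# several different phrasings are registered as examples of one intent,
# so the service can resolve novel wordings to the same command.
# Workspace ID and intent name are hypothetical placeholders.
from watson_developer_cloud import ConversationV1

conversation = ConversationV1(username='YOUR_USERNAME',
                              password='YOUR_PASSWORD',
                              version='2017-05-26')

conversation.create_intent(
    workspace_id='YOUR_WORKSPACE_ID',
    intent='fire_weapons',
    description='Player orders the tactical officer to open fire',
    examples=[{'text': 'fire phasers'},
              {'text': 'fire on the target'},
              {'text': 'shoot the enemy ship'},
              {'text': 'open fire'}])
```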

This, however, doesn’t prevent a pretty weird logical disconnect that occurs when talking to a bot-controlled NPC, and it stems from the fact that I was at first imbuing the NPCs with actual intelligence. When talking directly to them, I was instinctively relying on them to naturally help me do my job, to have eyes and ears and not only understand the intent of my speech, but also the intent of the mission. A human tactical officer would have seen that we were getting fired on, and I wouldn’t have had to issue the order to keep the Bird of Prey within phaser range. I wouldn’t even have to select the target, because Tactical would do it for me. IBM isn’t claiming to be able to do any of that with its cognitive computing platform, but the frustration of figuring out what Watson can and can’t do is a stark reality, especially when you’re getting your tail-end blasted out of the final frontier.

In the end, Watson-supported voice commands may not be perfect (when the Red Shirts are dropping like flies and consoles are exploding all over the place, the last thing you want to do is take the time to repeat an important order), but the fact that you can talk to an NPC in VR and get a pretty reliable response is amazing to say the least.


Tribeca Film Festival Launches ‘Storytellers With Watson’ Contest

IBM’s Watson is one of the best-known cognitive computing systems in the world. It has already been used to create new recipes and new clothing designs, and now the organisers behind the Tribeca Film Festival are challenging the creative industry to come up with ideas for how Watson can be used to create new stories.

The Storytellers with Watson competition is open to the public as well as the many directors, writers and artists who form part of Tribeca’s network of industry professionals. Participants can submit their ideas on how IBM Watson can be used to create stories in any storytelling medium: film and video, videogames, augmented reality (AR) and virtual reality (VR).

The organisers have worked together with IBM to produce use-case guidelines and examples to help inform contributors on how Watson can help with realising their creations. Guiding categories include development, pre-production, production and post-production, and marketing and distribution.

“The Tribeca Film Festival has always been a celebration of innovation and cutting-edge ideas,” said Andrew Essex, CEO at Tribeca Film Festival. “Since IBM Watson has been a big influence across many industries, we’re eager to see how our creative community will apply this technology to inspiring their own creative potential. Our collaboration with IBM is important to our mission because it spurs our community to push the limits of what they think is possible and find new inspiration that can redefine their approaches to art and storytelling.”

“Cognitive computing is driving incredible advancements in what humans and machines can do together, and one of the areas where we’re seeing this accelerate is within media and entertainment,” said David Kenny, Senior Vice President of IBM Watson and Cloud Platform. “Some of the best ideas come from our developer partners, and the Tribeca Film Festival community represents a wellspring of creativity and imagination. Through this competition, we’re eager to engage this dynamic group to see how they apply Watson to solve challenges and enhance the stories they tell.”


Submissions to the contest are open now and will be accepted up until 18th May 2017. Ideas can be submitted through the submission form on the IBM website. The creator who submits the winning entry will receive a trip for two to the Tribeca Film Festival 2018, including airfare, hotel and two festival passes.

VRFocus will continue to bring you news on competitions and projects involving VR and AR.

IBM And The New York Times Unveil AR Experience ‘Outthink Hidden’

The New York Times’s (NYT) T Brand Studio, in collaboration with IBM, has revealed the launch of a new augmented reality (AR) app called Outthink Hidden, inspired by the 20th Century Fox film Hidden Figures.

Hidden Figures recounts the true story of three female African American mathematicians who were heroes at NASA during the 1960s Space Race, their groundbreaking calculations of spaceship trajectories helping to put John Glenn into orbit. Among them was Dorothy Vaughan, who taught herself and others how to program a first-of-its-kind IBM mainframe.

For Outthink Hidden, T Brand Studio explored the stories featured in the movie as part of a collection of 10 innovators in STEM (science, technology, engineering and mathematics). Similar to a virtual museum, viewers will be able to explore an array of 3D computer graphics renderings, written histories, and audio and video narratives.


“IBM has a long history of commitment to STEM, and to fostering diversity, tolerance and inclusion, which is core to our company’s culture and values,” said Ann Rubin, Vice President, Branded Content and Global Creative, IBM. “We were inspired to use this app to share the stories of unsung STEM innovators who have changed the lives of people around the world.”

“We’ve been waiting for the perfect opportunity to tap into Fake Love’s wealth of talent and creativity when it comes to experiential storytelling,” said Sebastian Tomich, senior vice president, advertising & innovation, The New York Times. “We knew we couldn’t build The Times’s first AR experience just because we had the means to do it; we needed the right partner and the right story to tell. When we spoke to IBM about their work with ‘Hidden Figures,’ we recognized that this was an opportunity to bring users into the experience of the film and the remarkable women it showcases.”

The AR experience is available for free via the T Brand Studio AR app on either iTunes or Google Play. The AR content can be activated on a mobile device at IBM.com/hiddenfigures, through select print editions of The New York Times, or at physical plinths at CES 2017 in Las Vegas this week. In addition, the content can be activated at one of 150 “geofenced” locations across the US.

These locations include popular tourist spots in 10 cities (New York, Los Angeles, Chicago, Philadelphia, Dallas, San Francisco, Washington, D.C., Boston, Atlanta and Houston), as well as notable STEM centers and STEM universities.

For further AR coverage from around the world, keep reading VRFocus.

‘Hidden Figures’ AR Experience Takes You Around The US To Learn About STEM Innovators

Hidden Figures is an upcoming feature film that chronicles the lesser-known story of African American women at NASA and their crucial role in positioning the US at the forefront of the Space Race. The film is a critically acclaimed telling of a pivotal, untold story, and it’s getting an in-depth augmented reality (AR) experience to accompany it.

Powered by IBM and The New York Times, Outthink Hidden is inspired by the film and seeks to shed light on lesser-known historical figures from around the nation. You’ll have to do some actual traveling, though, to uncover all that the app has to offer.

The application is an AR first for The New York Times and highlights the main characters of the Hidden Figures film: Katherine Johnson, Mary Jackson and Dorothy Vaughan. However, it also takes the educational experience a step further. The twist on this AR app is that the different figures featured in this virtual museum are attached to locations across the US. Reminiscent of the PokéStops of Pokémon Go, you’ll have to travel to static locations (there are over 150) where you activate sensors to initiate the AR renderings. Once you find them, the app offers audio and visual narratives that accompany 3D renderings of innovators in science, technology, engineering, and mathematics.
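Under the hood, that kind of location trigger boils down to a simple proximity test: compare the device’s GPS fix against a list of trigger coordinates and unlock content when the user is within some radius. The app’s real trigger list and thresholds aren’t public, so the Python sketch below uses entirely hypothetical coordinates and an assumed 100-meter radius.

```python
import math

# Hypothetical trigger points (lat, lon in degrees); the app's real list of
# 150+ geofenced locations is not public, so these are placeholders.
TRIGGER_POINTS = {
    'Katherine Johnson': (38.8893, -77.0502),  # placeholder coordinates
    'Dorothy Vaughan': (40.7484, -73.9857),    # placeholder coordinates
}
UNLOCK_RADIUS_M = 100  # assumed activation radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def unlocked_figures(device_lat, device_lon):
    """Return the figures whose geofence the device is currently inside."""
    return [name for name, (lat, lon) in TRIGGER_POINTS.items()
            if haversine_m(device_lat, device_lon, lat, lon) <= UNLOCK_RADIUS_M]

print(unlocked_figures(38.8895, -77.0501))  # -> ['Katherine Johnson']
```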

Experiences like Outthink Hidden and Pokémon Go, sitting at opposite ends of the education and entertainment spectrum, may encourage developers to create more content that inspires users to get up, get out, and share the experience with others. Ultimately for Outthink Hidden, though, users do not have to actually travel to see all the figures available in the program. The AR content can be activated on a mobile device at a special IBM webpage, through select print editions of The New York Times, or even at physical plinths at CES 2017 in Las Vegas.

The experience is now available through the T Brand Studio AR app which can be downloaded for free on iTunes and Google Play.
