The VR Job Hub: Big Roles At The Big Hitters

So ends another busy week for VRFocus covering everything at this year’s Electronic Entertainment Expo (E3) in Los Angeles. As with every year there was much that was revealed and much that left us with questions. Even beyond videogames though, there was still plenty of news, so much that we frankly didn’t have enough time to cover it all. There’s news across the medical, education and design industries, and a lot more besides, which we will get into over the course of the week to come. Because as we always say on VRFocus, immersive technology is in use everywhere.

So, unsurprisingly there are jobs everywhere too. Here’s a selection of roles and titles currently available in the immersive technology sector that you may well be interested in.

Location Company Role Link
Jacksonville, FL, USA Brooksource AR/VR Developer Click Here to Apply
San Francisco, CA, USA Cybercoders Software Engineer – React Dev for Augmented Reality Click Here to Apply
Plantation, FL, USA OSI Engineering, Inc. Technical Sound Designer for a Virtual/Augmented Reality Click Here to Apply
Haifa, Israel IBM Computer Vision & Augmented Reality Researcher Click Here to Apply
Brussels, Belgium Epson Account Manager – Professional Display Click Here to Apply
Mountain View, CA, USA Google Software Engineer, Virtual Reality Click Here to Apply
San Francisco, CA, USA HTC VR Intern – Content & Acquisition Click Here to Apply
San Bruno, CA, USA YouTube Software Engineer, Virtual Reality Click Here to Apply
Vancouver, Canada VRChat Inc Online Community Manager Click Here to Apply
San Francisco, CA, USA Unity Technologies Senior Graphics Engineer (XR) Click Here to Apply
London, UK Facebook Technical Program Manager, Social VR Click Here to Apply

As always, if there was nothing in this week’s feature that was a good fit for you, you can always look at the previous edition of The VR Job Hub.

And if you are an employer looking for someone to fill an immersive technology related role – regardless of the industry – don’t forget you can send us the lowdown on the position and we’ll be sure to feature it in the following week’s edition. Details should be sent to me (keva@vrfocus.com) and to Peter Graham (pgraham@vrfocus.com).

Check back with VRFocus next Sunday at the usual time of 3PM (UK) for another selection of jobs from around the industry.

The VR Job Hub: Felix & Paul Studios, Hammerhead VR, Oculus & More

We’re now long past half-way through the year and we are, in fact, about to roll over into August. Time moves fast, especially when you’re dealing with a technology such as virtual reality (VR). In but a handful of weeks we’ll be off to Cologne in Germany for another Gamescom, where no doubt we’ll be seeing many updates to various videogames and hearing news of new titles in the works. We may also find out more about some of the hardware in development and see some recent additions, such as the Vive Knuckles controller, in action.

But before we even get to that, there’s SIGGRAPH, which takes place next week.

If you’re excited by what you’ve been reading on VRFocus and are interested in taking the plunge into this industry with one of the companies working on VR, augmented reality (AR) or mixed reality (MR), or you’re already engaged in any of the three and are looking to switch roles, we have, as usual, a selection of openings from around the world below. Why not see if there’s anything that takes your fancy? A new career could be just a few clicks away.

View the new listings below for more information:

Location Company Role Link
New York, NY, USA YouVisit Unity Virtual Reality Developer Click here to apply
Yorktown Heights, NY, USA IBM Research Staff Member Click here to apply
Montreal, Canada Felix & Paul Studios Application Developer Click here to apply
Montreal, Canada Felix & Paul Studios 3D (Graphic) Developer Click here to apply
Montreal, Canada Felix & Paul Studios Computer Vision Developer Click here to apply
Newcastle, UK Hammerhead VR Lead Animator Click here to apply
Newcastle, UK Hammerhead VR Systems Administrator Click here to apply
London, UK Oculus Product Manager, Social VR Click here to apply
Cork, Ireland Oculus LED Research Scientist, Modeling Click here to apply
Seattle, WA, USA Oculus Developer Relations Engineer, Oculus Platform Click here to apply
Menlo Park, CA, USA Oculus Developer Relations Engineer, Rift Click here to apply
Menlo Park, CA, USA Oculus Manager, Display Engineering Click here to apply
Palo Alto, CA, USA Tesla Simulation Engineer Click here to apply

As always, don’t forget that you can also view the roles in last week’s edition of The VR Job Hub. And if you are an employer looking for someone to fill a role in VR, AR or other related areas of the industry and want that position featured in next week’s edition, please send details to either me (keva@vrfocus.com) or pgraham@vrfocus.com.

We’ll be back next Sunday, as usual at 3PM BST with more roles in the VR industry as part of The VR Job Hub.

IBM Watson’s Interactive Speech now Integrated into Star Trek: Bridge Crew

Last month Ubisoft released its biggest title yet for virtual reality (VR) platforms with Star Trek: Bridge Crew. VRFocus reported that the studio planned to add voice commands using IBM Watson integration in a future update; that update has now arrived.

Using IBM Watson’s interactive speech and cognitive capabilities – Watson Speech to Text and Watson Conversation – players will now be able to talk to and interact with the virtual Star Trek: Bridge Crew members for a more realistic experience, mimicking that of the multiplayer mode. The feature is available for an experimental beta period.

Whether players are commanding the U.S.S. Aegis or the U.S.S. Enterprise NCC-1701, IBM Watson services can be used to operate crews consisting of only AI characters or a mix of AI characters and human teammates, using Star Trek: Bridge Crew’s full-body avatars with real-time lip-sync.

Developed by Red Storm Entertainment, Star Trek: Bridge Crew is a team-focused experience with players taking on one of four roles: Captain, Engineer, Tactical Officer, or Helm Officer. Each has their own part to play in successfully completing missions.

VRFocus reviewed the videogame, giving it 4 stars, saying: “Star Trek: Bridge Crew definitely appeals to the core fan base. The production values are top notch making Star Trek: Bridge Crew one of those rare VR experiences that feels like a AAA title, and likely part of most VR gamers’ collections.”

Aside from Star Trek: Bridge Crew, Ubisoft has released Werewolves Within and Eagle Flight. During the Electronic Entertainment Expo (E3) 2017 last week it was revealed that the studio had formed a VR partnership with film company SpectreVision to create Transference, which was announced alongside a new adrenaline-fuelled shooter, Space Junkies.

Ubisoft hasn’t stated how long the experimental beta period will last; as further details are revealed, VRFocus will keep you updated on the announcements.

Hands-on: IBM Watson Brings Voice Commands to ‘Star Trek: Bridge Crew’

IBM Watson, the artificial intelligence platform designed to understand natural language, today launched support for Star Trek: Bridge Crew (2017) across PSVR, Oculus Rift and HTC Vive.

Before the service launched today, lone players could control the ship’s other posts—Engineering, Tactical, Helm—by clicking a few boxes to issue orders. Now a sole captain (or one with a mixed crew of humans and AI crew members) can complete whole missions by issuing commands directly to the AI-controlled characters using natural language.

image courtesy IBM

Voice commands are enabled by IBM’s VR Speech Sandbox program, which is available on GitHub for developers to integrate speech controls into their own VR applications. The Sandbox, released in May, combines IBM’s Watson Unity SDK with two services, Watson Speech to Text and Watson Conversation.
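To make that two-service pipeline concrete, below is a minimal sketch of the same Speech to Text to Conversation flow, called over Watson’s REST APIs from Python with the requests library. It is an illustration only, not the Sandbox itself: the endpoint URLs and API version date reflect the Watson services of the time, the credentials, workspace ID and audio file name are placeholders, and the actual Speech Sandbox does this wiring inside Unity via the Watson Unity SDK.

    import requests

    # Endpoints and API version as documented for the Watson services at the
    # time; credentials, workspace ID and the audio file are placeholders.
    STT_URL = "https://stream.watsonplatform.net/speech-to-text/api/v1/recognize"
    CONV_URL = ("https://gateway.watsonplatform.net/conversation/api/v1"
                "/workspaces/{workspace_id}/message?version=2017-05-26")

    def transcribe(wav_path, username, password):
        """Send a WAV clip to Watson Speech to Text and return the transcript."""
        with open(wav_path, "rb") as audio:
            resp = requests.post(STT_URL, auth=(username, password),
                                 headers={"Content-Type": "audio/wav"}, data=audio)
        resp.raise_for_status()
        results = resp.json().get("results", [])
        return results[0]["alternatives"][0]["transcript"] if results else ""

    def interpret(text, workspace_id, username, password):
        """Pass the transcript to Watson Conversation and return its intents."""
        resp = requests.post(CONV_URL.format(workspace_id=workspace_id),
                             auth=(username, password),
                             json={"input": {"text": text}})
        resp.raise_for_status()
        return resp.json().get("intents", [])

    if __name__ == "__main__":
        # Hypothetical usage: a recorded order is transcribed, then classified.
        order = transcribe("fire_phasers.wav", "stt-user", "stt-pass")
        print(order, "->", interpret(order, "bridge-crew-workspace",
                                     "conv-user", "conv-pass"))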

at the Captain’s chair, image captured by Road to VR

We had a chance to go hands-on at E3 2017 with Star Trek: Bridge Crew embedded with the Watson-powered voice recognition, a feature that’s initiated during gameplay with a single button press. While talking directly to your digital crew does provide some of those iconic moments (“Engage!” and “Fire phasers!”), and most orders went through without a hitch, Watson still has trouble parsing some pretty basic things. For example, Watson doesn’t understand when you use the names of ships, so “scan the Polaris” just doesn’t register. Watson also didn’t pick up on a few things that would seem pretty easy at face value. Commands like “fire on the target”, “fire on the enemy,” and “come on, let’s warp already!” fell on deaf digital ears.

IBM says their VR speech controls aren’t “keyword driven exchanges,” but are built around recognition of natural language and the intent behind what’s being said. Watson also has the capacity to improve its understanding over time, so commands like “Let’s get the hell out of here, you stupid robots!” may actually register one day.
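As a rough illustration of what intent-driven (rather than keyword-driven) handling looks like on the receiving end, the sketch below acts on the intents list returned by Conversation in the earlier snippet. The intent names, confidence threshold and canned responses are invented for this example and are not taken from the game or from IBM.

    # Illustrative only: intent names, threshold and actions are hypothetical.
    CONFIDENCE_THRESHOLD = 0.6  # assumed cut-off, not a documented value

    ACTIONS = {
        "fire_weapons": lambda: print("Tactical: firing phasers."),
        "engage_warp":  lambda: print("Helm: engaging warp drive."),
        "scan_target":  lambda: print("Science: scanning the target."),
    }

    def dispatch(intents):
        """Act on the top intent from a Conversation /message response,
        e.g. [{"intent": "fire_weapons", "confidence": 0.93}]."""
        if not intents or intents[0]["confidence"] < CONFIDENCE_THRESHOLD:
            print("Crew: order not understood.")  # the "deaf digital ears" case
            return
        ACTIONS.get(intents[0]["intent"],
                    lambda: print("Crew: unknown order."))()

    # "Fire phasers!" and "fire on the target" can both resolve to the same
    # intent, while an unfamiliar phrasing may come back with low confidence.
    dispatch([{"intent": "fire_weapons", "confidence": 0.93}])
    dispatch([{"intent": "engage_warp", "confidence": 0.31}])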

This, however, doesn’t prevent a pretty weird logical disconnect that occurs when talking to a bot-controlled NPC, and it stems from the fact that I was at first imbuing the NPCs with actual intelligence. When talking directly to them, I was instinctively relying on them to help me do my job naturally, to have eyes and ears and to not only understand the intent of my speech, but also the intent of the mission. A human tactical officer would have seen that we were getting fired on, and I wouldn’t have had to issue the order to keep the Bird of Prey within phaser range. I wouldn’t even have had to select the target because Tactical would have done it for me. IBM isn’t claiming to be able to do any of that with its cognitive computing platform, but the frustration of figuring out what Watson can and can’t do is a stark reality, especially when getting your tail-end blasted out of the final frontier.

In the end, Watson-supported voice commands may not be perfect—because when the Red Shirts are dropping like flies and consoles are exploding all over the place, the last thing you want to do is take the time to repeat an important order—but the fact that you can talk to an NPC in VR and get a pretty reliable response is amazing to say the least.


Tribeca Film Festival Launches ‘Storytelling With Watson’ Contest

IBM’s Watson is one of the best-known cognitive computing systems in the world. It has already been used to create new recipes and new clothing designs, and now the organisers behind the Tribeca Film Festival are challenging the creative industry to come up with ideas about how Watson can be used to create new stories.

The Storytellers with Watson competition is open to the public as well as the many directors, writers and artists who form part of Tribeca’s network of industry professionals. Participants can submit their ideas on how IBM Watson can be used to create stories in any storytelling medium, including film and video, videogames, augmented reality (AR) and virtual reality (VR).

The organisers have worked together with IBM to produce use-case guidelines and examples to help inform contributors on how Watson can help with realising their creation. Guiding categories include development, pre-production, production and post-production, marketing and distribution.

“The Tribeca Film Festival has always been a celebration of innovation and cutting-edge ideas,” said Andrew Essex, CEO at Tribeca Film Festival. “Since IBM Watson has been a big influence across many industries, we’re eager to see how our creative community will apply this technology to inspiring their own creative potential. Our collaboration with IBM is important to our mission because it spurs our community to push the limits of what they think is possible and find new inspiration that can redefine their approaches to art and storytelling.”

“Cognitive computing is driving incredible advancements in what humans and machines can do together, and one of the areas where we’re seeing this accelerate is within media and entertainment,” said David Kenny, Senior Vice President of IBM Watson and Cloud Platform. “Some of the best ideas come from our developer partners, and the Tribeca Film Festival community represents a wellspring of creativity and imagination. Through this competition, we’re eager to engage this dynamic group to see how they apply Watson to solve challenges and enhance the stories they tell.”


Submissions to the contest are open now and will be accepted up until 18th May 2017. Ideas can be submitted through the submission form on the IBM website. The creator who submits the winning entry will receive a trip for two to the Tribeca Film Festival 2018, including airfare, hotel and two festival passes.

VRFocus will continue to bring you news on competitions and projects involving VR and AR.