Epic Games Reveals Big Plans for GDC 2018

It’s not long until the Game Developers Conference (GDC) 2018 opens its doors for another year, and 2018 is set to be bigger and better than ever for virtual reality (VR) and augmented reality (AR) enthusiasts. Epic Games has announced it’ll be attending, with plenty of talks to support the growth of the Unreal Engine community, plus more content and learning material than ever before.

Unreal Engine at GDC 2018

Epic Games will be running a booth on the show floor where attendees can play its latest videogames and those of Unreal Engine developers, but it’s the sessions that’ll feature the most info.

Its main event is the annual “State of Unreal” opening session on the first day. The company hasn’t yet revealed what it’s going to be talking about, but there will be presentations from Epic Games founder and CEO Tim Sweeney and CTO Kim Libreri, Unreal Engine licensees, partners, and special guests, with some VR sure to make an appearance.

The Unreal Engine programming schedule has been split into four categories: Sponsored Sessions, Partner Sessions, Technical Sessions and the GDC Education Summit.

  • Sponsored Sessions will cover topics ranging from cinematic lighting in Unreal Engine, and creating believable characters in UE4, to optimizing Fortnite: Battle Royale, photoreal virtual humans in UE4 and programmable VFX with UE4’s Niagara.
  • Partner Sessions will include a range of educational programming from some of the top creators across videogames, AR/VR and VFX, like ILMxLAB talking about location-based VR experiences including the Oscar-winning Carne y Arena and Star Wars: Secrets of the Empire.
  • Technical Sessions will offer educational opportunities ranging from the latest techniques in photorealistic virtual humans and bridging the gap between UX principles and game design, to developing the art of Fortnite, real-time VFX, and creating VFX in VR.
  • The Unreal Engine GDC Education Summit is an all-day event, free and open to educators, offering new learning resources directly from the Unreal Engine education team covering a multitude of topics.

There’s plenty more going on from Epic Games; head over to its official GDC 2018 page to find out more. VRFocus will continue its coverage of Epic Games and Unreal Engine, reporting back with the latest updates.

Epic Games’ Architecture and Design Focused Unreal Studio Now in Open Beta

Epic Games’ Unreal Engine has built a reputation over the years as being one of the best pieces of middleware for the creation of videogames and more recently virtual reality (VR) titles. Whilst that’s great for videogame developers, Unreal Engine isn’t the most practical for other industries such as architecture, design and manufacturing. To aid customers working in those fields Epic Games has been working on Unreal Studio, moving it into open beta this week.

Unreal Studio

Unreal Studio has been designed as a shortcut to producing high-quality, real-time, fully immersive visual experiences, introducing new learning tools, professional support and assets, along with the Datasmith workflow toolkit for streamlining the transfer of CAD and 3ds Max data into Unreal Engine.

“The Unreal Studio open beta builds on the success of our Datasmith release. Datasmith simplifies bringing Unreal Engine into architecture and design pipelines with automatic lightmap and UV creation along with scripted workflows to organize, optimize and clean up geometry,” said Marc Petit, General Manager of Unreal Enterprise at Epic Games. “The feedback has been overwhelming—in just five months we had over 14,000 beta registrations, and a recent beta survey reported Datasmith productivity gains of 113 percent. We’re taking all the ‘boring’ work out of the process and giving users more time to be creative.”

With Datasmith, customers can efficiently transfer CAD data from over 20 CAD sources, including Autodesk 3ds Max, Rhino 3D, SolidWorks and Inventor, into Unreal Engine. HP recently announced its VR Launch Kit for Unreal Engine, which includes Datasmith. Unreal Studio’s assets include 100 substances from Allegorithmic for common architecture and design materials, and industry-specific templates to quickly create experiences.

International Space Station

Companies that have already begun using Unreal Studio include NASA’s Hybrid Reality Lab, Soluis Group, Animech, Neoscape and Herman Miller.

“Real-time engines have primarily been designed for the gaming industry, making them impractical to use for architectural and manufacturing visualization. Until now. Unreal Studio changes the paradigm by addressing needs specific to our industry, such as importing engineering models and easily achieving visual consistency,” said Karen Hapner, Senior Visualization Designer at Herman Miller. “With Unreal Studio, I can use Unreal Engine to efficiently create interactive, immersive experiences for our customers.”

Head to the Unreal Studio website to download the free beta and find out more. VRFocus will continue its coverage of Epic Games, reporting back with the latest announcements.

The VR Diversity Initiative Returns – But What Is It?

In 2017, VRFocus was involved in creating a new type of event to encourage under-represented groups to get involved in immersive technologies such as virtual reality (VR) and augmented reality (AR). The first event was a success, and soon we’ll be holding the kick-off event for the first edition in the 2018 series.

The initial VR Diversity Initiative took place at Digital Catapult in London, where a single day was broken down into a morning comprising several talks from experts in the field of VR and immersive technology, followed by a hands-on workshop in the afternoon.

The first VR Diversity Initiative event was led by Catherine Allen, a VR producer and curator who has since gone on to set up her own VR content exhibition and curation company called Limina, which will be holding a public-facing VR festival in the UK later this year.

The purpose of the VR Diversity Initiative is to offer support, workshops and opportunities for groups who are under-represented in media and technology, including LGBT people, women, Black and Asian people, disabled people and other under-represented groups within the XR and tech industry. The initiative hopes to tackle inequality in future immersive tech by providing a free workshop for these groups. One of the biggest barriers for the public is the expensive hardware needed to create immersive experiences. Selected participants will be able to create a rough VR prototype for the Oculus Rift and HTC Vive, guided by experienced VR developers. This will enable participants to learn the basics of building a project in either Unreal Engine or Unity, giving them the confidence to pursue a career in XR fields.

This year’s VR Diversity Initiative event will be led by VRFocus’ own Nina Salomons and is due to take place on 28th February, 2018 at Here East in London, supported by Hobs Studio. This will be the first of six VR Diversity Initiatives taking place this year. The event will only allow a limited number of attendees, who will need to apply by filling in the application form. Each attendee will be given an opportunity to work with an experienced VR mentor in order to develop and produce a prototype within a single day.

Further information on the VR Diversity Initiative and the upcoming event can be found in the video presentation below, in which Nina Salomons explains in full detail what she hopes to achieve.

Further information on other upcoming projects and events for the VR Diversity Initiative can be found here on VRFocus.

Streamline VR Development With HP’s VR Launch Kit for Unreal Engine

Today has already seen computer manufacturer HP make a couple of big virtual reality (VR) related announcements, with a new Windows Mixed Reality Professional Edition headset and a revamped HP Z4 Workstation. That’s not all the company has to offer on the VR side of things, however, as it has also announced a collaboration with Epic Games on the creation of a VR Launch Kit for Unreal Engine.

Unreal Engine 4.17

HP created the kit as a way to help streamline and accelerate the VR development process. Features include compatibility with Epic Games’ Datasmith, a workflow toolkit for Unreal Engine introduced last year, and a VR Performance Profiler that makes it easier for companies to determine the right computer configurations when deploying their VR solutions, giving them benchmarks and reports that indicate before rendering whether the software will run well on the targeted hardware.
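The basic idea behind such a profiler can be illustrated with a simple frame-budget check (a hypothetical sketch for illustration only, not HP’s actual tool): a headset refreshing at 90 Hz leaves roughly 11.1 ms to produce each frame, so measured per-frame CPU and GPU times can be tested against that budget before deployment.

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Time available per frame, in milliseconds."""
    return 1000.0 / refresh_hz

def meets_target(cpu_ms: float, gpu_ms: float, refresh_hz: float = 90.0) -> bool:
    """CPU and GPU work largely overlap, so the slower of the two
    must fit within the per-frame budget to avoid dropped frames."""
    return max(cpu_ms, gpu_ms) <= frame_budget_ms(refresh_hz)

budget = frame_budget_ms(90.0)                  # ~11.11 ms at 90 Hz
print(f"budget: {budget:.2f} ms")
print(meets_target(cpu_ms=6.5, gpu_ms=10.2))    # fits the 90 Hz budget
print(meets_target(cpu_ms=6.5, gpu_ms=13.8))    # would drop frames
```

Real profilers measure far more than two timings, but the go/no-go decision ultimately reduces to a comparison like this one.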

There will also be optimisations for real-time performance, plus templates that provide cross-sectioning and exploded-view functionality for models, helping creators get those models into VR, which is especially useful for applications like vehicle models.

Lastly, for those companies new to VR development, HP will be including two data sets for creators to play with first: a bike model (seen below) and an unbuilt Frank Lloyd Wright Trinity Chapel, which should teach them how to bring VR environments to life.

Unreal Engine HP Launch Kit Bike

In addition to the VR Launch Kit for Unreal Engine announcement, HP has also extended its Device-as-a-Service (DaaS) offering to VR solutions. This is a modern service model that simplifies how commercial organisations equip users with the right hardware, support and lifecycle services to get the job done, improving end-user productivity, IT efficiency and cost predictability. The one-stop solution can equip customers not only with workstations but with Windows Mixed Reality headsets as well, even third-party units.

DaaS is available now, while HP’s VR Launch Kit for Unreal Engine will roll out this spring via an Unreal Engine microsite. For any further updates from HP, keep reading VRFocus.

Developer Studio Epic Games Acquires Cloudgine

Epic Games, the developer studio known for its in-house Unreal Engine, is acquiring the cloud computing company Cloudgine for an as yet undisclosed sum. The future collaboration could lead to further development of the Unreal Engine, as well as a reduction in the minimum hardware requirements for VR content on PC and consoles.

Epic Games and Cloudgine: a promising partnership

Epic Games has announced the acquisition of the Scottish company Cloudgine. The Edinburgh-based firm is known for VR titles such as They Came From Space, but above all for its cloud computing technology, which makes it possible to render and process PC, console and VR content in the cloud.

The technology pushes past the technical limits of the user’s hardware by offloading processing to cloud servers. This was used, for example, in collaboration with Microsoft in Crackdown 3 for Xbox One: to make the game’s demanding structural destruction animations possible on the console, the necessary calculations are performed in the cloud. Incidentally, the cooperation on that project will continue despite the acquisition by Epic Games.

The future collaboration makes a great deal of sense, as the developers at Cloudgine themselves explain:

“From the very beginning, our development processes have been based on Epic Games’ Unreal Engine 4. Our cloud computing and online technologies can therefore help improve UE4 and enable a host of new features in the future. Thanks to advances in physics simulation and networking technology, we could help developers soon break through the current technical and creative limits of games, films, animation and visualisation.”

By integrating the cloud computing technology into the Unreal Engine, the minimum requirements for VR games and experiences on PC and consoles could fall massively in the future. Ageing consoles would also remain viable on the market for longer. Exactly what plans Epic Games is pursuing with the acquisition, however, remains to be seen.

We will keep you up to date on news about new projects from Epic Games.

(Sources: UploadVR | Cloudgine | Video: Cloudgine YouTube)

The post Developer Studio Epic Games Acquires Cloudgine first appeared on VR·Nerds.

Sólfar Studios Announces New VR Videogame IN DEATH For 2018

Icelandic developer Sólfar Studios has had an interesting few years since its founding back in October 2014 by a group of videogame industry veterans. The Reykjavik-based studio has so far produced only one virtual reality (VR) videogame title, but the second has just been announced, and the difference between the two titles is rather striking.

The first was, of course, EVEREST VR, an acclaimed “videogame-that-isn’t-a-videogame”, as VRFocus put it in our preview back in March this year. A VR climbing experience created in partnership with leading Nordic visual effects and animation firm RVX, it sees you take on Mount Everest, the highest mountain on Earth. Initially for PC VR, EVEREST VR came to PlayStation VR a couple of months ago, whilst an update released at the beginning of November incorporated unique 360-degree footage filmed by VR explorers.

The second title, announced today, sees users get ready for battle in what Sólfar Studios is referring to as its “first core VR game”. The title, IN DEATH, asks the question: what if you die… and God isn’t there? What happens when you die, arrive in the afterlife, and find that it’s actually not all strumming harps and sitting on clouds, but instead a battle for your very survival?

Which considering you’ve only just gone and died is kind of a bummer, frankly.

A first-person shooter (FPS) with roguelike elements and a distinctly medieval flavour, the world of IN DEATH is procedurally generated with an “achievement-based progression system”, meaning that no run will ever be the same and you’ll need to be on the lookout as you travel through the changing castle in the clouds. Around every corner could be monsters to slay, at close range with your melee weapons or at distance with your bow and other weapons.

Currently in development using Unreal Engine, the title already has a free closed beta that interested players can sign up for on the Sólfar Studios website. Due for release at some point next year, it has confirmed support for both HTC Vive and Oculus Rift, and a PlayStation VR version isn’t ruled out at this time. Sólfar Studios states that there will be “support for additional VR platforms to follow in 2018” and that IN DEATH will “release across multiple PC and Console platforms.”

As we get ready to move into the New Year, VRFocus will bring you further details on IN DEATH as they are revealed. In the meantime, you can check out the announcement trailer below.

The VR Job Hub: Unreal Engine, Rebellion, More…

Christmas is coming, the goose is getting fat – and if you’re the goose you might not be quite so happy with this Christmas temp job. Turns out it might be very temporary indeed. Just because we’ve hit December does not mean that there isn’t a vast array of jobs out there for those of you with an interest in the fields of virtual reality (VR), augmented reality (AR) and/or mixed reality (MR). So we’re back with yet another edition of The VR Job Hub, which as usual offers you a selection of different employment opportunities from around the globe.

This week there’s everything from videogame designers and Unity developers to interns working at NASA, and even a role for someone in Manchester showing off Lenovo’s Star Wars: Jedi Challenges!

Check out the list below to see if there’s something that sparks your interest.

Location | Company | Role | Link
Cary, NC, US | Unreal Engine | PR Manager | Click here to apply
Oxford, UK | Rebellion | Senior Designer | Click here to apply
Oxford, UK | Rebellion | Technical Designer | Click here to apply
Manchester, UK | RMG | Star Wars: Jedi Challenges Sales Ambassador | Click here to apply
Redmond, WA, US | Oculus | Oculus Research VR/AR Software Engineer | Click here to apply
Santa Clara, CA, US | Intel Corporation | Unity Developer – Virtual Reality | Click here to apply
Bangalore, India | Accenture | ‘Virtual Reality’ | Click here to apply
Hampton, VA, US | NASA | Internship: “Applying AR and VR visualizations to scientific data sets” | Click here to apply
Bangalore, India | ICS Consultancy Services | Application Lead – Virtual Reality | Click here to apply
Glassboro, NJ, US | Rowan University | Virtual Reality Developer (Prof Serv Spec 3) | Click here to apply
Coventry, UK | Jaguar | Virtual Engineer – Visualisation | Click here to apply

If none of the above appealed to you, you can always check out last week’s edition of The VR Job Hub, which took in roles in America, the UK and the Netherlands. Likewise, don’t forget that if you are an employer looking for someone to fill an immersive technology related role – regardless of the industry – and you want that position to be featured on next week’s VR Job Hub, then please send details to both keva@vrfocus.com and pgraham@vrfocus.com.

Check back with VRFocus next Sunday at 3PM GMT and every Sunday for the latest roles in the immersive technology industry.

Google Releases ‘Resonance Audio’, a New Multi-Platform Spatial Audio SDK

Google today released a new spatial audio software development kit called ‘Resonance Audio’, a cross-platform tool based on technology from their existing VR Audio SDK. Resonance Audio aims to make VR and AR development easier across mobile and desktop platforms.

Google’s spatial audio support for VR is well established: the company introduced the technology to the Cardboard SDK in January 2016 and brought its audio rendering engine to the main Google VR SDK in May 2016, which then saw several improvements in the Daydream 2.0 update earlier this year. Google’s existing VR SDK audio engine already supported multiple platforms, but with platform-specific documentation on how to implement the features. In February, a post on Google’s official blog recognised the “confusing and time-consuming” battle of working with various audio tools, and described the development of streamlined FMOD and Wwise plugins for multiple platforms on both Unity and Unreal Engine.

Image courtesy Google

The new Resonance Audio SDK consolidates these efforts, working ‘at scale’ across mobile and desktop platforms, which should simplify development workflows for spatial audio in any VR/AR game or experience. According to the press release provided to Road to VR, the new SDK supports “the most popular game engines, audio engines, and digital audio workstations” running on Android, iOS, Windows, MacOS, and Linux. Google are providing integrations for “Unity, Unreal Engine, FMOD, Wwise, and DAWs,” along with “native APIs for C/C++, Java, Objective-C, and the web.”

This broader cross-platform support means that developers can implement one sound design for their experience that should perform consistently on both mobile and desktop platforms. In order to achieve this on mobile, where CPU resources are often very limited for audio, Resonance Audio features scalable performance using “highly optimized digital signal processing algorithms based on higher order Ambisonics to spatialize hundreds of simultaneous 3D sound sources, without compromising audio quality.” A new feature in Unity for precomputing reverb effects for a given environment also ‘significantly reduces’ CPU usage during playback.
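To give a flavour of the underlying maths (a minimal first-order sketch for illustration; Resonance Audio itself uses higher orders and heavily optimised DSP), encoding a mono source into an ambisonic sound field amounts to weighting the signal by direction-dependent gains for each channel:

```python
import math

def encode_first_order(sample: float, azimuth: float, elevation: float):
    """Encode one mono sample into first-order ambisonic channels
    (W, Y, Z, X, SN3D-style normalisation). Angles are in radians;
    azimuth is measured from straight ahead, elevation from the
    horizontal plane."""
    w = sample                                            # omnidirectional
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left-right
    z = sample * math.sin(elevation)                      # up-down
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front-back
    return (w, y, z, x)

# A source directly in front of the listener contributes only to W and X:
print(encode_first_order(1.0, azimuth=0.0, elevation=0.0))
```

Because each source reduces to a handful of multiply-adds per channel, and the mixed sound field is decoded once for the listener, this scheme scales to the “hundreds of simultaneous 3D sound sources” mentioned above far more cheaply than spatialising every source individually.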

Much like the existing VR Audio SDK, Resonance Audio is able to model complex sound environments, allowing control over the direction of acoustic wave propagation from individual sound sources. The width of each source can be specified, from a single point to a wall of sound. The SDK will also automatically render near-field effects for sound sources within arm’s reach of the user. Near-field audio rendering takes acoustic diffraction into account, as sound waves travel across the head. By using precise HRTFs, the accuracy of close sound source positioning can be increased. The team have also released an ‘Ambisonic recording tool’ to spatially capture sound design directly within Unity, which can be saved to a file for use elsewhere, such as game engines or YouTube videos.
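One ingredient of HRTF-based rendering can be sketched with the classic Woodworth spherical-head approximation (a simplified textbook model, not Resonance Audio’s actual implementation): for a distant source at azimuth θ, the sound reaches the far ear roughly a(θ + sin θ)/c later than the near ear, where a is the head radius and c the speed of sound.

```python
import math

def interaural_time_difference(azimuth_rad: float,
                               head_radius_m: float = 0.0875,
                               speed_of_sound: float = 343.0) -> float:
    """Woodworth's spherical-head approximation of the interaural
    time difference (ITD), in seconds, for a distant source at the
    given azimuth (0 = straight ahead)."""
    theta = abs(azimuth_rad)
    return head_radius_m * (theta + math.sin(theta)) / speed_of_sound

# A source at 90 degrees to the side gives the maximum delay,
# roughly 0.66 ms for a typical head radius:
itd = interaural_time_difference(math.pi / 2)
print(f"{itd * 1e6:.0f} microseconds")
```

Delays on this microsecond scale, together with level and spectral differences captured by the HRTFs, are what let the brain pin down a source’s direction.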

Resonance Audio documentation is now available on the new developer site.

For PC VR users, Google just dropped Audio Factory on Steam, letting Rift and Vive owners get a taste of an experience that implements the new Resonance Audio SDK. Daydream users can try it out here too.

The post Google Releases ‘Resonance Audio’, a New Multi-Platform Spatial Audio SDK appeared first on Road to VR.

Unreal Engine 4.18 Update Brings Native Support for ARKit and ARCore, SteamVR Support for Mac

Epic Games’ Unreal Engine is making it easier to create for augmented reality in the newest 4.18 update, which now includes official support for Apple’s ARKit and Google’s ARCore software dev kits, plus support for SteamVR on Mac.

‘Production-ready’ support for Apple’s ARKit on iOS 11 was initially announced during Apple’s iPhone 8 and iPhone X unveiling last month. Epic has, however, provided experimental support for ARKit in its game engine since Unreal Engine 4.17, and the new 4.18 update represents what Epic calls “significant changes” since the prior version went live back in August.

Announced on the Unreal Engine blog, the company says they’ve “streamlined workflows [for ARKit projects] making use of existing framework components, added robust handling of the passthrough camera, and increased fidelity by improving performance and prediction.”

Unreal Engine 4.18 now contains official support for the ARCore developer preview too, Google’s answer to ARKit, which provides similar AR functionality on Google’s new Pixel 2 smartphones and, soon, more Android phones running 7.0 Nougat and above, including the Samsung S8 line.

In the 4.18 update, the game engine also includes native SteamVR support on Mac, making the same well-worn PC interfaces available on Mac and adding the ability to easily transfer projects between the two platforms.

Apple announced Valve was bringing SteamVR support to Mac during the company’s World Wide Developer Conference (WWDC) back in June, showing the audience the power of the company’s new VR Ready 27-inch iMac. Apple featured a demo running on the HTC Vive that was created by Industrial Light and Magic. Using Epic’s Unreal Engine VR Editor, they showed how developers could build VR content inside of VR itself, using Star Wars assets.

Unreal’s support for SteamVR on Mac comes alongside support for Metal 2, Apple’s graphics API, which is getting the special VR treatment too. Apple says Metal 2 can bring up to a 10x increase in draw call throughput over the prior version, and it will include a VR-optimized display pipeline.

Check out full release notes here.

The post Unreal Engine 4.18 Update Brings Native Support for ARKit and ARCore, SteamVR Support for Mac appeared first on Road to VR.

Mechanics Themed VR Title Wrench Amongst September’s NVIDIA Edge Program Winners

Having worked with partners on both the hardware and software sides of virtual reality (VR), NVIDIA is among the companies most aware of what needs to be done to push VR forward. It was NVIDIA that stated last week that GPUs will need to be forty times more powerful than they are at present in order to generate photorealistic VR. It continues to work on the Holodeck platform it is developing, whilst NVIDIA Inventions, the research and development arm of the firm, is currently looking into various ways of improving VR displays, as well as those for augmented reality (AR).

One of NVIDIA’s close partners is Epic Games (with whom it is developing an ‘Enterprise-grade’ platform for VR) and its Unreal Engine development software. Back in June the pair announced the Edge Program, a scheme designed to reward developers, VR and non-VR alike, who show outstanding degrees of creativity, providing them with the latest powerful NVIDIA hardware to continue to encourage their growth. Three winners are being announced every month up until summer 2018.

“It’s no secret that the Unreal Engine development community is capable of creating some of the most awe-inspiring real-time content with UE4,” said Epic’s Chance Ivey at the time. “Time and time again, the team here at Epic Games is amazed by the talent displayed in projects we come across.”

September’s winners have now been announced via the Unreal Engine blog, with one in particular being VR related. The project in question is called Wrench. Created by Alex Moody of indie studio Digital Mistake, Wrench is a videogame in which you become a mechanic, putting together cars and preparing and modifying them to take part in various motorsport events. As you can see in the prototype trailer below, it is highly technical, with vehicles assembled practically from scratch from only a floor full of parts and the relevant starting piece. It’s almost as if it could exist as the videogame behind your favourite racing title.

Moody, as well as fellow winners Michael Banks, Tim Polyak and Elizabeth Smith for the architectural visualisation Armstrong Townhouse, and Sławek Krężel for his Dynamic Grass System, which reacts realistically to an array of elements, will each receive a GTX 1080 Ti or GTX 1080. VRFocus will be following further developments with Wrench, so be sure to follow VRFocus for more.