NVIDIA Launches VRWorks 360 Video SDK 2.0

When it is not preparing new lines of virtual reality (VR) ready GPUs with more cores, more processing speed and more raw power than ever before, or filing a whole range of VR-related trademarks connected to those graphics cards, NVIDIA can often be found involved in immersive technology on the software side, one way or another.

Primarily that is through NVIDIA VRWorks, the company’s development kit for VR devs, which provides a suite of APIs, libraries and engines to enable high-end graphics performance when creating things under the VR umbrella. The latest update once again brings improvements to the 360 Video SDK, which has already gotten companies like STRIVR and Pixvana excited.

“When you experience a situation as if you are actually there, learning retention rates can soar,” commented Brian Meek, Chief Technology Officer of STRIVR, on NVIDIA’s official blog. “The new Warp 360 will help ensure our customers stay fully immersed, and the functionality and performance that Turing brings to VRWorks can’t be beat.”

The version 2.0 update accelerates the stitching of 360-degree videos and adds a host of other features that make recording and streaming in 360 degrees a lot easier. It also adds support for NVIDIA CUDA 10 and, of course, the most recent additions to the company’s GPU line-up.

Updates include:

Ambisonic Audio – increases the immersiveness of 360-degree videos by enabling 3D, omnidirectional audio such that the perceived direction of sound sources changes as viewers change their orientation.

Custom Region of Interest Stitch – enables adaptive stitching by defining the desired field of view rather than stitching a complete panorama. This enables new use cases such as 180-degree video while reducing execution time and improving performance.

Improved Mono Stitch – increases robustness and improves image quality for equatorial camera rigs. Multi-GPU setups are now supported for up to 2x scaling.

Moveable Seams – manually adjusts the seam location in the region of overlap between two cameras to preserve visual fidelity, particularly when objects are close to the camera.

New Depth-Based Mono Stitch – uses depth-based alignment to improve the stitching quality in scenes with objects close to the camera rig and improves the quality across the region of overlap between two cameras. This option is more computationally intensive than moving the seams, but provides a more robust result with complex content.

Warp 360 – provides highly optimized image warping and distortion removal by converting images between a number of 360 projection formats, including perspective, fisheye and equirectangular. It can transform equirectangular stitched output into a projection format such as cubemap to reduce streaming bandwidth, leading to increased performance.
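
The kind of projection conversion Warp 360 performs can be illustrated with the underlying math. The sketch below is an assumption for illustration, not NVIDIA’s implementation; the face layout and orientation conventions are arbitrary. It maps a point on one cube face back to the equirectangular pixel it would sample:

```python
import math

def cube_face_to_equirect(face, u, v, width, height):
    """Map normalized coords (u, v) in [-1, 1] on a cube face to
    (x, y) pixel coords in an equirectangular image of width x height.
    The face layout here is illustrative: +z is treated as 'front'."""
    # Direction vector for the given face (right-handed, y up).
    dirs = {
        "front":  ( u, -v,  1.0),
        "back":   (-u, -v, -1.0),
        "right":  ( 1.0, -v, -u),
        "left":   (-1.0, -v,  u),
        "top":    ( u,  1.0,  v),
        "bottom": ( u, -1.0, -v),
    }
    dx, dy, dz = dirs[face]
    # Spherical angles: longitude in [-pi, pi], latitude in [-pi/2, pi/2].
    lon = math.atan2(dx, dz)
    lat = math.asin(dy / math.sqrt(dx * dx + dy * dy + dz * dz))
    # Equirectangular mapping is linear in longitude and latitude.
    x = (lon / math.pi + 1.0) * 0.5 * width
    y = (0.5 - lat / math.pi) * height
    return x, y

# The center of the front face maps to the center of the panorama.
x, y = cube_face_to_equirect("front", 0.0, 0.0, 4096, 2048)  # → (2048.0, 1024.0)
```

A real stitcher iterates this per output pixel (and filters the samples); the point here is only that each cubemap texel corresponds to one direction on the sphere, which is why a cubemap can carry the same content as an equirectangular frame in fewer pixels.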

You can download the latest version of the VRWorks 360 Video SDK here. VRFocus will bring you more news on developments in VR hardware and software throughout the day.

OPTIS Introduces New Software For VR Prototyping

When using digital models to prototype what will eventually be physical objects, it is vitally important that they behave as closely as possible to objects in the real world. With that in mind, OPTIS have announced a new version of the HIM software that integrates video, audio and haptic feedback to create a more realistic simulation.

The HIM software is powered by Nvidia technology, using Quadro GPUs to provide the power needed to create full-scale 3D prototypes with a high level of detail and realism. HIM allows for full-body tracking and motion capture, so products can be evaluated by potential end-users in a virtual environment long before a physical product is produced. Collision detection uses Nvidia PhysX, part of the VRWorks software development kit. HIM also integrates, for the first time, VR audio to create accurate simulations of sound propagation and a more immersive experience.

The main thrust of the OPTIS HIM software is to allow team members who may be scattered across the globe to be more collaborative and become more involved in the various stages of project design and development. With a VR environment it becomes easier to pass across information, as communication barriers are lowered. In addition, questions of safety can also be addressed as HIM allows every stage of production to be modelled, including the production line, so bottlenecks or safety concerns can be tested and corrected before the product is put into production.

“Soon, robots will gain ground in the design process and in industrial manufacturing in general. OPTIS’ solutions facilitate the upstream work, the use of virtual reality favoring human-machine interactions,” says Jacques Delacour, CEO and founder of OPTIS.

VRFocus will continue to bring you news of developments in VR.

NVIDIA Shows How Physically-based Audio Can Greatly Enhance VR Immersion

Positional audio for VR experiences—where noises sound as if they are coming from the correct direction—has long been understood as an important part of making VR immersive. But knowing which direction sounds are coming from is only one part of the immersive audio equation. Getting that directional audio to interact in realistic ways with the virtual environment itself is the next challenge, and getting it right can make VR spaces feel far more real.

Positional audio in some form or another is integrated into most VR applications today (some with better integrations and mixes than others). Positional audio tells you about the direction of various sound sources, but it says nothing about the environment in which the sound is located. That environment is something we are unconsciously tuned to understand, as our ears and brain interpret direct sounds mixed with reverberations, reflections, diffractions, and more complex audio interactions that change with the shape of the space around us and the materials it is made of. Sound alone can give us a tremendous sense of space even without a corresponding visual component. Needless to say, getting this right is important to making VR maximally immersive, and that’s where physically-based audio comes in.
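
The claim that sound alone conveys a sense of space has a classical quantitative counterpart: Sabine’s reverberation formula, which ties a room’s volume and surface absorption to how long sound lingers after the source stops. This is a textbook estimate offered here as an illustration, not part of any NVIDIA SDK:

```python
def sabine_rt60(volume_m3, surfaces):
    """Sabine's classic reverberation-time estimate in seconds:
    RT60 = 0.161 * V / A, where A is the total absorption
    (sum of surface area times absorption coefficient, in m^2 sabins)."""
    absorption = sum(area * coeff for area, coeff in surfaces)
    return 0.161 * volume_m3 / absorption

# A small 5 m x 4 m x 3 m room (60 m^3, 94 m^2 of surfaces).
# Bare concrete rings for seconds; heavy acoustic treatment kills it.
bare = sabine_rt60(60.0, [(94.0, 0.05)])     # ~2.06 s of reverb
treated = sabine_rt60(60.0, [(94.0, 0.60)])  # ~0.17 s of reverb
```

That order-of-magnitude difference in decay time is exactly the cue our ears use to tell a stairwell from a padded booth, which is why simulating it matters for immersion.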

Photo courtesy NVIDIA

Physically-based audio is a simulation of virtual sounds in a virtual environment which accounts for both directional audio and audio interactions with scene geometry and materials. Traditionally these simulations have been too resource-intensive to run quickly and accurately enough for real-time gaming. NVIDIA has dreamt up a solution which runs those calculations on its powerful GPUs, fast enough, the company says, for real-time use even in high-performance VR applications. In the video heading this article, you can hear how much information can be derived about the physical shape of the scene from the audio alone. Definitely use headphones to get the proper effect; it’s an impressive demonstration, especially toward the end of the video, when occlusion is demonstrated as the viewpoint moves around the corner from the sound source.

That’s the idea behind the company’s VRWorks Audio SDK, which was released today during the GTC 2017 conference; it’s part of the company’s VRWorks suite of tools to enhance VR applications on Nvidia GPUs. In addition to the SDK, which can be used to build positional audio into any application, Nvidia is also making VRWorks Audio available as a plugin for Unreal Engine 4 (and we’ll likely see the same for Unity soon), to make it easy for developers to begin working with physically-based audio in VR.

The company says that VRWorks Audio is the “only hardware-accelerated and path-traced audio solution that creates a complete acoustic image of the environment in real time without requiring any ‘pre-baked’ knowledge of the scene. As the scene is loaded by the application, the acoustic model is built and updated on the fly. And audio effect filters are generated and applied on the sound source waveforms.”

VRWorks Audio repurposes the company’s OptiX ray-tracing engine which is typically used to render high-fidelity physically-based graphics. For VRWorks Audio, the system generates invisible rays representing sound wave propagation, tracing the path from its origin to the various surfaces it will interact with, and eventually to its arrival at the listener.
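
A minimal way to see what a single traced "sound ray" contributes is the classic image-source method for one reflecting wall. The sketch below is a toy illustration of the idea (delay and 1/r spreading for a first-order reflection), not NVIDIA's OptiX-based implementation:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def first_order_reflection(source, listener, wall_x):
    """Toy image-source calculation for one wall at x = wall_x:
    mirror the source across the wall, then treat the reflected path
    as a straight line from the image source to the listener.
    Returns (path_length_m, delay_s, 1/r amplitude factor)."""
    sx, sy, sz = source
    image = (2.0 * wall_x - sx, sy, sz)  # source mirrored across the wall
    d = math.dist(image, listener)
    return d, d / SPEED_OF_SOUND, 1.0 / d

# Source and listener 2 m apart, with a wall 3 m behind the source:
# the reflection travels 3 m back plus 5 m forward = 8 m total,
# arriving with ~23 ms of delay versus ~6 ms for the direct path.
length, delay, gain = first_order_reflection((0, 0, 0), (2, 0, 0), -3.0)
```

A real path tracer repeats this for many rays, many bounce orders, and frequency-dependent material absorption, then folds the results into a filter applied to the source waveform; the per-path arithmetic, though, is this simple.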


Road to VR is a proud media sponsor of GTC 2017.

The post NVIDIA Shows How Physically-based Audio Can Greatly Enhance VR Immersion appeared first on Road to VR.

Participants Can Win USD 30,000 in a VR Contest at NVIDIA’s GTC 2017

The GTC 2017 (GPU Technology Conference), sponsored by NVIDIA, takes place from May 8 to 11. At this event in San Jose, California, the sponsor is holding a contest in which the winners will receive prizes worth USD 30,000.

An exhibition of VR products

The contest takes the form of an exhibition of VR-related content. Companies have the opportunity to submit their products and present them later on. However, the focus is on everything except gaming. The VR products must also use NVIDIA GPU technology, such as GameWorks, DesignWorks or VRWorks, and work with a VR headset connected to a computer or workstation. AR products can be submitted alongside VR products as well. The application deadline is March 15.

From the submissions, the organiser will select ten companies or teams to take part in the contest. These will get the chance to present their products during GTC in a space provided for them. They are also required to give a five-minute presentation in front of a jury, which will determine the winners. Afterwards, the audience may ask questions for another three minutes. The jury can also try out the participants’ demos in a pavilion in the NVIDIA VR Startup Village, where all participants will be based during the contest.

Alongside cash and material prizes worth USD 30,000, the winners will also receive PR and marketing support. The companies are additionally to receive financial backing to ensure further development. With this process, NVIDIA is specifically looking for start-ups that so far have no more than USD 5 million in revenue. The search also extends to developers who specialise in VR content outside of gaming, including the fields of science, engineering, education, art and medicine.

The event is also intended to help smaller companies establish and further develop their products on the market. We are excited to see which VR and AR highlights await us.

(Source: NVIDIA)

The post Participants Can Win USD 30,000 in a VR Contest at NVIDIA’s GTC 2017 first appeared on VR∙Nerds.

NVIDIA’s New GeForce GTX 1080 Ti is a VR Beast

As many of you already know, NVIDIA is one of the leading purveyors of graphics cards that are heavily used for VR. In fact, NVIDIA is the go-to solution for the majority of users, so it is always a good day when the company launches a new graphics card, especially a brand-new high-end one. The new GeForce GTX 1080 Ti is the new top dog in NVIDIA’s lineup of graphics cards and is designed to be the fastest card the company has offered to date. Yes, that includes the GeForce GTX Titan X Pascal, previously its fastest and most sought-after graphics card.

The specs for the GPU itself are:

  • 12 billion transistors
  • 1.6 GHz clock speed, 2 GHz boost (1.4 GHz clock, 1.5 GHz boost on Titan X Pascal)
  • 3,584 CUDA cores (same as Titan X Pascal)
  • 352-bit memory bus (384-bit on Titan X Pascal)
  • 11 Gbps memory speed (10 Gbps on Titan X Pascal)
  • 11 GB of RAM (12 GB on Titan X Pascal)
  • 220 W TDP (250 W on Titan X Pascal)
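
Two of those numbers combine into the headline memory figure: peak bandwidth is the bus width in bytes multiplied by the per-pin data rate, which works out to 484 GB/s for the GTX 1080 Ti versus 480 GB/s for the Titan X Pascal despite the Ti's narrower bus:

```python
def peak_bandwidth_gbs(bus_width_bits, speed_gbps):
    """Peak memory bandwidth in GB/s: bytes moved per memory clock
    edge (bus width / 8) times the per-pin data rate in Gbps."""
    return bus_width_bits / 8 * speed_gbps

gtx_1080_ti = peak_bandwidth_gbs(352, 11)     # 44 bytes * 11 = 484.0 GB/s
titan_x_pascal = peak_bandwidth_gbs(384, 10)  # 48 bytes * 10 = 480.0 GB/s
```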

The expectation, according to NVIDIA, is that the GTX 1080 Ti will be 35 percent faster on average than a GTX 1080, which should mean that it will outperform the GTX Titan X Pascal in gaming and VR. The GTX 1080 Ti will ship in March and be available for $699. NVIDIA also killed the DVI port on the new GTX 1080 Ti, which won’t really be missed. The card has three DisplayPort connectors and one HDMI connector, allowing for a three-monitor display configuration plus an HDMI display, which is what I’m running at home right now.

In addition to the new GPU, NVIDIA is also announcing support for VRWorks inside of Unity, including support for VR SLI, Multi-Res Shading, Lens Matched Shading and SPS (Single Pass Stereo). Thanks to the VRWorks features on NVIDIA’s GPUs, benchmarks like Basemark’s VRScore saw as much as a 40% performance uplift. In addition to announcing support for Unity and VR benchmarks, NVIDIA is introducing its own tool to measure VR quality, called FCAT VR. This tool is built on the company’s frame-capture technology and seeks to measure real-world performance and the actual frames displayed to the headset. NVIDIA has also introduced an advanced data analysis tool, the FCAT data analyzer, to allow anyone to analyze a game or application’s behavior.

While we don’t know exactly how much faster it will be than the Titan X Pascal, the expectation is that it will be faster, and significantly faster than the GTX 1080. This is a good thing because it means that VR developers can really start to look at enabling eye-candy features in their applications, and that we can start to think about increasing the resolution of VR HMDs down the road as well. There are already higher-resolution displays out there, and graphics cards like the GTX 1080 Ti are going to be critical to enabling those high resolutions at acceptable frame rates.

Disclosure: My firm, Moor Insights & Strategy, like all research and analyst firms, provides or had provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including NVIDIA and others. I do not hold any equity positions with any companies cited.

Unity’s Main Branch Now Supports NVIDIA VRWorks for Enhanced Rendering Features

The latest release of Unity, version 2017.1 now officially supports NVIDIA’s VRWorks rendering tech. VRWorks contains a number of rendering features unique to the company’s GPUs which are designed to improve performance in VR applications.

Update (7/13/17):  After several months in beta, this week Unity launched its latest main branch release, version 2017.1. Alongside some VR fixes and a few improvements noted in the full release notes, VRWorks support also comes to the main branch for the first time, allowing developers working on the release version of the game engine to install the Nvidia VRWorks plugin to enable a range of VR specific rendering features (noted below) that can improve performance and enhance visuals on Nvidia GPUs.

Now that the main branch supports it, we’d expect VRWorks to remain supported in Unity going forward.


Update (4/26/17, 10:26PM PT): While NVIDIA had formerly made a branch of Unity with VRWorks support available to select developers, the company has now launched a VRWorks plugin on the Unity Asset store which is supported by the latest Unity beta (2017.1.0b2). This makes it easier for developers to enable VR rendering features unique to NVIDIA’s latest GPUs:

  • Multi-Res Shading (Maxwell & Pascal) – renders each part of an image at a resolution that better matches the pixel density of the warped image. Multi-Res Shading uses Maxwell’s multi-projection architecture to render multiple scaled viewports in a single pass, delivering substantial performance improvements.
  • Lens Matched Shading (Pascal) – uses the new Simultaneous Multi-Projection architecture of Pascal-based GPUs to provide substantial pixel shading performance improvements. The feature improves on Multi-res Shading by rendering to a surface that more closely approximates the lens corrected image that is output to the headset display. This avoids the performance cost of rendering many pixels that are discarded during the VR lens warp post-process.
  • Single Pass Stereo (Pascal) – uses the new Simultaneous Multi-Projection architecture of NVIDIA Pascal-based GPUs to draw geometry once, then simultaneously project both right-eye and left-eye views of the geometry. This lets developers effectively double the geometry in VR applications, increasing the richness and detail of their virtual world.
  • VR SLI (Maxwell and Pascal) – provides increased performance for virtual reality apps where multiple GPUs can be assigned a specific eye to dramatically accelerate stereo rendering. With the GPU affinity application programming interface, VR SLI allows scaling for systems with more than two GPUs.
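
To get a feel for why shrinking the outer viewports helps, here is a back-of-the-envelope estimate of pixels saved by a Multi-Res-style 3×3 split. The region sizes and scale factor below are illustrative assumptions, not NVIDIA’s actual defaults:

```python
def multires_pixel_fraction(center_frac, outer_scale):
    """Fraction of pixels shaded relative to a full-resolution target
    when the image is split into a 3x3 grid: a center region covering
    center_frac of each axis stays at full resolution, while the
    surrounding viewports render at outer_scale resolution per axis."""
    center_area = center_frac ** 2
    # Outer ring area, with pixel count scaled by outer_scale squared.
    return center_area + (1.0 - center_area) * outer_scale ** 2

# Illustrative settings: 70% of each axis kept sharp, outer ring at half res.
frac = multires_pixel_fraction(0.7, 0.5)  # 0.6175, i.e. ~38% fewer pixels shaded
```

Since the lens warp compresses the periphery anyway, most of those skipped pixels would have been discarded, which is consistent with the ~30% improvements cited later in the article.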

NVIDIA also maintains a custom branch of Unreal Engine 4 with integrated VRWorks features.

Original Article (11/9/16): As developers explore the limitless potential of VR, performance and efficiency continue to be an essential focus of hardware and software. Unity, one of the most popular game engines for VR development, has been a long-term supporter of the medium, introducing many VR-specific features as the hardware evolved at a frantic pace over the last few years. At GDC 2016, Unity announced they would be adding support for VRWorks, Nvidia’s SDK for optimisation of VR using the company’s GPUs.

At Unite 2016 this month in Los Angeles, this commitment hit an important milestone, with Nvidia providing early access to a version of Unity with native VRWorks support for select VR developers, which includes the four major features for VR graphics optimisation: VR SLI, Multi-res Shading, Lens Matched Shading, and Single Pass Stereo. Developers can apply for early access here. Nvidia says they’re working toward bringing these features into the main branch of Unity. Per the update above, the NVIDIA VRWorks plugin is available now on the Unity Asset Store, supporting Unity 2017.1.0b2 or higher.

Integrated VRWorks support in Unity means faster and easier adoption of VRWorks technologies for developers, which Nvidia says can result in major performance improvements thanks to features unique to their GPUs. Multi-res Shading, which has already featured in custom branches of Unreal Engine, deals with the barrel distortion required to render optically-correct images to a VR headset by rendering multiple viewports across a single render target, using a hardware feature called ‘multi-projection’. By shrinking the outer viewports, the render target becomes much more efficient, offering a 30% improvement in some cases.

Pascal-equipped systems benefit most significantly, as the Simultaneous Multi-Projection technology introduced with the architecture allows VRWorks to perform Lens Matched Shading, where 16 views can be rendered at different angles in a single pass and shaped to closely match the distortion of a lens in a VR headset, resulting in far fewer wasted pixels across the render target. Combined with Single Pass Stereo, which reprojects geometry to a second viewport, this means 32 views are rendered in a single pass, which can deliver a significant increase in pixel shading throughput compared to Maxwell and earlier GPUs.

Ted Carefoot, producer at Cloudhead Games, the studio behind Unity-based The Gallery series, said of the announcement, “Optimizing VR content is always a huge challenge, so we’re very excited to be working with NVIDIA on VRWorks. Features like ‘multi-res’, and ‘lens match’ shading (MRS/LMS) are indispensable tools in the quest to make beautiful, interactive, and deeply immersive virtual worlds.”

Nvidia has also integrated VRWorks into the latest versions of Unreal Engine, Unity’s closest competing game engine for VR development.

The post Unity’s Main Branch Now Supports NVIDIA VRWorks for Enhanced Rendering Features appeared first on Road to VR.

NVIDIA Build of Unity Natively Supports VRWorks

This week Unity Technologies has been holding its annual Unite conference in California. Yesterday VRFocus reported on the company showcasing its latest Editor VR developments, as well as Google using the event to announce the launch date for Daydream View. Now NVIDIA has revealed a new initiative in conjunction with Unity with a specialist build of the game engine for select virtual reality (VR) partners.

Both companies are looking to help studios that use Unity build great immersive videogame experiences, so today they’re giving certain VR partners early access to an NVIDIA version of the Unity engine with native support for the various VRWorks features, including VR SLI, Multi-res Shading, Lens Matched Shading, and Single Pass Stereo.

As an added bonus for studios, the graphics card manufacturer will be introducing a plug-in for NVIDIA Ansel technology to the Unity asset store for free, with more GameWorks plug-ins to come.

“Optimizing VR content is always a huge challenge, so we’re very excited to be working with Nvidia on VRWorks. Features like ‘multi-res’, and ‘lens match’ shading (MRS/LMS) are indispensable tools in the quest to make beautiful, interactive, and deeply immersive virtual worlds,” said Ted Carefoot, producer, Cloudhead Games.

As one of the most commonly used engines in the world, Unity is used by a large portion of VR developers. Direct VRWorks support means integration of its technologies is a lot simpler for developers working on that platform.

“Improvements in performance and quality are of critical importance to VR applications companies like AltspaceVR, and VRWorks is a huge step in the right direction,” commented Eric Romo, CEO, AltspaceVR. “We’re looking forward to integrating these improvements and are excited to work with Nvidia on this next level of VR.”

VR developers interested in getting early access to this build can apply directly to NVIDIA at:  https://developer.nvidia.com/nvidia-vrworks-and-unity.

VRFocus will continue its coverage of NVIDIA and Unity, reporting back with any further announcements.

NVIDIA Releases VR Funhouse Mod Kit Along With Five Playable Mods, All Open Sourced

Today NVIDIA revealed some juicy additions to its open-sourced VR Funhouse application, described as “the world’s most advanced VR game”, which allows users to create more of their own experiences using its GameWorks, PhysX, and VRWorks technology. There is a new editor, a set of mods, and all of it is available to use as of today.

During a press conference yesterday, Victoria Rege, who works on NVIDIA’s VR ventures as a product marketer, shared the news that a mod kit is now available for VR Funhouse, which allows users to access the game’s Unreal Engine 4 (UE4) blueprints and assets to create whatever they like and share it via Steam Workshop. “The idea is that people can create their own levels, they can create their own actual games and experiences using everything that will be provided.”

As well as the new editor, five mods have been added to VR Funhouse itself that expand on the games already available: Super Whack-A-Mole; Zero-Gravity Goo, where the water shot into the clowns’ mouths is replaced with green sludge; Big Top; Great Moles of Fire, where the moles’ hair is replaced with fire; and Tommy Gun, where the pistols in the shooting gallery are replaced with tommy guns. These are all available to download for free in NVIDIA VR Funhouse, which is itself free on Steam.

“It’s not necessarily about building new levels for Funhouse, but we hope that happens. It’s more about enabling the community to build more realistic, more physically accurate types of games that people can enjoy,” said Rege.

VRFocus asked whether or not additional mods would become more frequent, to which Rege replied: “We’re never short of ideas, I actually have lots of requests for Dane, and we are actually going to be working on some downloadable content, some future games that will be announced later in time, so we’re still working on some things. I can’t comment on how many, but we’re definitely going to build and explore.”

As well as these two big pieces of news, NVIDIA also shared that the full source code for VR Funhouse is available on GitHub, so developers can create their own experiences.

There’s no doubting that NVIDIA VR Funhouse has been a hit: it has seen 100,000 downloads, roughly the estimated number of Vives that have been sold. Rege also went on to say that one of the leading developers to have used VR Funhouse is Sólfar Studios, creators of Everest VR, for its realistic snow physics.

For more on the latest news, updates, and features in the world of VR, make sure to check back with VRFocus.