NVIDIA Launches VRWorks 360 Video SDK 2.0

When it's not preparing new lines of virtual reality (VR) ready GPUs with more cores, more processing speed and more raw power than ever before, or filing a whole range of VR-related trademarks connected to said graphics cards, NVIDIA can still often be found involved in immersive technology on the software side, one way or another.

Primarily that is through NVIDIA VRWorks, the company's development kit for VR developers, which provides a suite of APIs, libraries and engines to enable high-end graphics performance when creating for VR. The latest update once again brings improvements to the 360 Video SDK, which has already got companies like STRIVR and Pixvana excited.

“When you experience a situation as if you are actually there, learning retention rates can soar,” commented STRIVR Chief Technology Officer Brian Meek on NVIDIA’s official blog. “The new Warp 360 will help ensure our customers stay fully immersed, and the functionality and performance that Turing brings to VRWorks can’t be beat.”

The new version 2.0 update accelerates the stitching of 360-degree video and adds a host of other features that make recording and streaming in 360 degrees a lot easier. It also adds support for NVIDIA CUDA 10 and, of course, the most recent additions to the company's GPU line-up.

Updates include:

Ambisonic Audio – increases the immersiveness of 360-degree videos by enabling 3D, omnidirectional audio, such that the perceived direction of sound sources changes when viewers change their orientation.

Custom Region of Interest Stitch – enables adaptive stitching by defining the desired field of view rather than stitching a complete panorama. This enables new use cases such as 180-degree video while reducing execution time and improving performance.

Improved Mono Stitch – increases robustness and improves image quality for equatorial camera rigs. Multi-GPU setups are now supported for up to 2x scaling.

Moveable Seams – allows the seam location in the region of overlap between two cameras to be adjusted manually to preserve visual fidelity, particularly when objects are close to the camera.

New Depth-Based Mono Stitch – uses depth-based alignment to improve the stitching quality in scenes with objects close to the camera rig and improves the quality across the region of overlap between two cameras. This option is more computationally intensive than moving the seams, but provides a more robust result with complex content.

Warp 360 – provides highly optimized image warping and distortion removal by converting images between a number of 360 projection formats, including perspective, fisheye and equirectangular. It can transform equirectangular stitched output into a projection format such as cubemap to reduce streaming bandwidth and improve performance (a rough sketch of this kind of remapping follows below).
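To make that projection conversion concrete, here is a minimal, illustrative sketch in plain NumPy of the equirectangular-to-cubemap remapping a tool like Warp 360 accelerates. It does not use the VRWorks API at all; the function name, face conventions and nearest-neighbour sampling are simplifications of our own.

```python
# Illustrative only: this is NOT the VRWorks Warp 360 API, just a plain NumPy
# version of the equirectangular-to-cubemap remapping such a tool accelerates.
import numpy as np

def cubemap_face_from_equirect(equirect, face_size, face="front"):
    """Resample one cubemap face from an equirectangular image of shape (H, W, 3)."""
    h, w, _ = equirect.shape
    # Pixel grid across the face in [-1, 1].
    u, v = np.meshgrid(np.linspace(-1, 1, face_size), np.linspace(-1, 1, face_size))
    ones = np.ones_like(u)
    # Direction vector for each face pixel; "+Z is front" is an arbitrary convention here.
    dirs = {
        "front": (u, -v, ones), "back": (-u, -v, -ones),
        "right": (ones, -v, -u), "left": (-ones, -v, u),
        "up":    (u, ones, v),   "down": (u, -ones, -v),
    }
    x, y, z = dirs[face]
    # Direction -> longitude/latitude -> equirectangular pixel coordinates.
    lon = np.arctan2(x, z)                       # [-pi, pi] maps to image width
    lat = np.arctan2(y, np.sqrt(x * x + z * z))  # [-pi/2, pi/2] maps to image height
    col = ((lon / np.pi) * 0.5 + 0.5) * (w - 1)
    row = (0.5 - lat / np.pi) * (h - 1)
    # Nearest-neighbour lookup keeps the sketch short; real code would filter properly.
    return equirect[row.round().astype(int), col.round().astype(int)]

# Usage: faces = {f: cubemap_face_from_equirect(stitched, 1024, f)
#                 for f in ("front", "back", "left", "right", "up", "down")}
```

The point of shipping this as a GPU-accelerated primitive is that the same per-pixel remapping has to run on every frame of a high-resolution 360 stream, which is exactly the kind of workload that benefits from running on the graphics card rather than the CPU.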

You can download the latest version of the VRWorks 360 Video SDK here. VRFocus will bring you more news on developments in VR hardware and software throughout the day.

 

OPTIS Introduces New Software For VR Prototyping

When using digital models for prototyping what will eventually be physical objects, it's vitally important that they behave as closely as possible to objects in the real world. With that in mind, OPTIS have announced a new version of their HIM software that integrates video, audio and haptic feedback to create a more realistic simulation.

The HIM software is powered by Nvidia technology, using Quadro GPUs to provide the power needed to create full-scale 3D prototypes with a high level of detail and realism. HIM allows for full-body tracking and motion capture so products can be evaluated by potential end users in a virtual environment long before a physical product is produced. Collision detection uses Nvidia PhysX, part of the VRWorks software development kit. HIM also integrates, for the first time, VR audio to create accurate simulations of sound propagation and a more immersive experience.

The main thrust of the OPTIS HIM software is to allow team members who may be scattered across the globe to collaborate and become more involved in the various stages of project design and development. In a VR environment it becomes easier to convey information, as communication barriers are lowered. In addition, questions of safety can be addressed, as HIM allows every stage of production to be modelled, including the production line, so bottlenecks or safety concerns can be identified and corrected before the product goes into production.

“Soon, robots will gain ground in the design process and in industrial manufacturing in general. OPTIS’ solutions facilitate the upstream work, the use of the virtual reality favoring human-machine interactions.” says Jacques Delacour, CEO and founder of OPTIS.

VRFocus will continue to bring you news of developments in VR.

NVIDIA Shows How Physically-based Audio Can Greatly Enhance VR Immersion

Positional audio for VR experiences—where noises sound as if they are coming from the correct direction—has long been understood as an important part of making VR immersive. But knowing which direction sounds are coming from is only one part of the immersive audio equation. Getting that directional audio to interact in realistic ways with the virtual environment itself is the next challenge, and getting it right can make VR spaces feel far more real.

Positional audio in some form or another is integrated into most VR applications today (some use better integrations and mixes than others). Positional audio tells you about the direction of various sound sources, but it misses out completely on telling you about the environment in which the sound is located. That's something we are unconsciously tuned to understand, as our ears and brain interpret direct sounds mixed in with reverberations, reflections, diffractions, and more complex audio interactions that change based on the shape of the environment around us and the materials of that environment. Sound alone can give us a tremendous sense of space even without a corresponding visual component. Needless to say, getting this right is important to making VR maximally immersive, and that's where physically-based audio comes in.

Photo courtesy NVIDIA

Physically-based audio is a simulation of virtual sounds in a virtual environment, which includes both directional audio and audio interactions with scene geometry and materials. Traditionally these simulations have been too resource-intensive to be able to do quickly and accurately enough for real-time gaming. NVIDIA has dreamt up a solution which takes those calculations and runs them on their powerful GPUs, fast enough, the company says, for real-time use even in high-performance VR applications. In the video heading this article, you can hear how much information can be derived about the physical shape of the scene from the audio alone. Definitely use headphones to get the proper effect; it’s an impressive demonstration, especially to me toward the end of the video when occlusion is demonstrated as the viewing point goes around the corner from the sound source.

That’s the idea behind the company’s VRWorks Audio SDK, which was released today during the GTC 2017 conference; it’s part of the company’s VRWorks suite of tools to enhance VR applications on Nvidia GPUs. In addition to the SDK, which can be used to build positional audio into any application, Nvidia is also making VRWorks Audio available as a plugin for Unreal Engine 4 (and we’ll likely see the same for Unity soon), to make it easy for developers to begin working with physically-based audio in VR.


The company says that VRWorks Audio is the “only hardware-accelerated and path-traced audio solution that creates a complete acoustic image of the environment in real time without requiring any ‘pre-baked’ knowledge of the scene. As the scene is loaded by the application, the acoustic model is built and updated on the fly. And audio effect filters are generated and applied on the sound source waveforms.”

VRWorks Audio repurposes the company’s OptiX ray-tracing engine which is typically used to render high-fidelity physically-based graphics. For VRWorks Audio, the system generates invisible rays representing sound wave propagation, tracing the path from its origin to the various surfaces it will interact with, and eventually to its arrival at the listener.
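As a rough illustration of what path-traced acoustics means in practice (and emphatically not the VRWorks Audio or OptiX API), the sketch below traces sound rays through a simple axis-aligned "shoebox" room in plain Python/NumPy: each ray loses energy at every wall bounce, and every time a ray passes near the listener it contributes a delayed, attenuated arrival to an impulse response. All names and parameters here are illustrative assumptions.

```python
# Conceptual sketch only — not the VRWorks Audio or OptiX API. It shows the idea
# behind path-traced acoustics: rays leave the source, lose energy at each wall
# bounce, and their arrivals at the listener build up an impulse response.
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second

def trace_impulse_response(room, source, listener, absorption=0.3,
                           n_rays=5000, max_bounces=20, listener_radius=0.5):
    """Stochastic ray tracing in an axis-aligned 'shoebox' room (dimensions in metres)."""
    rng = np.random.default_rng(0)
    arrivals = []  # (delay in seconds, energy) pairs
    for _ in range(n_rays):
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)                      # random direction on the unit sphere
        pos = np.array(source, dtype=float)
        energy, travelled = 1.0 / n_rays, 0.0
        for _ in range(max_bounces):
            # Distance along d to the nearest wall (walls sit at 0 and room[i] on each axis).
            ts = [((room[i] if d[i] > 0 else 0.0) - pos[i]) / d[i]
                  for i in range(3) if abs(d[i]) > 1e-9]
            t_wall = min(t for t in ts if t > 1e-6)
            # If the ray passes close to the listener before the wall, record an arrival.
            to_l = np.asarray(listener, dtype=float) - pos
            t_hit = float(np.dot(to_l, d))
            if 0.0 < t_hit < t_wall and np.linalg.norm(to_l - t_hit * d) < listener_radius:
                arrivals.append(((travelled + t_hit) / SPEED_OF_SOUND, energy))
            # Advance to the wall, absorb some energy, and reflect by flipping the hit axis.
            pos, travelled = pos + t_wall * d, travelled + t_wall
            energy *= 1.0 - absorption
            hit_axis = int(np.argmin([abs(pos[i] - (room[i] if d[i] > 0 else 0.0))
                                      for i in range(3)]))
            d[hit_axis] = -d[hit_axis]
    return sorted(arrivals)

# e.g. a 10 x 6 x 3 m room with the listener on the far side of the source
ir = trace_impulse_response((10.0, 6.0, 3.0), (2.0, 3.0, 1.5), (8.0, 3.0, 1.5))
```

Convolving a dry source signal with an impulse response built from such arrivals is what adds the sense of room and distance; VRWorks Audio performs a far more sophisticated, GPU-accelerated version of this against the actual scene geometry in real time.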


Road to VR is a proud media sponsor of GTC 2017.


Attendees Can Win $30,000 USD in a VR Competition at NVIDIA GTC 2017

GTC 2017 (the GPU Technology Conference), sponsored by NVIDIA, takes place May 8–11 in San Jose, California. At the event, the sponsor is holding a competition in which the winners will receive prizes worth $30,000 USD.

An exhibition of VR products

The competition takes the form of an exhibition of VR-related content. Companies can submit their products and later present them, though the focus is above all on gaming. The VR products must also use NVIDIA GPU technology such as GameWorks, DesignWorks or VRWorks, and must work with a VR headset connected to a computer or workstation. AR products can be submitted alongside VR products. The application deadline is March 15.

From the submissions, the organiser will select ten companies or teams to take part in the competition. These will get the chance to present their products at a provided booth during GTC. They are also required to give a five-minute presentation before a jury, which determines the winners; afterwards the audience may ask questions for three minutes. The jury can also try out the participants' demos in a pavilion in the NVIDIA VR Startup Village, where all participants are based during the competition.

In addition to cash and material prizes worth $30,000 USD, the winners will also receive PR and marketing support, and the companies are to receive financial backing to support further development. With this process, NVIDIA is specifically looking for start-ups that have so far generated no more than $5 million USD in revenue. The search also covers developers who specialise in VR content outside of gaming, including the fields of science, engineering, education, art and medicine.

The event is also intended to help smaller companies establish their products on the market and develop them further. We can look forward to seeing which VR and AR highlights await us.

(Source: NVIDIA)


NVIDIA’s New GeForce GTX 1080 Ti is a VR Beast

As many of you already know, NVIDIA is one of the leading purveyors of graphics cards that are heavily used for VR. In fact, NVIDIA is the go-to solution for the majority of users, so it is always a good day when the company launches a new graphics card, especially a brand new high-end one. The GeForce GTX 1080 Ti is the new top dog in NVIDIA's lineup of graphics cards and is designed to be the fastest card that the company has offered to date. Yes, that includes the GeForce GTX Titan X Pascal, until now its fastest and most sought-after graphics card.

The specs for the GPU itself are:

  • 12 Billion transistors
  • 1.6 GHz clock speed, 2 GHz boost (1.4 GHz base, 1.5 GHz boost on Titan X Pascal)
  • 3584 CUDA Cores (same as Titan X Pascal)
  • 352-bit memory bus (384-bit on Titan X Pascal)
  • 11 Gbps memory speed (10 Gbps on Titan X Pascal)
  • 11 GB of RAM (12 GB on Titan X Pascal)
  • 220 Watt TDP (250 on GTX Titan X Pascal)

The expectation is that the GTX 1080 Ti will be 35 percent faster on average than a GTX 1080, according to NVIDIA, which should mean that it will outperform the GTX Titan X Pascal in gaming and VR. The GTX 1080 Ti will ship in March and be available for $699. NVIDIA also killed the DVI port on the new GTX 1080 Ti, which won't really be missed. It has three DisplayPort connectors and one HDMI connector, allowing for a three-monitor DisplayPort configuration plus an HDMI display, which is what I'm running at home right now.

In addition to the new GPU, NVIDIA is also announcing support for VRWorks inside of Unity, including support for VR SLI, Multi-Res Shading, Lens Matched Shading and Single Pass Stereo (SPS). Thanks to the VRWorks features on NVIDIA's GPUs, benchmarks like Basemark's VR Score saw as much as a 40% performance uplift. Also, in addition to announcing support for Unity and VR benchmarks, NVIDIA is introducing its own tool to measure VR quality called FCAT VR. The tool is built on the company's frame-capture technology and measures real-world performance by looking at the actual frames delivered to the headset. They've also introduced the FCAT data analyzer, an advanced analysis tool that lets anyone examine a game or application's behavior.
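To illustrate the kind of measurement a frame-capture tool like this performs (this is not FCAT VR itself, just a hedged sketch with made-up names), the snippet below takes the timestamps at which new frames reached a 90 Hz headset and counts refresh intervals that had to reuse a stale frame:

```python
# Illustrative sketch, not the actual FCAT VR tool: given the timestamps at which
# frames were delivered to a 90 Hz headset, count refresh intervals with no new
# frame (these appear to the user as dropped or reprojected frames).
REFRESH_HZ = 90
VSYNC = 1.0 / REFRESH_HZ

def frame_delivery_stats(frame_timestamps):
    """frame_timestamps: sorted times (seconds) at which new frames reached the display."""
    if len(frame_timestamps) < 2:
        return {"delivered": len(frame_timestamps), "missed_intervals": 0}
    missed = 0
    for prev, cur in zip(frame_timestamps, frame_timestamps[1:]):
        # A gap of more than one vsync interval means at least one refresh showed a stale frame.
        missed += max(0, round((cur - prev) / VSYNC) - 1)
    duration = frame_timestamps[-1] - frame_timestamps[0]
    return {
        "delivered": len(frame_timestamps),
        "missed_intervals": missed,
        "effective_fps": (len(frame_timestamps) - 1) / duration,
    }

# e.g. one frame every vsync, then a single stall of three refresh intervals:
times = [i * VSYNC for i in range(10)] + [12 * VSYNC]
print(frame_delivery_stats(times))
```

Average frame rate alone hides these stalls; counting missed refresh intervals is what separates a smooth VR experience from one that merely hits a good average, which is the behaviour a tool of this sort is meant to expose.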

While we don't know exactly how much faster it will be than the Titan X Pascal, the expectation is that it will be faster, and significantly faster than the GTX 1080. This is a good thing because it means that VR developers can really start to look at enabling eye-candy features in their applications, and that we can start to think about increasing the resolution of VR HMDs down the road as well. There are already higher-resolution displays out there, and graphics cards like the GTX 1080 Ti are going to be critical to enabling those high resolutions at acceptable frame rates.

Disclosure: My firm, Moor Insights & Strategy, like all research and analyst firms, provides or had provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including NVIDIA and others. I do not hold any equity positions with any companies cited.
