Adobe Answers Facebook With ‘Sidewinder’ Volumetric Video Technique

Earlier this year Facebook debuted a technique for capturing volumetric content from a 360-degree camera, meaning you can move freely around inside the footage to see the action from different angles. This week Adobe showed off a project that produces a very similar effect.

The concept is called Project Sidewinder and it was presented on stage during Adobe’s MAX creators’ conference in Las Vegas. The project may or may not become part of the company’s products; it was presented as one of 11 concepts this week showcasing Adobe’s future-facing efforts to help creators work more efficiently. Another VR-related concept Adobe showed, called SonicScape, looks like it could easily fit into the workflow of people using Adobe’s Premiere video editing software.

The techniques from both Adobe and Facebook look extremely limited, with stretched and distorted artifacts becoming increasingly visible the further you move from the camera’s actual position. Nonetheless, for small movements in VR the techniques can powerfully enhance a person’s sense of presence in captured content.

Check it out in the video from MAX as Silicon Valley’s Kumail Nanjiani tests out the feature.

Adobe’s Project SonicScape Visualizes Audio For Easier 360 Editing

Adobe previewed a concept it is working on that would make it easier for creators working on VR videos to place and align sound.

Producing high-quality 360-degree video content has traditionally been a difficult affair at every stage of production, from capture to delivery. However, a constant stream of new cameras, editing tools and streaming techniques is arriving to make the process easier.

Adobe’s SonicScape visualizes the location of audio within a spherical VR video using colorful bubbles. The visualization makes it easy to click and drag a sound to align it with the video, which could be ideal for cases where a microphone’s audio isn’t synced to the same location as the picture. The tool could make it easy for creators to enhance how immersive the sound is in their 360-degree projects, so that when you turn your head while wearing a head-mounted display the audio seems to come from the right spot. It’s also possible to place additional sound effects in specific spots within the sphere, which could make it easier for creators to layer complex soundscapes into their projects, pulling from a library of effects.
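
For a sense of the geometry involved, here is a minimal sketch of what click-and-drag placement implies: mapping a point on an equirectangular frame to a 3D direction, then deriving crude head-relative stereo gains from it. This is illustrative Python under our own assumptions, not Adobe’s implementation; every function name here is made up for the example.

```python
import math

def pixel_to_direction(x, y, width, height):
    """Map a pixel in an equirectangular frame to a unit direction vector.
    Yaw spans [-pi, pi] across the width; pitch spans [pi/2, -pi/2] down the height."""
    yaw = (x / width - 0.5) * 2.0 * math.pi
    pitch = (0.5 - y / height) * math.pi
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def stereo_gains(source_dir, head_yaw):
    """Rotate the source direction into head space, then derive equal-power
    left/right gains from its azimuth (a crude stand-in for real spatialization)."""
    dx, dy, dz = source_dir
    cos_y, sin_y = math.cos(-head_yaw), math.sin(-head_yaw)
    rx = cos_y * dx + sin_y * dz            # rotate about the vertical axis
    rz = -sin_y * dx + cos_y * dz
    azimuth = math.atan2(rx, rz)            # 0 = straight ahead
    pan = math.sin(azimuth)                 # -1 = hard left, +1 = hard right
    return math.sqrt(0.5 * (1.0 - pan)), math.sqrt(0.5 * (1.0 + pan))

# A sound dragged three quarters of the way across a 3840x1920 frame sits
# 90 degrees to the viewer's right, so the right gain should dominate:
direction = pixel_to_direction(2880, 960, 3840, 1920)
print(stereo_gains(direction, head_yaw=0.0))   # ~ (0.0, 1.0)
```

When the viewer turns to face the sound (a head yaw matching the source’s yaw), the gains even out, which is the head-tracking behavior described above.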

The concept was revealed at Adobe’s MAX conference in Las Vegas as part of the company’s forward-looking projects, and may or may not be incorporated into other Adobe tools. Last year, Adobe premiered tools in a similar concept format that were ideal for its Adobe Premiere Pro video editing suite, enabling creators to easily play back and preview 360-degree content while wearing a head-mounted display and using hand controls like Oculus Touch. Those tools recently started shipping as part of the actual Adobe product line; if SonicScape similarly improves the workflow for content creators, it could find its way into a future iteration of Adobe’s products.

Check out the video below for a look at the software:

Adobe Premiere Pro: Work Directly in VR with Project CloverVR

Adobe Premiere Pro has established itself as a popular tool for video editing. Now the software company is making a stronger push into VR: with Project CloverVR, the developers are releasing a new interface that lets you edit 360-degree videos entirely in virtual reality.

Adobe Project CloverVR: Editing in VR

Adobe had already shown the interface at its 2016 sneak peek, though not all of the ideas presented there have been implemented yet. CloverVR made the cut, however, and is now available for download. The new VR interface is meant to make editing 360-degree content considerably faster, since users can, for example, check all cuts and orientations for suitability directly in the VR view.

Adobe Premiere Pro CC is available only by subscription and costs just under 24 euros per month. For the complete package with all Creative Cloud applications, Adobe currently charges just under 60 euros including VAT. Creative Cloud includes heavyweights such as Adobe Photoshop and the layout program Adobe InDesign; in total, the developer lists more than 20 applications.

Adobe MAX is currently underway in Las Vegas. There, for example, Nvidia is demonstrating that its GPUs can handle real-time 8K editing in Adobe Premiere Pro CC, using the current generation of Quadro workstation graphics cards.

(Road to VR)

Adobe Tech Could Give Cheap 360 Cameras 6 DOF Upgrades

Virtual reality cameras have been hot hardware in the industry this past month and now Adobe is bringing innovative new software to the table as well.

Earlier this week, Variety reported that Adobe’s head of research, Gavin Miller, has unveiled a new process for 360 video post-processing.

Adobe’s system can apparently convert standard monoscopic 360 video into a much more compelling format that includes three-dimensional visuals. This process can reportedly also bring 6 DOF (degrees-of-freedom, the ability to lean in toward objects and take steps in any direction) explorability to 360 videos inside a VR headset.

These are both features highlighted by Facebook’s new x24 and x6 VR cameras, which the social media giant unveiled last week. With Adobe’s approach, however, you won’t need to shell out big time for a bleeding edge rig. You’ll simply be able to upgrade the footage from a less expensive 360 cam using what Miller refers to as a “structure-from-motion” algorithm.

In a research paper published by the Adobe Research team, more details are provided about how this algorithm works. An excerpt from the paper’s introduction reads:

“We present an algorithm that enhances monoscopic 360-videos with a 6-DOF and stereoscopic VR viewing experience. Given an input monoscopic 360-video, in an offline stage we infer the camera path and the 3D scene geometry by adapting standard structure-from-motion techniques to work with 360 videos.

We then playback the input video in a VR-headset where we track the 6-DOF motion of the headset and synthesize novel views that correspond to this motion. We synthesize a new view for each eye in parallel to achieve the stereoscopic viewing experience. Our main contribution is a novel warping algorithm that synthesizes novel views on the fly by warping the original content.

Unlike any other previous method, this warping technique directly works on the unit sphere and therefore minimizes distortions when warping spherical panoramas. Moreover, we optimize our warping solution for GPUs and achieve VR frame rates.”
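
The heart of that description, lifting each pixel onto the reconstructed geometry and re-observing it from the tracked head position, can be sketched briefly. The snippet below is a toy CPU-side depth-based reprojection, assuming per-pixel depth has already been recovered; Adobe’s actual contribution is a GPU-optimized warp on the unit sphere, which this sketch does not reproduce, and the function names are ours.

```python
import numpy as np

def sphere_directions(width, height):
    """Unit view directions for every pixel of an equirectangular frame."""
    yaw = (np.arange(width) / width - 0.5) * 2.0 * np.pi
    pitch = (0.5 - np.arange(height) / height) * np.pi
    yaw, pitch = np.meshgrid(yaw, pitch)
    return np.stack([np.cos(pitch) * np.sin(yaw),
                     np.sin(pitch),
                     np.cos(pitch) * np.cos(yaw)], axis=-1)   # (H, W, 3)

def reproject(depth, eye_offset):
    """Lift every pixel to a 3D point via its depth (H x W, in meters), then
    re-observe those points from a head translated by eye_offset (3-vector).
    Returns per-pixel unit directions as seen from the new viewpoint."""
    height, width = depth.shape
    points = sphere_directions(width, height) * depth[..., None]
    rel = points - np.asarray(eye_offset, dtype=float)
    return rel / np.linalg.norm(rel, axis=-1, keepdims=True)

# Leaning 10 cm to the right shifts nearby pixels more than distant ones,
# which is exactly the parallax a 6-DOF view needs. Running this once per
# eye, with two slightly different offsets, yields the stereoscopic pair.
depth = np.full((960, 1920), 5.0)        # a toy scene 5 m away everywhere
new_directions = reproject(depth, [0.10, 0.0, 0.0])
```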

The bar for cutting-edge features moves forward quickly in immersive tech. Last year, the HTC Vive proved that once consumers get wind of a compelling feature like hand controls or room-scale, other companies are on the clock to reach parity and stay competitive.

Now that 3D 6 DOF has been proven possible for 360 video, anything else will likely seem sub-par. Miller officially presented the algorithm this week on a panel at NAB 2017.

Facebook Unveils Two New Volumetric Video ‘Surround360’ Cameras, Coming Later this Year

Facebook today announced two new additions to the Surround360 hardware initiative that are poised to make 360 video more immersive. Unveiled at the company’s yearly developer conference, F8, the so-called x24 and x6 cameras are said to capture 360 video with depth information, giving captured video six degrees of freedom (6DoF). This means you can not only look around via pitch, yaw and roll as before, but also move your vantage point up/down, left/right and forwards/backwards while in a 360 video.
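
To keep the six degrees straight: a conventional 360 player applies only the three rotational degrees (pitch, yaw, roll), while a 6DoF player also applies the three translational ones. A minimal sketch of the difference, with illustrative names of our own choosing:

```python
import numpy as np

def rotation_from_ypr(yaw, pitch, roll):
    """Rotation matrix from yaw (about Y), pitch (about X), roll (about Z)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return Ry @ Rx @ Rz

def world_to_head(point, yaw, pitch, roll, position=(0.0, 0.0, 0.0)):
    """A 3DoF viewer applies only the rotation; a 6DoF viewer also subtracts
    the tracked head position before rotating, so stepping sideways changes
    what you see instead of dragging the whole scene along with you."""
    R = rotation_from_ypr(yaw, pitch, roll)
    return R.T @ (np.asarray(point, dtype=float) - np.asarray(position, dtype=float))
```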

Even the best stereoscopic 360 videos can’t currently provide this sort of movement, so the prospect of a small, robust camera that can is pretty exciting. Let’s face it: when you’re used to engaging with the digital world thanks to the immersive, positional tracking capabilities of the Oculus Rift, HTC Vive, or PSVR, you really notice when it’s gone. Check out the gif below to see exactly what that means.

Surround360 was originally announced at last year’s F8 as an open source hardware platform and rendering pipeline for 3D 360 VR video that anyone could construct or iterate on, but Facebook is taking its new reference designs in a different direction. While Facebook doesn’t plan on selling the 360 6DoF cameras directly, the company will be licensing the x24 and x6 designs—named to indicate the number of on-board sensors—to a select number of commercial partners. Facebook says a product should emerge sometime later this year.

The rigs are smaller than the original Surround360, now dubbed Surround360 ‘Open Edition’, but more importantly they are far smaller than unwieldy volumetric capture setups like HypeVR’s high-end camera/LIDAR rig.

Specs are still thin on the ground, but the x24 appears to be around 10 inches in diameter (257mm at its widest, 252mm at its thinnest), and is said to capture full RGB and depth at every pixel in each of the 24 cameras. It is also said to oversample 4x at every point in full 360, providing “best in-class image quality and full-resolution 6DoF point clouds.”

The x6, although not specified, looks to be about half the diameter at 5 inches, and is said to oversample by 3x. No pricing info has been made public for either camera.

Facebook says depth information is captured for every frame in the video, and because it outputs in 3D, video can be fed into existing visual effects (VFX) software tools to create a mashup of live-action capture and computer-generated imagery (CGI). Take a look at the gif below for an idea of what’s possible.
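
As a rough illustration of how per-pixel depth enables those mashups, a single 360 RGB-D frame can be lifted into a point cloud and saved as an ASCII PLY file, a format most VFX packages can import. This sketch assumes an equirectangular layout and metric depth; it is not Facebook’s pipeline, just the general principle.

```python
import math

def write_ply_point_cloud(rgb, depth, path):
    """Export one equirectangular RGB-D frame as an ASCII PLY point cloud.
    rgb is H x W x 3 with 0-255 colors; depth is H x W in meters."""
    height, width = len(depth), len(depth[0])
    points = []
    for y in range(height):
        pitch = (0.5 - y / height) * math.pi
        for x in range(width):
            yaw = (x / width - 0.5) * 2.0 * math.pi
            d = depth[y][x]
            # Direction on the unit sphere, scaled by depth, gives the 3D point.
            px = d * math.cos(pitch) * math.sin(yaw)
            py = d * math.sin(pitch)
            pz = d * math.cos(pitch) * math.cos(yaw)
            r, g, b = rgb[y][x]
            points.append(f"{px:.4f} {py:.4f} {pz:.4f} {r} {g} {b}")
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        f.write("\n".join(points) + "\n")
```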

Creating good-looking 6DoF 360 video is still an imperfect process though, so Facebook is also partnering with a number of post-production companies and VFX studios to help build out workflows and toolchains. Adobe, Otoy, Foundry, Mettle, DXO, Here Be Dragons, Framestore, Magnopus, and The Mill are all working with Facebook in some capacity.

“We’ve designed with Facebook an amazing cloud rendering and publishing solution to make x24’s interactive volumetric video within reach for all,” said Jules Urbach, Founder & CEO of Otoy. “Our ORBX ecosystem opens up 28 different authoring and editing tools and interactive light field streaming across all major platforms and browsers. It’s a simple and powerful solution this game-changing camera deserves.”

Keep an eye on this article, as we’ll be updating information as it comes in.

Adobe Plans to Integrate with HoloLens and Amazon Alexa

Adobe have unveiled new technology to integrate their advertising products with Amazon’s digital assistant Alexa and Microsoft’s HoloLens mixed reality (MR) headset.

The new technology uses Adobe Sensei, a machine learning platform, to facilitate the integration. Adobe suggest that with the new technology, someone wearing the HoloLens could stand in Times Square in New York and see the iconic billboards around them all displaying personalised adverts.

The HoloLens integration would also allow a retail employee using the technology to see which products within a store are doing well, in order to better emphasise different products and improve store layout.

Adobe also want to personalise the experience for those using Amazon Alexa: for example, a user could ask Alexa how many air miles they have, with the answer drawn from Adobe’s Experience Cloud. Alexa and the Experience Cloud could then combine this information to alert the user when relevant offers or promotions become available.

Adobe are experimenting heavily with virtual reality and augmented reality advertising and analytics. They are working on technology to introduce interactive adverts to the VR cinema viewing experience, which they showed at Mobile World Congress in Barcelona in February.

You can watch a video going into detail on how the integration will work below.

VRFocus will continue to bring you news on Adobe’s VR projects.

Adobe Developing VR Advertising Systems

Adobe is showing off a project at Mobile World Congress (MWC) in Barcelona this week concerning advertising in virtual reality (VR). At the moment it is only a prototype from Adobe’s research labs and involves watching 2-D videos theatre-style.

Presently the adverts work like this: if you are watching a video in a virtual theatre such as those provided by Netflix and Hulu, the system will interrupt the video, much like traditional TV adverts, but it can also show additional content to either side of or above the user, such as further information on the product, access to coupons, or even the opportunity to tweet from within the VR world.

Adobe Primetime Director of Product Management Campbell Foster said in an interview with Variety: “With 360, it’s not clear what content is going to look like beyond gaming. For entertainment, VR is a lot more promising than AR.”

Since the product is still in early stages, it’s currently focused on the theatre-style viewing, and is optimised for use with mobile VR platforms. Mr. Foster also said that it was possible that VR adverts could eventually take advantage of the features currently used in Adobe’s traditional advertising, such as detailed analytics and consumer targeting.

Mr. Foster also suggested that when the ad unit was developed further, it might be possible to ‘beam’ the user out of the theatre setting and place them somewhere else for the duration of the advert, and then return them when the advert was complete.

For further news on Adobe’s VR projects, keep an eye on VRFocus.

Adobe and Goldsmiths Research Highlights Focused Use of VR & AR in Brands’ Christmas Marketing

Marketing agencies have been steadily expanding their use of immersive technologies like virtual reality (VR) and augmented reality (AR) for a while now, utilising them to spread brand awareness. This is even more evident in the run-up to Christmas, as new research conducted by Adobe and Goldsmiths, University of London indicates.

The study found that over a quarter (27 percent) of consumers now expect brands to use VR and AR as part of their Christmas experiences. And of the more than 500 UK marketers surveyed while preparing their Christmas campaigns, over two thirds (68 percent) felt that using technologies like VR gives brands a competitive edge. 32 percent also agreed that the tech helps drive customer loyalty to the brand, and more than half (55 percent) believe it is useful in attracting potential customers.

But there is a flip side to this interest in VR and AR marketing. While marketers are interested in the opportunities VR holds, they don’t expect widespread usage until at least next Christmas. 32 percent said that implementing campaigns is still too difficult at present, with suggestions that budgets and lack of knowledge are among the key barriers to implementation. Tying into this, actual awareness of VR is still relatively low: a YouGov poll of over 2,000 UK adults found that 44 percent admitted to not having seen these technologies used around the festive season. But the poll did find that 29 percent of consumers would be interested in seeing these technologies used in future, with 22 percent saying emerging technologies would attract them to a specific brand at Christmas.

John Watton, EMEA Marketing Director, Adobe says: “Our research has revealed that both marketers and consumers are only just beginning to get to grips with emerging technologies like VR and AR. The demand is increasing, but many organisations are still evaluating whether they have a viable place in their marketing strategies. There can be no better time than Christmas to get a better understanding of how, or indeed if, brands are creating new and deeper connections with potential and existing customers using these technologies – in an era when the customer experience is everything. The examples explored in this report really bring to life the possibilities of The Future of Experience and I am excited to see how brands continue to innovate around the customer experience in 2017.”

“This Christmas, we’re seeing early adopters and progressive organisations harness the power of emerging technologies to engage customers and extend the reach of their brands. However, the relative lack of awareness and readiness amongst the wider marketing community for harnessing the power of the emerging technologies driving empathetic customer experiences like serendipity and adaptability is staggering,” said Dr Chris Brauer, Director of Innovation at Goldsmiths, University of London. “The research shows widespread lack of recognition and emphasis on the need for brands to market both through and to smart machines for meaningful engagement with customers. There is a revolution in marketing underway and the exceptional use cases in our research demonstrate progressive brands experimenting and shaping the possibilities before the Future of Experience becomes the mainstream present in 2017.”

VRFocus has reported on many instances of VR being used for marketing, and as adoption of the tech grows so will its advertising potential. As further developments continue VRFocus will bring you the latest updates.

Adobe Presents Project Dali

Adobe appears to be working on its own solution for painting in virtual space and has presented Project Dali on its blog. In Adobe’s video, the application resembles Google’s Tilt Brush and Oculus’ Quill, but the palette of tools and brushes already appears to be very extensive.

In the post, Adobe says that it has so far been difficult to bring drawings and models into virtual space when working only at a monitor. Adobe wants to solve this problem with Project Dali by letting artists work directly in the virtual world to create their pieces. The goal is to lower hurdles for artists so that the technology no longer stands in the way of the idea. The application is still at an early stage, and its added value over Tilt Brush is not yet immediately apparent.

For now, Adobe is not releasing any further details about the program, nor any word on when we will learn more about the project. In the video, the artist paints only with the HTC Vive controllers, but Adobe will surely also work on support for the Oculus Rift and Oculus Touch controllers.
