INT Announces 2,228 PPI High Pixel Density AMOLED for VR Headsets

When we saw the news that JDI, the Japanese display conglomerate founded by Sony, Toshiba, and Hitachi, was developing a 1,001 ppi LCD display for VR headsets, it was clear that the race for ever-higher pixel densities was still alive and well. Now we learn that INT, a Taiwan-based display design firm, is developing a 2,228 ppi AMOLED designed specifically for VR headsets.

Announced in a fairly sparse press release, the company’s 2,228 ppi AMOLED remains largely a mystery; INT hasn’t provided any specs beyond the display’s pixel density and the fact that it’s built on a glass substrate.

High pixel densities are necessary to reduce the screen door effect – the visible gaps between pixels which, when magnified by VR lenses, become much more apparent. The company also says the glass-based display “is much more economical and can be made in larger size, thus improve FOV significantly.”

Image courtesy INT

Because INT hasn’t detailed the size of the panel, it’s impossible to say where it fits on the spectrum of VR hardware. It could be a ~3-inch panel that would essentially replace standard displays like those found in the Oculus Rift or HTC Vive, or an incredibly small microdisplay destined for headsets such as Varjo’s ‘bionic display’, which uses two displays per eye: a standard-resolution ‘context display’ and a much smaller but higher-ppi ‘focus display’ that is mirrored onto the fovea region of the eye and synced via eye-tracking to increase the perceived overall resolution.

However, considering the company says the display can be produced in larger sizes, we’re hopeful that means it can be manufactured in a more wide-reaching standard display size.

For comparison, both the Vive and Rift use a pair of 1080 × 1200 displays at ~456 ppi. The current market leaders in pixel density are the Samsung Odyssey and HTC Vive Pro, which share the same Samsung-built panel at ~615 ppi. The new INT display should therefore have around 390% higher ppi than the Rift/Vive, and around 260% more than the Vive Pro/Odyssey: a staggering increase that would likely require foveated rendering, a technique that renders a VR scene at its highest resolution only at the center of the user’s gaze, where the photoreceptor-dense fovea looks.
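For the curious, those percentages are simple percent-increase arithmetic. Here’s a quick sketch in Java using the approximate ppi figures above:

```java
public class PpiComparison {
    // Percent increase of one pixel density over another.
    static double percentIncrease(double newPpi, double oldPpi) {
        return (newPpi - oldPpi) / oldPpi * 100.0;
    }

    public static void main(String[] args) {
        // INT's 2,228 ppi panel vs. today's headset displays
        System.out.printf("vs. Rift/Vive (~456 ppi): +%.0f%%%n",
                percentIncrease(2228, 456)); // ~+389%, i.e. around 390%
        System.out.printf("vs. Vive Pro/Odyssey (~615 ppi): +%.0f%%%n",
                percentIncrease(2228, 615)); // ~+262%, i.e. around 260%
    }
}
```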

The image below (from JDI) demonstrates the dramatic increase in acuity that can be had from such pixel-dense displays.

Image courtesy JDI

Dubbed UHPD (Ultra High Pixel Density), the display is slated to be shown at the Poster Session of the upcoming SID Display Week, which takes place May 22–24 in Los Angeles, CA.

JDI and Google are also presenting high-pixel-density displays at SID Display Week, with Google set to show its 1,443 ppi on-glass OLED there as well. We’ll certainly be reporting on whatever comes out of the event, so check back then.

The post INT Announces 2,228 PPI High Pixel Density AMOLED for VR Headsets appeared first on Road to VR.

How To Get Any Android App Running On A Daydream Standalone VR Headset

One of the most intriguing hidden features of the first Daydream standalone headset — the Lenovo Mirage Solo — is its ability to play normal Android apps in a flat 2D window.

You can even interact with the apps by using the simple Daydream pointer controller to ‘touch’ a virtual touchscreen. If you have a compatible USB-C dongle, you can even hook up a wired keyboard. Here’s what PUBG looks like:

We haven’t successfully tested a gamepad yet. Also, though the 2D Netflix app appears to support downloads on the go, at the time of testing, DRM restrictions appeared to keep those videos from playing properly.

It takes a few steps to get this up and running, and the apps don’t appear in your library when you’re done. You have to launch them by going to a settings menu and selecting a link to the app’s Play Store listing; you can launch the app from that page.

Here’s what you need to do to enjoy Android apps on a Daydream standalone:

1. Visit play.google.com in a browser and log into the same Google account that is also logged into the Daydream standalone.

2. Search for the app you want.

3. Buy/install the app and select the “Lenovo” device from the menu.

4. The Daydream headset should start downloading the app (assuming it is connected to Wi-Fi). You can check the status by putting on the headset and pressing the bottom button on the Daydream controller.

5. You can also check the download status in detail by clicking on it. Once the app is downloaded, go to your settings from this same menu by clicking the gear icon in the top right corner.

Click “all settings”.

Make sure you remember how you got to “Apps & notifications” because you’ll be visiting this menu a lot to a) grant app permissions and b) open your apps.

If you have a lot of apps, click “See all apps”.

If you need to grant the app permissions, do so first. Then click on the app you want to launch.

Swipe down on the pad to get to “App details” and that’ll open up the Google Play Store page.

Click the “open” button and enjoy your Android library in VR. 

Google Announces VR Labs For Remote Learning

It has already become clear that immersive technology such as virtual reality (VR) has a lot to offer the world of education. Several higher education facilities have begun to experiment with the technology, and at Google I/O, it was announced that Google will be rolling out a VR laboratory that can be used by remote learning students.

For many modern students, education needs to be balanced with part-time or full-time work. Those students can benefit greatly from taking online classes, but they still face obstacles with practical activities, which is where VR comes in.

Google VR Labs / Labster

For courses such as biology, there are few alternatives to practical experience. A practical lab led by an experienced teacher is a vital part of many qualifications, but this leaves out remote students who may be unable to attend these practicals.

In order to enable these students to participate in practical classes, Google has teamed up with a company called Labster to create a VR laboratory, meaning colleges and universities will be able to offer fully online, remote courses in subjects such as biology.

The Google and Labster technology, built on the Google Daydream platform, uses advanced simulations and mathematically accurate equations to reflect real-world outcomes. This has cost advantages for academic institutions, as the VR simulations are cheaper than the equivalent lab equipment.

Remote students will be able to engage with more courses, meaning that institutions can offer more places to students. The students will also be able to take advantage of lab time when and where they want, for as long as they want – not a privilege available with a real-world physical lab.

Google VR Labs / Labster

32% of students took an online course during 2017, so the provision of VR online learning has many implications for the future of learning. VRFocus will continue to bring you news on VR in training and education.

Google Open Sources Seurat To Bring PC-Level VR To Mobile

This year’s Google I/O developer conference might not have had much to share about VR, but one of the biggest reveals of last year’s event is now available to all.

Last week Google made its Seurat VR tool open source for anyone to use. For those that don’t know, Seurat is designed to render high-fidelity scenes on mobile and standalone VR headsets in real time. The system has already been used to achieve PC-level graphics in a Star Wars VR experience seen last year, as well as in Blade Runner: Revelations, which launched alongside Google and Lenovo’s standalone VR headset, the Mirage Solo, late last week.

Seurat achieves this high quality by identifying assets and resources that aren’t immediately visible to a VR user and then stripping them from the given scene, freeing up processing power to produce higher-quality textures and more. As Google’s VP of VR and AR, Clay Bavor, noted in a tweet, Seurat helped turn one scene in Blade Runner from 45 million polygons into 300,000, drastically reducing demand on the phone.

This could be a vital tool in helping mobile VR catch up with its more powerful PC-based siblings, which is especially exciting as headsets like Oculus Go start to make the medium more accessible than it’s ever been before.

Seurat isn’t the only area in which Google is looking to push increased fidelity in VR, though. Earlier this year the company also released a PC VR experience showcasing its work in light-field capture, which recreates real-world environments with stunning realism. Exactly which of these many technologies ends up being widely adopted by developers remains to be seen.

Google I/O 2018: Cloud Anchors and AR Maps

Google kicked off its I/O 2018 conference yesterday and used the stage to present new augmented reality features and applications for Android and iOS. In the future, AR Maps will assist you with navigation, and Cloud Anchors will enable AR apps that multiple users can use and view at the same time.

AR Maps

With AR Maps, Aparna Chennapragada, VP of Product for AR/VR at Google, showed how Google will soon navigate us comfortably through cities on foot. A combination of the smartphone’s camera, ARCore, Street View, and Maps allows the correct route to be displayed directly as a hologram in the real world. You can either rely on plain arrows or have a little fox show you the way. And that’s not all: soon you won’t have to search for the reviews of a nearby restaurant anymore; instead, the reviews will hang as a sign directly on the corresponding location. Since pure GPS data isn’t sufficient for such an undertaking, Google relies on a VPS (visual positioning system) to determine position and orientation more precisely, using distinctive features of the real world for accurate localization. The system is also said to work indoors.

Cloud Anchors

With Cloud Anchors, Google yesterday introduced a new ARCore feature that makes it possible to share AR content with multiple people. Not only is a game or its content shared, but also its exact position. At the conference, Google showed how Just a Line, for example, could become a multiplayer hit:

Alongside Cloud Anchors, ARCore will also soon receive better plane detection, so that even more walls and surfaces can be recognized as play areas. Objects will also be able to serve as symbolic markers, for example to augment the packaging of your game:

(Source: Road to VR)

The post Google I/O 2018: Cloud Anchors and AR Maps first appeared on VR∙Nerds.

Google to Support Multi-user Shared AR Apps with ‘Cloud Anchors’ in ARCore

Today at Google I/O 2018, Google announced upgrades to its ARCore augmented reality technology, including a new feature called Cloud Anchors which will let developers create multi-user applications where devices can share an augmented environment.

ARCore launched earlier this year, bringing AR functionality to 100 million Android smartphones, by Google’s count. The software allows developers to build rich AR experiences for smartphones by leveraging the camera for positional tracking and some environment mapping. Today at Google I/O 2018, the company revealed the latest upgrades to ARCore.

One of the biggest announcements was Cloud Anchors, a new feature which allows developers to connect multiple users together over the web to create multi-user AR experiences that share a synchronized augmented space. Crucially, Cloud Anchors supports both Android and iOS. The company demonstrated the new feature with an experimental version of its Just a Line app, showing users drawing and playing together across multiple devices.

Just a Line will be updated with multiplayer support “in the coming weeks,” according to Google, and will launch on Android and iOS.
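For developers wondering what that flow looks like in practice, here is a minimal sketch of hosting and resolving an anchor with the ARCore Java API, as of the 1.2 SDK. Obtaining a Session and sharing the anchor ID between devices (e.g. via your own backend) are assumed and not shown:

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Anchor.CloudAnchorState;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

public class CloudAnchorSketch {
    // Enable Cloud Anchors on an existing ARCore session.
    static void enableCloudAnchors(Session session) {
        Config config = new Config(session);
        config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);
        session.configure(config);
    }

    // Host: upload a locally created anchor to Google's cloud service.
    static Anchor hostAnchor(Session session, Anchor localAnchor) {
        return session.hostCloudAnchor(localAnchor);
    }

    // Poll once per frame until hosting finishes; on success, the returned
    // ID is what you share with other users (transport is up to the app).
    static String pollForCloudId(Anchor hostedAnchor) {
        CloudAnchorState state = hostedAnchor.getCloudAnchorState();
        if (state == CloudAnchorState.SUCCESS) {
            return hostedAnchor.getCloudAnchorId();
        }
        return null; // still TASK_IN_PROGRESS, or an error state
    }

    // Resolve: a second device turns the shared ID back into a local anchor
    // at the same physical spot, giving both users a common augmented space.
    static Anchor resolveAnchor(Session session, String cloudAnchorId) {
        return session.resolveCloudAnchor(cloudAnchorId);
    }
}
```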

Other upgrades to ARCore include Plane Detection, which allows developers to attach objects more realistically to more surfaces, including textured walls. A new Augmented Images feature will allow images to function as symbolic markers, allowing developers to attach or overlay content onto the image. This could enable artwork and other flat images to come to life as the company shows in this example:

Furthermore Google announced Sceneform for ARCore which aims to help Java developers—who don’t usually work with 3D content—more easily develop AR apps without needing to learn 3D APIs like OpenGL. The company says that Sceneform is highly optimized for mobile—with special attention paid to performance, memory utilization, and file sizes—and can help developers build AR apps from the ground up as well as add AR to existing applications.
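As a rough illustration of the boilerplate Sceneform is meant to remove, here’s a sketch of its tap-to-place pattern in Java. Assume the ArFragment comes from the app’s layout, and note that the asset name model.sfb is a placeholder:

```java
import android.net.Uri;
import com.google.ar.core.Anchor;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;
import com.google.ar.sceneform.ux.TransformableNode;

public class SceneformSketch {
    // Place a 3D model wherever the user taps a detected plane,
    // with no direct OpenGL code involved.
    static void placeModelOnTap(ArFragment arFragment) {
        arFragment.setOnTapArPlaneListener((hitResult, plane, motionEvent) ->
            ModelRenderable.builder()
                .setSource(arFragment.getContext(), Uri.parse("model.sfb"))
                .build()
                .thenAccept(renderable -> {
                    // Anchor the model to the tapped point in the real world.
                    Anchor anchor = hitResult.createAnchor();
                    AnchorNode anchorNode = new AnchorNode(anchor);
                    anchorNode.setParent(arFragment.getArSceneView().getScene());

                    // TransformableNode adds built-in drag/rotate/scale gestures.
                    TransformableNode node =
                            new TransformableNode(arFragment.getTransformationSystem());
                    node.setRenderable(renderable);
                    node.setParent(anchorNode);
                }));
    }
}
```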

Google says that developers can begin working with these new features today, and find supporting documentation on the ARCore developer website.

The post Google to Support Multi-user Shared AR Apps with ‘Cloud Anchors’ in ARCore appeared first on Road to VR.

Google Teases AR Maps Integration to Help You Navigate By Sight

Aparna Chennapragada, VP of Product for AR/VR at Google, took to the stage at Google’s I/O developer conference today to tease some of the work the company is doing to integrate augmented reality into Google Maps.

Chennapragada says the team has combined the smartphone’s camera, computer vision, Street View, and Maps to “reimagine walking navigation,” essentially letting you view navigational cues through capabilities already built into Google’s camera app thanks to ARCore.

Chennapragada presented a few possible use-cases, including a simple walking navigation scenario replete with blinking navigational arrows superimposed on the physical world.

Image courtesy Google

Teasing a bit more, Chennapragada mentioned other possible features including integration of landmark recognition, and even a little fox-buddy to help lead the way.

Image courtesy Google

“Enabling these kinds of experiences though, GPS alone doesn’t cut it. So that’s why we’ve been working on VPS – visual positioning system, that can estimate precise positioning and orientation,” she said.

First revealed at last year’s I/O, VPS is said to use visual features of the physical world to position you within it more precisely. While not specifically mentioned during the presentation, it was previously touted for its ability to take you where GPS can’t, i.e. out of satellite range, with the ability to give you turn-by-turn directions indoors.

While Google tiptoed around any specific announcement of when such an AR Maps feature could be coming, the demo was certainly a promising step in a decidedly augmented direction.

The post Google Teases AR Maps Integration to Help You Navigate By Sight appeared first on Road to VR.

Oculus Go vs. Lenovo Mirage Solo: Which Is The Better Buy?

This weekend I placed four VR headsets out on a table: Daydream View, Gear VR, Mirage Solo and Oculus Go.

The Pixel 2 started dropping frames, so I retired Daydream View without anyone really playing with it. A family member visited Pennywise’s house in Gear VR, but a pop-up sat between the player and his world the whole time, and I couldn’t figure out how to dismiss it quickly. Gear VR was the next to be eliminated in this VR battle royale.

Mirage Solo’s two major upgrades over Oculus Go performed well. Chromecast integration made it easy to see what the person in VR sees: when a family member needed help figuring out what to do, it was as simple as looking over at the TV to see what was going on and offering help. The addition of positional tracking in some apps also made me more confident Mirage Solo wouldn’t make anyone sick. I found it almost magical to turn the headset on and immediately have the freedom to move my head around. For two years now I’ve been setting up external tracking equipment for the consumer Oculus Rift and HTC Vive and, finally in 2018, the Mirage Solo does this on its own. But, alas, the Mirage Solo was eventually retired too as I ran out of things I wanted to show people.

The VR battle royale ended with a clear winner: Oculus Go.

Overall, there’s such a dearth of content on the Mirage Solo that there isn’t even a search button. You can find a combination of YouTube videos and available apps from a series of panels hanging in mid-air, and that’s it. There’s also no Google Chrome browser available on the Lenovo Mirage Solo. This, in my view, is a show-stopping omission. Sure, you can get to YouTube videos, but what about the rest of the Web?

And on that point, watching Netflix or surfing the web hands-free with Oculus Go in bed is very relaxing. It’s as if you’ve installed a big-screen TV right on your ceiling, and it even comes with a nice Internet browser. The $250 64GB Oculus Go includes the same amount of storage as the $400 Mirage Solo, though the Solo’s storage can be expanded via microSD card. That’s a nice inclusion alongside the added head tracking and Chromecast integration, but it doesn’t do enough to justify a $150 premium over Oculus Go.

For most buyers, a $400 gadget you don’t use very much is a lot harder to justify than a $200 one. In fact, I don’t think I’d recommend the Mirage Solo even if it were offered at the same price as Oculus Go. Overall, the Mirage Solo’s lack of an included Internet browser is going to make the device less useful than Oculus Go for many people. It’s just too convenient to check a few sites between visits to virtual worlds, and that’s not even accounting for easy access to Web-based worlds through the Oculus browser.

The absence of Chrome is an astonishing omission for a device powered by Google. I take it as a sign the tech giant decided to hold its inclusion until it produces more capable standalone hardware that can run software like Job Simulator and Tilt Brush. For now, though, I think most people will find Oculus Go offers enough content and features to make its occasional use worth the investment. Mirage Solo just doesn’t offer the same value.

Google Open Sources Seurat, a ‘Surface Light-field’ Rendering Tool for 6DOF Mobile VR

Google announced Seurat at last year’s I/O developer conference, showing a brief glimpse into the new rendering technology designed to reduce the complexity of ultra high-quality CGI assets so they can run in real-time on mobile processors. Now, the company is open sourcing Seurat so developers can customize the tool and use it for their own mobile VR projects.

“Seurat works by taking advantage of the fact that VR scenes are typically viewed from within a limited viewing region, and leverages this to optimize the geometry and textures in your scene,” Google Software Engineer Manfred Ernst explains in a developer blogpost. “It takes RGBD images (color and depth) as input and generates a textured mesh, targeting a configurable number of triangles, texture size, and fill rate, to simplify scenes beyond what traditional methods can achieve.”

Blade Runner: Revelations, which launched last week alongside Google’s first 6DOF Daydream headset, the Lenovo Mirage Solo, takes advantage of Seurat to pretty impressive effect. Developer studio Seismic Games used the rendering tech to bring a scene of 46.6 million triangles down to only 307,000, “improving performance by more than 100x with almost no loss in visual quality,” Google says.

Here’s a quick clip of the finished scene:

To accomplish this, Seurat uses what the company calls ‘surface light-fields’: the process takes the original ultra-high-quality assets, defines a viewing area for the player, then samples possible perspectives within that area to determine everything that could possibly be seen from inside it.
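To make the idea concrete, here’s a toy sketch of the core trick (not Seurat’s actual algorithm): sample candidate eye positions on a grid inside the player’s viewing box, and discard any surface patch that cannot face at least one of them:

```java
import java.util.ArrayList;
import java.util.List;

public class HeadboxCulling {
    // A surface patch: world position plus outward unit normal.
    static class Patch {
        final float[] pos, normal;
        Patch(float[] pos, float[] normal) { this.pos = pos; this.normal = normal; }
    }

    // Keep only patches that face at least one sampled eye position inside
    // the viewing box; back-facing patches can never be seen from within
    // the box, so they can be stripped. (steps must be >= 2.)
    static List<Patch> cullForHeadbox(List<Patch> scene, float[] boxMin,
                                      float[] boxMax, int steps) {
        List<Patch> kept = new ArrayList<>();
        for (Patch p : scene) {
            boolean visible = false;
            for (int i = 0; i < steps && !visible; i++)
                for (int j = 0; j < steps && !visible; j++)
                    for (int k = 0; k < steps && !visible; k++) {
                        float[] eye = {
                                lerp(boxMin[0], boxMax[0], i / (steps - 1f)),
                                lerp(boxMin[1], boxMax[1], j / (steps - 1f)),
                                lerp(boxMin[2], boxMax[2], k / (steps - 1f)) };
                        // The patch faces this eye if the eye sits on its front side.
                        float dot = 0;
                        for (int a = 0; a < 3; a++)
                            dot += (eye[a] - p.pos[a]) * p.normal[a];
                        visible = dot > 0;
                    }
            if (visible) kept.add(p);
        }
        return kept;
    }

    static float lerp(float a, float b, float t) { return a + (b - a) * t; }
}
```

Seurat’s real pipeline goes much further, also accounting for occlusion between samples and baking the surviving geometry into a compact textured mesh, but the limited viewing region is what makes such aggressive simplification possible.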

This is largely useful for developers looking to create 6DOF experiences on mobile hardware, as the user can view the scene from several perspectives. A major benefit, the company said last year, is the ability to add perspective-correct specular lighting, which adds a level of realism usually considered impossible within a mobile processor’s modest compute budget.

Google has now released Seurat on GitHub, including documentation and source code for prospective developers.

Below you can see an image with and without Seurat:

The post Google Open Sources Seurat, a ‘Surface Light-field’ Rendering Tool for 6DOF Mobile VR appeared first on Road to VR.

Google’s Revolutionary Rendering Tool Seurat Goes Open Source

It promises cinema-quality graphics on relatively weak hardware: almost a year ago, Google unveiled Seurat, and since then things had gone rather quiet around the rendering technology. It targets mobile headsets like the newly released Lenovo Mirage Solo. So that more developers can use Seurat, Google has now released the renderer as open source.

Seurat Goes Open Source

Seurat caused a stir last year when Google first presented it at its I/O 2017 developer conference. The rendering approach makes it possible to cut render times many times over: a scene from the Star Wars title Rogue One, for example, is said to take just 13 milliseconds instead of an hour, with the polygon count dropping from around 50 million to a mere 72,000.

To achieve this, developers specify where the user will be located in the virtual environment. Seurat then computes the 3D models relative to that position, which substantially reduces scene complexity. The computational effort is apparently still enormous, however, which is why scenes are precomputed before they make their way into a VR title. How long these 3D model computations take, Google has not yet revealed.

On GitHub, Google now provides the code for the rendering approach as open source, along with documentation and tools for Windows and Linux, so developers can get started right away. Given that standalone headsets like the Mirage Solo, which is based on Google’s ecosystem, have now been released, this is surely a welcome step. While little has been heard about the engine in recent months, it was used in Blade Runner: Revelations, for example; so far the VR title has been released exclusively for the Mirage Solo/Google Daydream.

(Source: VR Focus)

The post Google’s Revolutionary Rendering Tool Seurat Goes Open Source first appeared on VR∙Nerds.