Lenovo Mirage Solo Features Major Upgrades Over Oculus Go

Leaning around in Oculus Go is not fun. The urge to try anyway can be strong, as happened to me when Games Editor David Jagneaux was showing me photos in his virtual loft. There was a photo he had placed on the far wall I couldn’t see because his head blocked it. Facebook’s Oculus Go doesn’t register leaning movement and, because he was in Oculus Go too, David couldn’t simply lean out of the way. At best, this limitation of the $200 headset is an inconvenience, but at worst it might make some users more likely to experience simulator sickness.

The Lenovo Mirage Solo is not held back by limited head movement. The first standalone headset powered by Google's technology offers six degrees of freedom (6DoF) of head tracking. We should get our hands on the device this weekend and I'll have first impressions up soon after, but in the meantime we dug through the reviewer's guide for the $400 standalone and confirmed it includes a number of features likely to represent a major step up compared with the Facebook headset.

Here’s a look at some of the other things we plan to test and compare to Oculus Go as soon as we get our hands on the headset:

Chromecast Mirroring

Second-screen viewing is a major omission at Oculus Go's launch, though the company is likely to add it. That feature, however, successfully made the jump from Daydream View to the Daydream OS included with the Mirage Solo.

This inclusion means that if you've got a Chromecast, it should be super easy for Mirage Solo players to mirror what they see in VR to a nearby TV. This opens up lots of opportunities to introduce VR to first-timers or enjoy it together as a group.

SD Card Storage Expansion

I ended up spending around $273 after tax on Oculus Go for the 64GB version because I didn’t want to find myself limited in how much I could install on the headset. I could expect to spend around $428 for Mirage Solo with 64GB of included storage plus room for a micro SD card that supports up to 256 GB more.

For some, lack of SD card support in Oculus Go could be the deciding factor in not buying the system.

Easier Screenshots and Video

Taking screenshots and video on Oculus Go can be done by accessing a menu, and I've had mixed results trying to activate the feature. The reviewer's guide for the Mirage Solo, though, lists a simple controller shortcut for recording video and taking screenshots. That's pretty convenient.

Specifications

What matters is the overall experience, and we can't speak to that until we've spent significant time with Mirage Solo, but the specifications outline a device that appears to be a step above Go in some respects. One exception: the Solo apparently lacks the integrated audio found on Oculus Go. Here's a look at the specifications:

Processor: Qualcomm Snapdragon 835
Memory: 4GB LPDDR4
Storage: 64GB UFS (Micro-SD up to 256GB)
Operating System: Daydream OS
Display: 5.5” QHD Display (2560×1440), IPS, 70% Color Gamut, 75Hz (Dual Fresnel-Aspheric, 110° FOV)
Camera: WorldSense Tracking Camera, Lite-On 6BF 11238
Battery: 4000 mAh (2.5 hours continuous use)
Dimensions: 8.03″ x 10.61″ x 7.08″ (204.01 x 269.5 x 179.86 mm), 1.42 lbs (645 g)
Microphone: 2x
Audio: 3.5mm Headphone Jack
Power: USB Type-C

First Google Doodle for Virtual Reality

The Google Doodles on the search engine's homepage are hugely popular, each illustrating the topic of the day. Some interactive Doodles even invite you to play, and now Google has released its first VR Doodle. The subject of Back to the Moon is an absolute classic, and the experience is available for smartphones and PC headsets as well as a 360-degree video.

VR Google Doodle: Back to the Moon

Several talented teams were involved in the current VR Google Doodle, titled Back to the Moon: Google Spotlight Stories, Nexus Studios, Google Doodle and Google Arts and Culture. Together they have created the first Doodle experience in 360 degrees and VR. The subject is a perfect fit, as it centers on the French visionary Georges Méliès. Born in 1861, the illusionist revolutionized film and is credited, among other things, with inventing the stop trick.

His best-known work appeared in 1902, loosely based on Jules Verne's famous novel From the Earth to the Moon, and is regarded as the first science-fiction film in cinema history. Google is now dedicating a Doodle to the pioneer, which can be viewed in 360 degrees on YouTube, for example. It gets even better, though: a page dedicated to the Doodle provides links to the VR Google Doodle. The search engine operator offers apps for Android and iOS, and the experience can also be downloaded free of charge via Steam for Oculus Rift and HTC Vive. Interestingly, Google also supplies HTC's Viveport. Anyone who wants to learn more about the background of the project will find further reading on the Google Doodle page as well.

Google Announces first VR Google Doodle

The Google Doodle has become one of Google's most beloved traditions, celebrating accomplishments and events both modern and historical. As Google moves into the immersive technology space, it has announced its first virtual reality (VR) Google Doodle, called Back To The Moon.

The interactive VR Google Doodle has been created to celebrate the life and work of French illusionist and film director Georges Méliès.

Georges Méliès is known to fans of science fiction and early cinema as one of the pioneers of sci-fi and fantasy, and one of the first filmmakers to experiment with what we would now call visual effects or special effects.

Méliès was the creator of the iconic film A Trip To The Moon, which features the image of a spaceship striking the eye of the Moon, an image that remains instantly recognizable to this day. The Google Doodle draws upon this work to create an interactive VR piece that pays homage to his films.

The Back To The Moon Google Doodle expands into a short film that references several of Méliès' works, from early experiments such as The One-Man Band, which used pioneering special effects to let Méliès appear to play multiple instruments simultaneously, to The Impossible Voyage, in which Méliès plays an explorer who travels the world, the oceans and even to the sun.

Hundreds of filmmakers have been influenced by the work of Méliès, who was known as a trailblazer and pioneer in his own time and experimented with new and cutting-edge technology. It therefore seems thematically appropriate for his tribute to arrive in a cutting-edge medium.

The Doodle was created as a collaboration between Google Spotlight Stories, Nexus Studios, Google Doodle and Google Arts and Culture. The Google Doodle can be found on the Google homepage, or you can check out the video below for the full story.

For further news on VR content from Google and other upcoming immersive projects, keep checking back with VRFocus.

Google Demonstrates Promising Low-cost, Mobile Inside-out Controller Tracking

A number of standalone VR headsets will be hitting the market in 2018, but so far none of them offer positional (AKA 6DOF) controller input, one of the defining features of high-end tethered headsets. But we could see that change in the near future, thanks to research from Google which details a system for low-cost, mobile inside-out VR controller tracking.

The first standalone VR headsets offering inside-out positional head tracking are soon to hit the market: the Lenovo Mirage Solo (part of Google’s Daydream ecosystem), and HTC Vive Focus. But both headsets have controllers which track rotation only, meaning that hand input is limited to more abstract and less immersive movements.

Detailed in a research paper (first spotted by Dimitri Diakopoulos), Google says that the reason behind the lack of 6DOF controller tracking on many standalone headsets is hardware expense, computational cost, and occlusion issues. The paper, titled Egocentric 6-DoF Tracking of Small Handheld Objects, goes on to demonstrate a computer-vision based 6DOF controller tracking approach which works without active markers.

Authors Rohit Pandey, Pavel Pidlypenskyi, Shuoran Yang, and Christine Kaeser-Chen, all from Google, write, “Our key observation is that users’ hands and arms provide excellent context for where the controller is in the image, and are robust cues even when the controller itself might be occluded. To simplify the system, we use the same cameras for headset 6-DoF pose tracking on mobile HMDs as our input. In our experiments, they are a pair of stereo monochrome fisheye cameras. We do not require additional markers or hardware beyond a standard IMU based controller.”

The authors say that the method can unlock positional tracking for simple IMU-based controllers (like Daydream’s), and they believe it could one day be extended to controller-less hand-tracking as well.

Inside-out controller tracking approaches like Oculus' Santa Cruz use cameras to look for IR LED markers hidden inside the controllers, and then compare the shape of the markers to a known shape to solve for the position of the controller. Google's approach instead aims to infer the position of the controller by looking at the user's arms and hands rather than glowing markers.

To do this, they captured a large dataset of images from the headset’s perspective, which show what it looks like when a user holds the controller in a certain way. Then they trained a neural network—a self-optimizing program—to look at those images and make guesses about the position of the controller. After learning from the dataset, the algorithm can use what it knows to infer the position of the controller from brand new images fed in from the headset in real time. IMU data from the controller is fused with the algorithm’s positional determination to improve accuracy.
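
To make the idea concrete, here is a minimal, hypothetical sketch of that per-frame loop in Python: a vision model estimates the controller's 3D position from a stereo pair, and a simple complementary blend folds in a crude IMU dead-reckoning estimate. The names keypoint_model, camera and imu are placeholders rather than Google's API, and the blend is only a stand-in for the fusion step the paper describes.

    # Hypothetical sketch of the per-frame tracking loop described above.
    # keypoint_model, camera and imu are placeholder objects, not Google's API,
    # and the complementary blend stands in for the paper's actual fusion step.
    import numpy as np

    ALPHA = 0.9  # weight given to the vision estimate when blending

    def fuse_position(vision_pos, imu_pos, alpha=ALPHA):
        # Simple complementary blend of two 3D position estimates.
        return alpha * np.asarray(vision_pos) + (1.0 - alpha) * np.asarray(imu_pos)

    def track_controller(keypoint_model, camera, imu, dt=1.0 / 30.0):
        # Frame-by-frame 6DoF estimate: vision supplies position, the IMU supplies
        # rotation plus a dead-reckoned position guess between camera frames.
        position = np.zeros(3)
        velocity = np.zeros(3)
        while True:
            left, right = camera.read_stereo_pair()           # monochrome fisheye pair
            vision_pos = keypoint_model.predict(left, right)  # 3D keypoint from the network
            accel, orientation = imu.read()                   # controller IMU sample
            velocity += np.asarray(accel) * dt                # crude integration
            imu_pos = position + velocity * dt
            position = fuse_position(vision_pos, imu_pos)
            yield position, orientation                       # 6DoF pose for this frame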

Image courtesy Google

A video, which has since been removed, showed the view from the headset’s camera, with a user waving what looked like a Daydream controller around in front of it. Overlaid onto the image was a symbol marking the position of the controller, which impressively managed to follow the controller as the user moved their hand, even when the controller itself was completely blocked by the user’s arm.

Image courtesy Google

To test the accuracy of their system, the authors captured the controller's precise location using a commercial outside-in tracking system, then compared it to the results of their computer-vision tracking system. They found a "mean average error of 33.5 millimeters in 3D keypoint prediction" (a little more than one inch). Their system runs at 30FPS on a "single mobile CPU core," making it practical for use in mobile VR hardware, the authors say.
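
For reference, a figure like that is presumably just the average Euclidean distance between predicted and ground-truth keypoints. Here is a small sketch of how such a number could be computed; the array values below are made up purely for illustration.

    import numpy as np

    def mean_keypoint_error_mm(predicted, ground_truth):
        # Mean Euclidean distance, in millimeters, between predicted and
        # reference 3D keypoint positions.
        predicted = np.asarray(predicted, dtype=float)
        ground_truth = np.asarray(ground_truth, dtype=float)
        return float(np.linalg.norm(predicted - ground_truth, axis=-1).mean())

    # Two made-up predictions, each off by 33.5 mm along one axis.
    pred = [[0.0, 0.0, 33.5], [100.0, 33.5, 0.0]]
    truth = [[0.0, 0.0, 0.0], [100.0, 0.0, 0.0]]
    print(mean_keypoint_error_mm(pred, truth))  # prints 33.5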

Image courtesy Google

And there are still improvements to be made. Interpolation between frames is suggested as a next step and could significantly speed up tracking, as the current model predicts position on a frame-by-frame basis rather than sharing information between frames, the team writes.

As for the dataset which Google used to train the algorithm, the company plans to make it publicly available, allowing other teams to train their own neural networks in an effort to improve the tracking system. The authors believe the dataset is the largest of its kind, consisting of some 547,000 stereo image pairs, each labeled with the precise 6DOF position of the controller. The dataset was compiled from 20 different users performing 13 different movements in various lighting conditions, they said.

We expect to hear more about this work, and the availability of the dataset, around Google’s annual I/O developer conference, hosted this year May 8th–10th.

Google Adds Oculus Rift Support To Google Chrome For VR Web Browsing

If you are a Google Chrome user, you will be happy to know that the latest version of the web browser now supports the Oculus Rift headset, meaning users can now browse the web in virtual reality (VR).

This means that users with an Oculus Rift headset and Google Chrome version 66 can turn on the new feature and enjoy a complete VR web browsing experience. The feature was found by a user on Reddit who noticed an option to turn on 'Oculus hardware support' in the "Experiments" section of Google Chrome. It has since been confirmed to work by a number of people and has been met with praise.

Until now, VR web browsing experiences have been mostly limited to the mobile version of Google Chrome, taking advantage of 360-degree videos and content. Google said last year that it was planning to make it possible to experience VR content without needing to move to a separate app on mobile devices, ensuring that content could be enjoyed within Google Chrome. This became a reality when Google released the Cardboard platform and later the Google Daydream platform, which gave mobile users the chance to enjoy immersive VR content on their devices.

Now, with support for Oculus Rift built into the Windows 10 version of Google Chrome, users have a new way to immerse themselves in a high-end VR solution. Though most of the internet is not designed for VR viewing, a fair amount of websites and content is available, and it is growing as time passes. Of course, with Google and third parties focusing on the mobile applications of VR, things have been slow to kick off on the desktop side, but with this support now in Google Chrome that could be about to change.

Support for Oculus Rift in Google Chrome is another step towards bringing the WebVR standard to browsers, allowing developers to deploy VR applications over the web without the need for any downloads. As Google continues to focus on bringing this to reality within Google Chrome, users are sure to see some exciting developments in the coming months. VRFocus will bring you all the latest on Google Chrome's VR movements in the future, so stay tuned for more.

Chrome Gets WebVR Support for All Major VR Headsets

Google has now released update 66 and, without making a big fuss about it, has given the Chrome browser WebVR integration. Fortunately, Google is being quite open about which headsets are supported, so you can use the new feature with HTC Vive, Oculus Rift, and Windows Mixed Reality headsets.

While an integration has long been available on Android, WebVR support for PC headsets now finally follows. If you want to enable the feature on Windows 10, simply enter chrome://flags in the browser's address bar, search for WebVR and OpenVR hardware support using CTRL+F or the search field, and enable those functions in Chrome. Oculus Rift users can additionally enable Oculus hardware support so that no detour through SteamVR is necessary.

Finding WebVR content for the PC is a bit trickier at the moment, however, and the selection is still limited. Road to VR has put together a few suggestions.

The advantage of WebVR content is that it does not need to be downloaded and runs directly in the browser. In return, you have to accept some compromises in visual fidelity. Nevertheless, the technology could be particularly interesting for smaller presentations, simple content, or platforms.

(Source: Road to VR)

Google: Open Heritage Project to Preserve Endangered Cultural Treasures

Google announced a collaboration with the non-profit organization CyArk to digitally preserve ancient cultural sites and world wonders threatened by collapse or destruction as 3D models and conserve them for posterity. This way, future generations will be able to re-experience a perhaps vanished piece of history in virtual form.

Google and CyArk: A Project for the Digital Preservation of Historic Cultural Sites

A collaboration between Google Arts & Culture and the 3D laser scanning organization CyArk has produced a project for the digital preservation of endangered world heritage sites. War, tourism, and natural disasters are often the reasons these monumental structures are destroyed, and the ravages of time also continually gnaw at their stability.

The Open Heritage Project uses CyArk's scanning technology to gather detailed information about imposing world wonders, ancient cultural sites, and monumental structures. The data is captured with millimeter precision and then rendered into VR creations and catalogued. The non-profit organization has been building up its archive for quite some time; now visitors get the chance to examine the 3D models closely in virtual reality and in the browser. The precise analysis also produces a practical side effect: by capturing the real-world counterparts exactly, damaged or at-risk areas can be identified, so that restoration work can begin.

The digital archive currently contains a total of 25 explorable sites from 18 different countries. The impressive captures are accompanied by various audio files and texts that provide historical background and further interesting information. They can be viewed either directly in the browser on the official website or with the app for iOS and Android. To visit the structures in a virtual environment, you can use Google Poly with a Google Daydream headset. The project also makes the datasets collected so far available for download free of charge; all you need to do is fill out a form.

(Sources: Upload VR | VR Scout | Open Heritage Project | Video: Google Arts & Culture YouTube)

Google: Developing an AR Microscope for Cancer Detection

The ongoing advance of AR and VR technology offers enormous potential not only for entertainment but is also proving genuinely beneficial for our health. A research team at Google recently unveiled the prototype of an AR microscope that is able to detect cancer cells in real time. The results promise not only greater accuracy but also significantly faster diagnosis of diseases, and could therefore help save lives.

Google: An AR Microscope for Early Cancer Detection

At the American Association for Cancer Research (AACR) meeting in Chicago (USA), Google presented its latest prototype of an AR microscope that identifies cancer cells in real time thanks to intelligent algorithms.

The light microscope, modified with a camera, is connected to a computer that analyzes the cells on the slide and displays the results directly in the field of view. The algorithms then highlight tumors clearly and visibly. In practice, analyzing cells is a laborious and lengthy process that consumes huge amounts of physicians' time and resources. The new technology could massively reduce this effort and thereby enable earlier treatment for patients.
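
As a rough illustration of that capture, analyze, and overlay pipeline, the following Python sketch grabs frames from a camera, runs a hypothetical tumor_model over each one, and outlines the flagged regions. It is not Google's implementation: the model is a placeholder, and a desktop window stands in for the in-eyepiece overlay.

    # Hypothetical capture -> classify -> overlay loop for an AR microscope feed.
    # tumor_model is a placeholder for a trained segmentation network; this is an
    # illustration of the pipeline described above, not Google's implementation.
    import cv2
    import numpy as np

    def overlay_predictions(frame, heatmap, threshold=0.5):
        # Outline the regions the model flags as likely tumor tissue.
        mask = (heatmap > threshold).astype(np.uint8) * 255
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        cv2.drawContours(frame, contours, -1, (0, 255, 0), 2)  # green outlines
        return frame

    def run_ar_microscope(tumor_model, camera_index=0):
        cap = cv2.VideoCapture(camera_index)  # camera mounted on the microscope
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            heatmap = tumor_model.predict(frame)        # per-pixel tumor probability
            annotated = overlay_predictions(frame, heatmap)
            cv2.imshow("AR microscope", annotated)      # stand-in for the eyepiece overlay
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
        cap.release()
        cv2.destroyAllWindows()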

Image: Google's AR cancer-detection microscope prototype

Currently the AR microscope is able to detect breast and prostate cancer, but in future it is expected to identify other cancer cells as well thanks to an intelligent learning process. To this end, a large team of pathologists is working together to further develop the algorithm with images of cancer cells. The AR modification should also be retrofittable to older light microscopes without incurring large costs.

Beyond tumor detection, the new device is said to have the potential to also contribute to the diagnosis of infectious diseases such as tuberculosis and malaria in the future, especially in developing countries.

The AR microscope should therefore not only provide greater accuracy and faster results in diagnosis but also recognize a wide range of different diseases. This would considerably ease physicians' work and could save lives in the future.

(Sources: Google Research Blog | VR Focus | Video: Google YouTube)

Google Is Using VR To Virtually Preserve Endangered Heritage Sites

We've heard about a lot of companies that are using VR to preserve history that's at risk of being destroyed, if it hasn't been already. Just last month, for example, we wrote about Perpetuity | Palmyra, a VR project attempting to virtually restore a Syrian city ravaged by ISIS. Now, though, Google is upping the scale of these restoration efforts with its Arts and Culture app.

This week the search engine giant announced a partnership with CyArk, a 3D laser scanning nonprofit that’s using its technology to digitally capture and recreate historical sites around the world. Together the pair has launched the Open Heritage Project, which allows people to explore these locations through PCs and smartphones.

The project already consists of a library of 3D models, 360-degree photos and traditional media for you to explore. However, the headline piece of the collection right now is a virtual tour of the temples of Bagan, Myanmar, which were damaged in an earthquake two years ago. The tour goes far beyond other Arts and Culture content, allowing you to walk around a virtually recreated temple and inspect the site in close detail.

CyArk’s 3D scanning produces highly realistic results, and virtual sites are littered with audio clips that teach you more about the sites you’re exploring.

Where does VR fit into all of this? You can experience this media inside a Google Daydream View headset, bringing you closer to the history. It uses a combination of native app support and WebVR content. The platform also leverages Google’s Poly service, which stores 3D assets that you can quickly view in VR.

Google Partnership Makes It Possible To Visit Ancient Monuments In VR

Google is partnering with non-profit CyArk so anyone can visit ancient monuments and temples, many of which are at risk of being damaged, or, in some cases, have already been devastated.

One example available now is a temple in Bagan, Myanmar, that was damaged in a 2016 earthquake. CyArk is building an extensive 3D record of places from the real world, with particular focus on locations at risk of being destroyed. In this case, CyArk scanned the site prior to that earthquake, allowing it to be revisited in VR as it was prior to the damage.

The group uses a variety of techniques to capture the scenes, with 3D models available for some to fully explore in VR. Other locations in the Internet-powered exhibition include Al Azem Palace in Damascus, Syria, and the Mayan city of Chichen Itza in Mexico.

Google’s Arts & Culture efforts previously included bringing the collections of various museums online, but the partnership with CyArk takes it further by letting folks actually visit historic sites without leaving home. In addition, the effort seems to push forward Google’s broader work with the 3D object repository Poly, moving the service from simple objects to the high-end business of reality reconstruction.

You can check out the “Open Heritage” project online or download the apps for iOS and Android.
