AR Content is Coming to Google Maps, But It Won’t Matter Until There’s a Headset to See it Through

Google today announced it’s starting a pilot program that will soon allow select partners to create AR content and display it within Google Maps. While it seems like an important step for Google on the way to owning a piece of the ‘all-day AR glasses’ future, it’s unclear just where it’s all headed for the company in the near term. Because compared to Meta and Apple, Google still seems unable to commit to a coherent XR strategy.

Starting in Singapore and Paris later this year, Google is essentially elevating content built in its Geospatial Creator platform to the world stage. Select partners will soon be able to publish AR content anchored to physical landmarks via Google Maps, viewable through both Lens and Street View.
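
For developers, anchoring content to a landmark generally means pinning it to geographic coordinates rather than to a device-relative origin. Google hasn't detailed how the Maps pilot is wired up, so the minimal Kotlin sketch below only illustrates the general technique using ARCore's Geospatial API; the function name and coordinates are placeholders.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Sketch: pin a virtual object to geographic coordinates so it stays attached
// to a landmark no matter where the user starts their AR session.
fun anchorContentToLandmark(session: Session): Anchor? {
    // Enable geospatial tracking (GPS plus visual positioning) on the ARCore session.
    val config = Config(session).apply {
        geospatialMode = Config.GeospatialMode.ENABLED
    }
    session.configure(config)

    // In a real app you would wait until Earth tracking is established.
    val earth = session.earth ?: return null
    if (earth.trackingState != TrackingState.TRACKING) return null

    // Placeholder coordinates (roughly the Merlion in Singapore) and altitude.
    val latitude = 1.2868
    val longitude = 103.8545
    val altitudeMeters = 10.0

    // Identity quaternion (qx, qy, qz, qw): the content keeps its default orientation.
    return earth.createAnchor(latitude, longitude, altitudeMeters, 0f, 0f, 0f, 1f)
}
```

The returned anchor's pose is what a renderer would then use to draw whatever 3D asset a partner has published at that spot.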

The hope, it seems, is to get mobile users engaged with AR content by having them search for a location in Google Maps and hold their phone up at landmarks, shops, etc. Some of the examples seen in the video below include cultural and historical content, but also virtual billboards for private businesses, presenting something of a low-poly Blade Runner vibe.

It’s a pretty neat showcase for tourist boards to get behind, and a cool Easter egg for Google Maps users too, but it’s difficult to imagine it will ever be more than that, at least on mobile devices.

While we use our phones for everything, mobile AR applications are neither as immersive as the promo video suggests, nor additive enough yet to really engage with for any meaningful amount of time before the glass rectangle goes back in your pocket or bag. That's why so many companies are pinning their hopes on functional AR glasses for all-day use; they would remove that frictional boundary and put the AR layer much closer to the forefront for both users and the advertisers trying to reach them.

And as you'd imagine, there was little in the way of XR at Google's I/O developer conference this year—unfortunately expected after the company canned its Project Iris AR glasses last summer, a period that also saw the resignations of top leadership, including AR & VR chief Clay Bavor and head of XR operating systems Mark Lucovsky.

At the time, Lucovsky maintained in an X post his departure was heavily influenced by “changes in AR leadership and Google’s unstable commitment and vision.”

That's not to say Google isn't doing XR stuff, but it all still feels like it's following the company's usual brand of scattershot Darwinism. We heard about more incremental updates to ARCore, its developer platform for building AR experiences, which was initially released in 2017. We heard about how its light field video chatting tech, Project Starline, will soon become an actual product.

We also got a quick glimpse of a very Project Iris-style device in a video (seen below), which the company simply calls “a prototype glasses device.”

The demo was more about highlighting the company’s research in computer vision and AI assistants with Project Astra though, as there’s no word on what those glasses are beyond that description. Given what we saw, it appears the device is more like a pair of Google Glass-style smartglasses than AR glasses as such. Learn more about the difference here.

The short of it: smartglasses can do things like feed you AI assistant responses, play music, and show you static information, i.e. not spatial data like 3D models that blend naturally with the physical landscape. That would require significantly more compute, battery, and more powerful optics than those prototype glasses could hope to provide, which means no interactive maps or a more immersive version of Pokémon Go either.

Most of all, we're still waiting to hear about the Samsung and Google partnership, which could bring a Vision Pro competitor from Samsung and, more importantly, Google's next big stab at launching an Android-based XR operating system following its now-defunct Daydream platform.

Top 8 Uses for Augmented Reality

Augmented reality (AR) is a technology with a dizzying range of potential applications. And as new and more powerful AR hardware enters the market (such as Apple’s mooted glasses), we’re likely to see even more uses for AR. 

That’s not to say that AR, as it exists today, is any slouch, and to prove it we’re looking at eight of the best uses for augmented reality.

Virtual try-ons

The retail industry has been one of the most prominent embracers of AR technology over at least the past decade. Most of the industry’s biggest brands offer some form of the technology, which allows prospective buyers to see how a product would look on them without needing to physically try it on, usually utilising the ubiquitous phone camera to display the virtual elements in real-time.

Prominent virtual try-on examples include make-up from Maybelline, clothing from ASOS and Zeekit, and shoes from Vyking.

Vyking AR Shoes
Image Credit: Vyking

Gaming

Augmented Reality has found a natural home in the gaming industry, where it has powered some huge mobile game successes including Pokemon Go and Pikmin Bloom, both from developer Niantic.

Pokemon Go in particular was a smash hit, peaking at over 250 million players per month on the back of an experience that transported the gameplay of the popular Pokemon video game series to real-world locations. That built on work the developer had done in its previous game Ingress, which allowed players to use their mobile phones to interact with virtual portals appearing in real-world locations as part of its science fiction story.

Construction

AR is a key tool in the construction industry, from the design stage right through to the actual building process. For architecture, numerous tools exist to aid in the visualisation of spaces, such as The Wild, which allows designers to view 3D models in both virtual and augmented reality.

On the building side of the equation, AR has uses ranging from training workers on safety to progress capture and tracking functionality that directly compares real-world sites with virtual models in real time to ensure construction isn't deviating from the plan.

VisualLive
Image credit: VisualLive

Surgery

The high-stakes field of surgery is being revolutionised by augmented reality technology, which can overlay vital information onto a surgeon's field of view as they work. Mixed reality headsets such as the Microsoft HoloLens 2 allow surgeons to operate on patients more effectively, blending the real world with projections of computed tomography (CT) and magnetic resonance imaging (MRI) scans of the patient.

Holographic representations of the area being operated on can also be examined in 3D before surgery takes place to ensure the surgeon has full familiarity with the area they are working on. To find out more about the role of AR in healthcare, read our article on the subject.

Navigation

The tricky business of finding your way around busy spaces has been much improved with the help of AR. One example is the Live View feature offered by Google Maps, which takes existing data from the map app and overlays it on the camera's view of the real world with help from your phone's GPS capabilities.

Individual locations have also explored using augmented reality to help guide visitors. Gatwick Airport, for example, installed navigational beacons back in 2018 to guide passengers through the terminal, all accessed via a smartphone app.

Google Maps AR / Google Lens
Image credit: Google

Education

From a school setting to on-the-job training, AR can be used to help learners safely interact with materials they would otherwise not be able to gain access to, all while remaining in a familiar setting. Google debuted augmented reality search during the COVID-19 pandemic to help people learn by placing virtual objects such as spacesuits and animals into real-world locations. A host of apps exist to bring similar objects into a classroom setting, including the Merge Cube, which adds tactility to the experience.

Energy giants such as Shell, meanwhile, are using AR to educate workers in the field: remote experts can see through a worker's eyes and even draw on the screen of the augmented reality display the worker is using, boosting safety when interacting with potentially dangerous heavy oil and gas equipment.

Design

Designers at all levels are making use of AR to preview how a space will look before any changes are made physically, from those designing individual rooms all the way up to those planning cities.

Non-professionals can also make use of augmented reality to aid in their designs. Just one example is furniture store IKEA's IKEA Place app, which allows users to place 3D models of the company's products into their own rooms to preview how they would look, automatically scaling them based on the room's dimensions to ensure they are true to life.

IKEA Place AR app. Image credit: IKEA

Manufacturing

AR is one of the key pillars underpinning the phenomenon of Industry 4.0, alongside technologies such as machine learning and big data. Consultancy PwC has estimated that industrial manufacturing and design is one of the biggest potential areas for augmented and virtual reality, with their use in the industry having the potential to deliver a $360bn GDP boost by 2030.

As a result, examples of the technology in action in manufacturing are easy to come by. One example is Boeing's use of augmented reality to give technicians real-time, hands-free, interactive 3D wiring diagrams. Lockheed Martin also utilised augmented reality in the creation of NASA's Orion spacecraft, overlaying information to help with mission-critical procedures such as precisely aligning fasteners.

Wooorld: Multiplayer Google Earth for Meta Quest

Wooorld will soon let you explore the Earth together on your Meta Quest. The developers are using the Google Maps API to make this happen.

Wooorld is currently in beta, and you can sign up for the test here. When the software will see its final release for Meta Quest is still unclear.

What makes Wooorld special is that you can go on a journey together, even in passthrough mode. The field of view is said to be more limited than what we are used to from PC solutions, however. We are still very curious about Wooorld and have signed up for the test right away.

The Meta Quest 2 is still not officially available in Germany, but you can easily order it via Amazon France. You can find our long-term review of the Meta Quest 2 here.

(Source: Road to VR)

Google Maps AR Navigation Launches In Beta For Compatible iOS and Android Devices

The AR feature in Google Maps that Google first demoed more than a year ago — a heads-up mode called Live View that provides directions from phones’ cameras in real time — will roll out in beta in the coming weeks for compatible iOS and Android devices, the company announced. Live View was previously only available to users who were enrolled in the Google Maps beta and were level 5 or above in Google’s Local Guides program, and to owners of Pixel 3a, Pixel 3a XL, and older Pixel smartphones.

As before, Live View will require an ARCore- and ARKit-supported smartphone and will only work in countries where Google Street View is available. Google also warns that Live View will slightly increase Maps’ battery and cellular data consumption.

Tapping the Start AR button kicks off Live View navigation, and holding up the phone reveals arrows, indicators, and a live map that guides you to your final destination. Using a technique known as global localization, Maps gets a phone’s coarse location via GPS, and it uses Street View imagery to narrow down the exact location.

"Live View determines the location of a device based on imagery," explained Google Maps software engineer Tilman Reinhardt in a blog post. "[It] first creates a map by taking a series of images [that] have a known location and analyzing them for key visual features, such as the outline of buildings or bridges, to create a large-scale and fast-searchable index of those visual features." To localize the device, it compares the features in imagery from the phone to those in the index, and it uses machine learning to prioritize "features that are likely to be permanent parts of the scene [while] ignoring things like … dynamic light movement and construction that are likely transient."
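
For a rough sense of how this kind of two-stage global localization can work in principle, here is a minimal, hypothetical Kotlin sketch with invented types; it is not Google's implementation. A coarse GPS fix prunes the search to nearby reference images, and descriptors from the phone's camera frame are then matched against the pre-built index of persistent visual features to pick the best candidate location.

```kotlin
import kotlin.math.asin
import kotlin.math.cos
import kotlin.math.pow
import kotlin.math.sin
import kotlin.math.sqrt

// Hypothetical types for illustration only; not part of any Google API.
data class LatLng(val lat: Double, val lng: Double)
data class ReferenceImage(val location: LatLng, val descriptors: List<DoubleArray>)

// Approximate great-circle distance in metres between two coordinates (haversine).
fun distanceMeters(a: LatLng, b: LatLng): Double {
    val earthRadius = 6_371_000.0
    val dLat = Math.toRadians(b.lat - a.lat)
    val dLng = Math.toRadians(b.lng - a.lng)
    val h = sin(dLat / 2).pow(2) +
        cos(Math.toRadians(a.lat)) * cos(Math.toRadians(b.lat)) * sin(dLng / 2).pow(2)
    return 2 * earthRadius * asin(sqrt(h))
}

// Euclidean distance between two feature descriptors.
fun descriptorDistance(a: DoubleArray, b: DoubleArray): Double {
    var sum = 0.0
    for (i in a.indices) {
        val d = a[i] - b[i]
        sum += d * d
    }
    return sqrt(sum)
}

// Two-stage localization sketch:
// 1. Use the coarse GPS fix to keep only reference images within radiusMeters.
// 2. Score each candidate by how well its stored descriptors match the descriptors
//    from the current camera frame, and return the location of the best match.
fun localize(
    gpsFix: LatLng,
    cameraDescriptors: List<DoubleArray>,
    index: List<ReferenceImage>,
    radiusMeters: Double = 200.0
): LatLng? {
    if (cameraDescriptors.isEmpty()) return null
    val candidates = index.filter { distanceMeters(gpsFix, it.location) <= radiusMeters }
    return candidates.minByOrNull { ref ->
        // Average nearest-descriptor distance: lower means a better visual match.
        cameraDescriptors.sumOf { d -> ref.descriptors.minOf { descriptorDistance(d, it) } } /
            cameraDescriptors.size
    }?.location
}
```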

In other Maps-related news, Google launched a new trip reservations flow that lets you check important flight and hotel info from the Maps app even when you’re offline. (View it by tapping on the Your Places shortcut from Maps’ menu screen and then the Reservations tab.) Reservations are ingested automatically from Gmail and Google Calendar, in addition to other sources within the Google services ecosystem.

Additionally, Timeline — the Maps feature that shows location history and photos in a nifty chronological view, narrowed down by day, month, or year — now allows you to export places you’ve been to in a list format that can be annotated and shared with friends and contacts. New filters for countries and cities are in tow, as well as place categories, such as restaurants, shops, attractions, hotels, and airports.

The new Timeline hits Android devices today, with iOS support promised down the line.

This post by Kyle Wiggers originally appeared on VentureBeat.

Google Maps’ ‘Live View’ AR Feature Available in Beta, Makes Getting Lost Harder

Google may seem to be losing interest in its virtual reality (VR) ventures such as Daydream View, but on the augmented reality (AR) side the company is still pressing forward with gusto. Having released an AR feature for Google Maps earlier this year to Google Maps Local Guides and Google Pixel users, the company has today begun a wider rollout of Live View.

Google Maps Live View

Currently still in beta, the feature is rolling out to compatible iOS and Android devices which support ARKit and ARCore respectively. While the launch happens today, you may not see the update just yet, as it may take Google several days or weeks to get to your region.

The whole purpose of the AR option is to make navigation with the highly popular Google Maps even easier and more straightforward. All you need to do is tap on a location in the app, hit the Directions button, then select the Walking option. After that, you should find the option to ‘Live View’ those directions towards the bottom of the screen. With Live View enabled, you’ll see some gigantic, handy arrows appear in the middle of the street (or wherever you are) telling you the right direction to head.

Obviously, this sort of feature isn’t supposed to make you continually hold your phone up and look like a lost kitten. You can simply bring it up when required to let you know you’ve gone the wrong way – or are going the right way. It’s just one of a number of updates Google has added to the app, including being able to see all of your flight and hotel reservations in one place, or finding a nice restaurant and booking a table, all without leaving the app.

Google Maps

While AR might be seen as the little brother to VR, it’s often thought of as having the greatest potential in the long run. Apart from apps like Google Maps, much of the AR content consumers are coming across at the moment is videogames such as Harry Potter: Wizards Unite and Minecraft Earth. VRFocus will continue its coverage of AR, reporting back with the latest updates.

Google Teases AR Maps Integration to Help You Navigate By Sight

Aparna Chennapragada, VP of Product for AR/VR at Google, took to the stage at Google’s I/O developer conference today to tease some of the work the company is doing to integrate augmented reality into Google Maps.

Chennapragada says the team has combined the smartphone’s camera, computer vision, Street View and Maps to “reimagine walking navigation,” essentially letting you view navigational cues using capabilities already built into Google’s augmented reality-capable Camera app thanks to ARCore.

Chennapragada presented a few possible use-cases, including a simple walking navigation scenario replete with blinking navigational arrows superimposed onto the physical world.

Image courtesy Google

Teasing a bit more, Chennapragada mentioned other possible features including integration of landmark recognition, and even a little fox-buddy to help lead the way.

Image courtesy Google

“Enabling these kinds of experiences though, GPS alone doesn’t cut it. So that’s why we’ve been working on VPS – visual positioning system, that can estimate precise positioning and orientation,” she said.

First revealed at last year’s I/O, VPS is said to use the visual features of the physical world to position you within it more precisely. While not specifically mentioned during the presentation, it was previously touted for its ability to take you where GPS can’t, i.e. out of satellite range, with the ability to give you turn-by-turn directions indoors.

While Google tiptoed around any specific announcement of when such an AR Maps feature could be coming, the demo was certainly a promising step in a decidedly augmented direction.

Google Unveil AR Visual Navigation

The Google I/O keynote contained a number of interesting new and improved technologies, such as new features for the Google Assistant. One of the reveals during the first day keynote involved a new augmented reality (AR) navigation tool for Google Maps.

During the first day keynote, Aparna Chennapragada spoke about some new features for Google Maps, discussing how the requirements users have for Google Maps have changed and how much more is now needed from the app.

Google Maps AR / Google Lens

To help provide for these changing needs, the Google Maps team have worked to integrate Google Maps with the smartphone camera. To illustrate how this would work, Chennapragada talked through an example taken from real life: imagine exiting a train or subway station on your way to an appointment. Google Maps says to go south on High Street, but how do you know which way is south? And if it is an unfamiliar location, how do you know which street is High Street? This is where the camera and AR integration comes in.

Instead of a top-down map, users will be able to see the street in front of them through the camera, with an AR overlay arrow pointing the direction and distance. The map view is just below, so users can double check that the two match up properly. The Google Maps team have even been experimenting with an animated guide character that you can follow, such as the animated fox shown briefly in the demo.

In addition, the Maps and Camera integration can be used to show users what shops, landmarks, hotels and restaurants are nearby, by tagging the information from Maps to the correct building, making it easier for users to find a location they are searching for.

GPS alone lacks the precision needed to make this possible, so Google has been working on implementing a new system, referred to as VPS, or the Visual Positioning System, which can estimate a more precise position and orientation. VPS uses the visual features of the environment to provide a precise location.

Google Lens / Google Maps AR Fox

Further news from Google I/O will continue to be reported here on VRFocus.