Oculus Quest Gets iPhone Notifications, File Management App

The latest Oculus Quest software enables iOS users to see their iPhone lock screen notifications while in VR. The update also adds a file management app.

The phone notifications feature in the latest v29 update to Quest’s system software should be rolling out to headsets over a couple of weeks, as Facebook typically releases new features to headsets worldwide on a gradual basis. While the notifications are iOS-only for now, Facebook notes that Android support is “coming soon as well.” The feature should respect multi-user settings, so “notifications will not be visible to any other account holders signed into the headset,” according to the company.

The file management app in the same update offers the ability “to access, browse, manage, share, and upload files located on your headset across multiple locations in VR,” according to Facebook. While you could already handle some file management through the built-in web browser, the arrival of a centralized file management app should make things easier.

The notifications feature is part of system-level updates to Facebook’s “Infinite Office” set of features meant to make it possible to get real work done in VR. In the previous v28 update, Facebook added the ability to mark the location of your desk in VR so you know where it is, as well as the ability to track the location of a specific Logitech keyboard so you can more easily type with the headset on. Oculus Consulting CTO John Carmack recently suggested Facebook was on a path to try to displace tablets and Chromebooks for some budget-conscious computer buyers with its VR headsets.

Facebook provided the following video demo showing how the notifications feature works while inside the headset.

How To Use AR On iPhone 12 Pro To Measure Someone’s Height

The iPhone 12 Pro and iPhone 12 Pro Max let users instantly measure someone’s height using AR, thanks to the LiDAR scanner newly added to the Pro models.

The feature is available in Apple’s Measure app and uses LiDAR-enhanced AR to measure the height of any person standing in frame. You can measure to the top of their hat, head, or hair, and it even works with people sitting in a chair.

The feature is only available on the high-end iPhone 12 Pro and 12 Pro Max models. The standard iPhone 12 and the upcoming iPhone 12 Mini do have AR capabilities, but they lack the LiDAR scanner that enhances AR functionality. The omission of the height-measuring feature on the standard and Mini models suggests that the LiDAR sensor is the missing ingredient on those phones.


When using the Measure app on a 12 Pro or 12 Pro Max, the feature should work automatically: all you have to do is position the phone’s camera so that the person you’re measuring appears in frame from head to toe. After that, an AR overlay should soon appear with a line marking the top of the person’s head and their measured height. You can tap the photo button in the bottom right to take a screenshot of the measurement, accessible anytime in your photo library.
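
Apple hasn’t documented how Measure computes the number, but the underlying geometry is straightforward once every pixel has a depth value: back-project the head and feet pixels into 3D and take the distance between them. Below is a minimal Python sketch of that idea using the pinhole camera model; the pixel coordinates, depth readings, and intrinsics (fx, fy, cx, cy) are all invented for illustration.

```python
import numpy as np

def unproject(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with a depth reading (meters) into 3D
    camera coordinates using the standard pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Invented values: pixel rows for head and feet, LiDAR depths at those
# pixels, and camera intrinsics -- none of these come from Apple.
intrinsics = dict(fx=1500.0, fy=1500.0, cx=640.0, cy=960.0)
head = unproject(u=640, v=250, depth=2.1, **intrinsics)
feet = unproject(u=640, v=1500, depth=2.1, **intrinsics)

print(f"Estimated height: {np.linalg.norm(head - feet):.2f} m")  # ~1.75 m
```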

With the inclusion of the LiDAR sensor, this measurement feature is just one of many expected advancements for AR on iPhone 12 Pro models. “iPhone 12 Pro uses a LiDAR Scanner to measure how long it takes light to reflect back from objects,” Apple explains on its website. “So it can create a depth map of whatever space you’re in. Because it’s ultrafast and accurate, AR apps can now transform a room into a realistic rainforest or show you how a new sneaker will fit.”

You can read more about the height measurement feature here.

LiDAR Scanner For iPhone 12 Pro Enables “Instant AR”


Apple yesterday unveiled its latest lineup of smartphones, including the iPhone 12 Pro and the iPhone 12 Pro Max, both of which are equipped with a LiDAR scanner that will expand their AR capabilities.


Like the iPad Pro introduced earlier this year, Apple’s new high-end smartphones, the iPhone 12 Pro and 12 Pro Max, are coming to market with a LiDAR scanner.

LiDAR is a so-called time-of-flight depth sensor: it measures how long it takes light to bounce off objects in the scene and return to the sensor. With precise timing, that information is used to estimate the depth of every individual point. Rich depth information lets augmented reality experiences be faster and more accurate.
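
To make the timing concrete: the pulse travels to the object and back, so the measured distance is half the round-trip time multiplied by the speed of light. A minimal sketch of just that relationship (the physics, not the sensor’s actual signal processing):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_depth(round_trip_seconds: float) -> float:
    """Depth from a time-of-flight reading: half the round-trip path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A return after ~13.3 nanoseconds puts the surface about 2 m away,
# which is why these sensors need sub-nanosecond timing precision.
print(f"{tof_depth(13.3e-9):.2f} m")  # 1.99 m
```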

Apple says that LiDAR in the iPhone 12 Pro and 12 Pro Max means the phones can deliver “instant AR.” That’s because LiDAR captures depth information in the equivalent of a “single photo,” without the phone having to move or images having to be compared over time. Of course, the camera, and with it the smartphone, can still be moved to build a complete, highly accurate 3D map of a room.

Apple’s iPhone 12 Pro costs €1,120 and launches on October 23, while the larger iPhone 12 Pro Max costs €1,334.45 and arrives on November 13. The company’s other newly introduced phones, the iPhone 12 and the iPhone 12 Mini, do not include a LiDAR sensor.

(Source: Road to VR)

This post originally appeared on VR∙Nerds.

Apple Announces iPhone 12 Pro With LiDAR-Enhanced AR

Apple announced a new iPhone with the same kind of LiDAR 3D-sensing technology that shipped earlier this year on the iPad Pro.

The new feature on the iPhone 12 Pro promises much-improved AR, including what Apple pitches as better photo quality and instantaneous placement of virtual content in your environment. It may also make 3D scanning of real-world objects and places a more common practice.

This is breaking news, so expect updates.

Apple’s Investment In Lidar Could Be Big For AR

While many of Apple’s investments in innovative technologies pay off, some just don’t: Think back to the “tremendous amount” of money and engineering time it spent on force-sensitive screens, which are now in the process of disappearing from Apple Watches and iPhones, or its work on Siri, which still feels like it’s in beta nine years after it was first integrated into iOS. In some cases, Apple’s backing is enough to take a new technology into the mainstream; in others, Apple gets a feature into a lot of devices only for the innovation to go nowhere.

Lidar has the potential to be Apple’s next “here today, gone tomorrow” technology. The laser-based depth scanner was the marquee addition to the 2020 iPad Pro that debuted this March, and has been rumored for nearly two years as a 2020 iPhone feature. Recently leaked rear glass panes for the iPhone 12 Pro and Max suggest that lidar scanners will appear in both phones, though they’re unlikely to be in the non-Pro versions of the iPhone 12. Moreover, they may be the only major changes to the new iPhones’ rear camera arrays this year.

If you don’t fully understand lidar, you’re not alone. Think of it as an extra camera that rapidly captures a room’s depth data rather than creating traditional photos or videos. To users, visualizations of lidar look like black-and-white point clouds focused on the edges of objects, but when devices gather lidar data, they know relative depth locations for the individual points and can use that depth information to improve augmented reality, traditional photography, and various computer vision tasks. Unlike a flat photo, a depth scan offers a finely detailed differentiation of what’s close, mid range, and far away.
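
As a toy illustration of that last point, once each pixel carries a depth value, separating close, mid-range, and far regions takes only a couple of lines; the depth map and thresholds below are invented:

```python
import numpy as np

# Invented 3x4 depth map (meters) standing in for one lidar capture.
depth = np.array([[0.8, 0.9, 3.1,  9.5],
                  [0.7, 1.0, 3.0,  9.9],
                  [0.9, 2.8, 3.2, 10.2]])

# Bin each point: 0 = close (< 1.5 m), 1 = mid range, 2 = far (> 5 m).
labels = np.digitize(depth, bins=[1.5, 5.0])
print(labels)
```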

Six months after lidar arrived in the iPad Pro, the hardware’s potential hasn’t been matched by Apple software. Rather than releasing a new user-facing app to show off the feature or conspicuously augmenting the iPad’s popular Camera app with depth-sensing tricks, Apple pitched lidar to developers as a way to instantly improve their existing AR software — often without the need for extra coding. Room-scanning and depth features previously implemented in apps would just work faster and more accurately than before. As just one example, AR content composited on real-world camera video could automatically hide partially behind depth-sensed objects, a feature known as occlusion.
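
Conceptually, occlusion reduces to a per-pixel depth comparison between the virtual content and the sensed scene. Here is a minimal numpy sketch of that test; it illustrates the idea, not ARKit’s actual rendering pipeline:

```python
import numpy as np

def composite_with_occlusion(camera_rgb, virtual_rgb, scene_depth, virtual_depth):
    """Show the virtual content only where it is closer to the camera
    than the real scene; elsewhere the real object occludes it."""
    virtual_in_front = virtual_depth < scene_depth
    return np.where(virtual_in_front[..., None], virtual_rgb, camera_rgb)

# Toy 2x2 frame: a virtual object 1.5 m away is hidden behind a real
# object at 1.0 m (left column) but visible against a 3.0 m wall.
scene_depth = np.array([[1.0, 3.0],
                        [1.0, 3.0]])
virtual_depth = np.full((2, 2), 1.5)
camera_rgb = np.zeros((2, 2, 3))
virtual_rgb = np.ones((2, 2, 3))
print(composite_with_occlusion(camera_rgb, virtual_rgb, scene_depth, virtual_depth)[..., 0])
```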

In short, adding lidar to the iPad Pro made a narrow category of apps a little better on a narrow slice of Apple devices. From a user’s perspective, the best Apple-provided examples of the technology’s potential were hidden in the Apple Store app, which can display 3D models of certain devices (Mac Pro, yes; iMac, no) in AR, and iPadOS’ obscure “Measure” app, which previously did a mediocre job of guesstimating real-world object lengths, but did a better job after adding lidar. It’s worth underscoring that those aren’t objectively good examples, and no one in their right mind — except an AR developer — would buy a device solely to gain such marginal AR performance improvements.

Whether lidar will make a bigger impact on iPhones remains to be seen. If it’s truly a Pro-exclusive feature this year, not only will fewer people have access to it, but developers will have less incentive to develop lidar-dependent features. Even if Apple sells tens of millions of iPhone 12 Pro devices, they’ll almost certainly follow the pattern of the iPhone 11, which reportedly outsold its more expensive Pro brethren across the world. Consequently, lidar would be a comparatively niche feature, rather than a baseline expectation for all iPhone 12 series users.

[Image] Above: Portrait Mode lets you adjust background blur (bokeh) from f/1.4 to f/16 after taking a photo. Image Credit: Jeremy Horwitz/VentureBeat

That said, if Apple uses the lidar hardware properly in the iPhones, it could become a bigger deal and differentiator going forward. Industry scuttlebutt suggests that Apple will use lidar to improve the Pro cameras’ autofocus features and depth-based processing effects, such as Portrait Mode, which artificially blurs photo backgrounds to create a DSLR-like “bokeh” effect. Since lidar’s invisible lasers work in pitch black rooms — and quickly — they could serve as a better low-light autofocus system than current techniques that rely on minute differences measured by an optical camera sensor. Faux bokeh and other visual effects could and likely will be applicable to video recordings, as well. Developers such as Niantic could also use the hardware to improve Pokémon Go for a subset of iPhones, and given the massive size of its user base, that could be a win for AR gamers.
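
If those rumors pan out, the core of a depth-based Portrait Mode is easy to sketch: keep pixels near the subject’s distance sharp and blur the rest. A rough illustration, assuming a per-pixel depth map is available; a real pipeline would vary blur radius continuously with distance and matte hair and edges far more carefully:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_bokeh(rgb, depth, subject_depth, tolerance=0.3, sigma=6):
    """Blur every pixel whose depth differs from the subject's by more
    than `tolerance` meters, approximating shallow depth of field."""
    blurred = gaussian_filter(rgb.astype(float), sigma=(sigma, sigma, 0))
    in_focus = np.abs(depth - subject_depth) < tolerance
    return np.where(in_focus[..., None], rgb, blurred)

# Usage: rgb is an H x W x 3 image, depth an H x W depth map in meters,
# and subject_depth the focus distance (e.g. from face detection).
```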

Apple won’t be the first company to offer a rear depth sensor in a phone. Samsung introduced a similar technology in the Galaxy S10 series last year, adding it to subsequent Note 10 and S20 models, but a lack of killer apps and performance issues reportedly led the company to drop the feature from the Note 20 and next year’s S series. While Samsung is apparently redesigning its depth sensor to better rival the Sony-developed Lidar Scanner Apple uses in its devices, finding killer apps for the technology may remain challenging.

Though consumer and developer interest in depth sensing technologies may have (temporarily) plateaued, there’s been no shortage of demand for higher-resolution smartphone cameras. Virtually every Android phone maker leaped forward in sensor technology this year, such that even midrange phones now commonly include at least one camera with 4 to 10 times the resolution of Apple’s iPhone sensors. Relying on lidar alone won’t help Apple bridge the resolution gap, but it may further its prior claims that it’s doing the most with its smaller number of pixels.

Ultimately, the problems with Apple-owned innovations such as 3D Touch, Force Touch, and Siri haven’t come down to whether the technologies are inherently good or bad, but whether they’ve been widely adopted by developers and users. As augmented reality hardware continues to advance — and demand fast, room-scale depth scanning for everything from object placement to gesture control tracking — there’s every reason to believe that lidar is going to be either a fundamental technology or a preferred solution. But Apple is going to need to make a better case for lidar in the iPhone than it has on the iPad, and soon, lest the technology wind up forgotten and abandoned rather than core to the next generation of mobile computing.


This post by Jeremy Horwitz originally appeared on VentureBeat.

The new generation of AR-media Plugins is coming

Back in 2008, when the first generation of AR-media Plugins was released, Augmented Reality wasn’t the solid and promising technology it is today; it was more of a risky bet.
At Inglobe Technologies, we have been among the pioneering companies worldwide to believe in AR and to invest in the development of products and solutions for Creators, Developers and Enterprises.

Since then, the AR-media Plugins have written an important page in the history of Augmented Reality, earning the trust of over 50,000 registered users worldwide who have been creating immersive content at an amazing pace.

Why have the AR-media Plugins been so successful?

The reason for this success is that Inglobe Technologies understood that many Content Creators were looking for innovative ways to present their work to clients, colleagues, or students, even without coding skills. The plugins have been used with satisfaction by Designers, Architects, Engineers, Marketers, Publishers, Teachers and many more.

The AR-media Plugins were designed starting from that intuition, providing 3D content creators with user-friendly, coding-free tools that allowed an easy authoring workflow starting from the 3D modeling software of their choice. Supported products included SketchUp, 3D Studio Max, Maya, Cinema 4D, Vectorworks and Scia Engineering.

Users could easily prepare their Augmented Reality experiences and then visualize them by means of the AR-media Player.
Back then, Augmented Reality visualization wasn’t an all-mobile, all-wearable affair; it was more common to view AR content using a webcam connected to a personal computer, pointed at a previously printed marker.

The Smartphone changed… everything

In 2008, just as we were releasing the first generation of AR-media Plugins, Apple’s iPhone was paving the way for a revolution in mobile devices that is still changing our lives today.
The advent of the smartphone era was vital to the rise of Augmented and Virtual Reality because it made interacting with AR content far easier, a trend that kept growing as smartphones became more and more powerful.

A more recent and equally inevitable revolution started with the advent of wearable devices as we know them today, when Google released Google Glass in 2013. This type of device has not yet reached the maturity of smartphones, but it is changing the way AR and VR content is experienced.

Mobile and wearable devices have changed our lives profoundly, putting an enormous amount of technology and computational power at our fingertips for a relatively low price and opening the way for a major leap in the Augmented Reality market.
For this reason, we decided it was time to completely redesign our AR-media Plugins and Platform in order to provide Content Creators with a renewed, more powerful set of tools.

What can you expect from the next generation of the AR-media Platform?

We are actively working towards the release of the new generation of the AR-media Platform and are currently testing some of the exciting new features that will be included.
In the new version you will find a cloud-based authoring tool, the AR-media Platform, where you will be able to easily create advanced AR and VR content without any coding; a set of exporters, the AR-media Plugins, which we’ll release gradually for the major 3D modeling software (Trimble SketchUp, 3ds Max, etc.); and a companion mobile AR-media App.

Note that the AR-media Plugins will help you export content properly from its native program, but they won’t be mandatory: you will be able to export any model following our guidelines and upload it directly to the AR-media Platform.

As usual, the viewer for the experiences you create will be the AR-media Player, which will be available for both iOS and Android from day one.

Among the new features we can already disclose the following will be included:

  • Cloud based Experience Creation (AR-media Platform)
  • Optional exporters for the major 3D modeling software (AR-media Plugins)
  • Mobile viewer app for iOS and Android (AR-media Player)
  • Spatial mapping to view your content in the environment (indoor and outdoor) for a markerless experience
  • Many more to come…

Please keep following our channels to be the first to know when the new AR-media Plugins become available to the public!

Follow our Linkedin page
Follow our Facebook page
Subscribe to our YouTube Channel