Augmented eyes on Apple at developer conference

New computers, iPad overhaul and expanded Messages app on the cards, with AR glasses a possibility

Apple is to reveal details of the software updates coming to its phones, tablets and computers at the company’s annual Worldwide Developers Conference (WWDC).

But while new computers, an expanded Messages app, and an overhaul of the iPad’s software to make it more like a laptop are all on the cards, the biggest question mark on Monday is whether Apple will show any evidence of its forthcoming augmented reality – or AR – glasses.


Oculus Quest Gets iPhone Notifications, Files App For Browser

The latest Oculus Quest software enables iOS users to see their iPhone lock screen notifications while in VR. The update also adds a file management app.

The phone notifications feature in the latest v29 update to Quest’s system software should be rolling out to headsets over the next couple of weeks, as Facebook typically releases new features worldwide on a gradual basis. While the notifications are iOS-only for now, Facebook notes that Android support is “coming soon as well.” The feature should respect multi-user settings, so “notifications will not be visible to any other account holders signed into the headset,” according to the company.

The file management app in the same update offers the ability to “to access, browse, manage, share, and upload files located on your headset across multiple locations in VR,” according to Facebook. While you could already access some file management features with the built-in Web browser, the arrival of a centralized file management app should make things easier.

The notifications feature is part of system-level updates to Facebook’s “Infinite Office” set of features meant to make it possible to get real work done in VR. In the previous v28 update, Facebook added the ability to mark the location of your desk in VR so you know where it is, as well as the ability to track the location of a specific Logitech keyboard so you can more easily type with the headset on. Oculus Consulting CTO John Carmack recently suggested Facebook was on a path to try to displace tablets and Chromebooks for some budget-conscious computer buyers with its VR headsets.

Facebook provided the following video demo showing how the notifications feature works inside the headset.

LiDAR Scanner for iPhone 12 Pro Enables “Instant AR”


Yesterday, Apple unveiled its latest lineup of smartphones, including the iPhone 12 Pro and the iPhone 12 Pro Max, both of which are equipped with a LiDAR scanner that will expand their AR capabilities.


Like the iPad Pro introduced earlier this year, Apple’s new high-end smartphones, the iPhone 12 Pro and 12 Pro Max, now ship with a LiDAR scanner.

LiDAR is a so-called “time-of-flight” depth sensor: it measures how long it takes for light to bounce off objects in the scene and return to the sensor. With precise timing, this information is used to estimate the depth of each individual point. Rich depth information allows augmented reality experiences to be faster and more accurate.

Apple says that LiDAR in the iPhone 12 Pro and 12 Pro Max means the phones will be able to deliver “instant AR.” That is because LiDAR captures depth information in the equivalent of a “single photo,” without the phone having to be moved or images compared over time. Of course, the camera (or the phone) can still be moved to build a complete, highly accurate 3D map of a room.
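The time-of-flight principle described above can be shown with a few lines of arithmetic. This is an illustrative sketch, not Apple’s implementation; the function name and the example pulse time are assumptions.

```python
# Converting a LiDAR round-trip time measurement into a depth estimate.
C = 299_792_458.0  # speed of light in m/s

def depth_from_round_trip(t_seconds: float) -> float:
    """Light travels to the object and back, so depth is half the path."""
    return C * t_seconds / 2.0

# A pulse returning after ~20 nanoseconds corresponds to roughly 3 m:
print(depth_from_round_trip(20e-9))  # ≈ 2.998 m
```

The nanosecond-scale numbers involved are why such sensors need extremely precise timing hardware.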

Apple’s iPhone 12 Pro costs €1,120 and launches on October 23, while the larger iPhone 12 Pro Max costs €1,334.45 and arrives on November 13. The company’s other newly introduced phones, the iPhone 12 and the iPhone 12 mini, do not include a LiDAR sensor.

(Source: Road to VR)

The post LiDAR Scanner for iPhone 12 Pro Enables “Instant AR” first appeared on VR∙Nerds.

Apple’s Investment In Lidar Could Be Big For AR

While many of Apple’s investments in innovative technologies pay off, some just don’t: Think back to the “tremendous amount” of money and engineering time it spent on force-sensitive screens, which are now in the process of disappearing from Apple Watches and iPhones, or its work on Siri, which still feels like it’s in beta nine years after it was first integrated into iOS. In some cases, Apple’s backing is enough to take a new technology into the mainstream; in others, Apple gets a feature into a lot of devices only for the innovation to go nowhere.

Lidar has the potential to be Apple’s next “here today, gone tomorrow” technology. The laser-based depth scanner was the marquee addition to the 2020 iPad Pro that debuted this March, and has been rumored for nearly two years as a 2020 iPhone feature. Recently leaked rear glass panes for the iPhone 12 Pro and Max suggest that lidar scanners will appear in both phones, though they’re unlikely to be in the non-Pro versions of the iPhone 12. Moreover, they may be the only major changes to the new iPhones’ rear camera arrays this year.

If you don’t fully understand lidar, you’re not alone. Think of it as an extra camera that rapidly captures a room’s depth data rather than creating traditional photos or videos. To users, visualizations of lidar look like black-and-white point clouds focused on the edges of objects, but when devices gather lidar data, they know relative depth locations for the individual points and can use that depth information to improve augmented reality, traditional photography, and various computer vision tasks. Unlike a flat photo, a depth scan offers a finely detailed differentiation of what’s close, mid-range, and far away.
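The near/mid/far differentiation described above can be sketched in a few lines. This is a toy illustration; the distance thresholds and the stand-in point cloud are assumptions, not values from any real sensor.

```python
# Classify per-point depth data into the coarse bands a depth scan
# makes possible (a flat photo cannot distinguish these directly).
def classify_depth(depth_m: float) -> str:
    if depth_m < 1.0:
        return "near"
    elif depth_m < 3.0:
        return "mid-range"
    return "far"

# A tiny stand-in for a lidar point cloud: (x, y, depth in meters).
points = [(0.1, 0.2, 0.5), (0.4, 0.1, 2.0), (0.9, 0.8, 4.5)]
labels = [classify_depth(d) for _, _, d in points]
print(labels)  # ['near', 'mid-range', 'far']
```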

Six months after lidar arrived in the iPad Pro, the hardware’s potential hasn’t been matched by Apple software. Rather than releasing a new user-facing app to show off the feature or conspicuously augmenting the iPad’s popular Camera app with depth-sensing tricks, Apple pitched lidar to developers as a way to instantly improve their existing AR software — often without the need for extra coding. Room-scanning and depth features previously implemented in apps would just work faster and more accurately than before. As just one example, AR content composited on real-world camera video could automatically hide partially behind depth-sensed objects, a feature known as occlusion.
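The occlusion behavior mentioned above boils down to a simple per-pixel depth comparison. The sketch below is a hedged simplification of that idea, not ARKit’s actual pipeline; the function and variable names are illustrative.

```python
# A virtual pixel is hidden when the real-world surface measured by the
# depth sensor is closer to the camera than the virtual content there.
def is_occluded(real_depth_m: float, virtual_depth_m: float) -> bool:
    return real_depth_m < virtual_depth_m

# A virtual object 2 m away, behind a real sofa at 1.5 m, is hidden:
print(is_occluded(1.5, 2.0))  # True
# The same object in front of a wall 4 m away stays visible:
print(is_occluded(4.0, 2.0))  # False
```

In practice the comparison runs per pixel against the sensor’s depth map, which is why better depth data directly improves how convincing occlusion looks.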

In short, adding lidar to the iPad Pro made a narrow category of apps a little better on a narrow slice of Apple devices. From a user’s perspective, the best Apple-provided examples of the technology’s potential were hidden in the Apple Store app, which can display 3D models of certain devices (Mac Pro, yes; iMac, no) in AR, and iPadOS’ obscure “Measure” app, which previously did a mediocre job of guesstimating real-world object lengths, but did a better job after adding lidar. It’s worth underscoring that those aren’t objectively good examples, and no one in their right mind — except an AR developer — would buy a device solely to gain such marginal AR performance improvements.

Whether lidar will make a bigger impact on iPhones remains to be seen. If it’s truly a Pro-exclusive feature this year, not only will fewer people have access to it, but developers will have less incentive to develop lidar-dependent features. Even if Apple sells tens of millions of iPhone 12 Pro devices, they’ll almost certainly follow the pattern of the iPhone 11, which reportedly outsold its more expensive Pro brethren across the world. Consequently, lidar would be a comparatively niche feature, rather than a baseline expectation for all iPhone 12 series users.


Above: Portrait Mode lets you adjust background blur (bokeh) from f/1.4 to f/16 after taking a photo.

Image Credit: Jeremy Horwitz/VentureBeat

That said, if Apple uses the lidar hardware properly in the iPhones, it could become a bigger deal and differentiator going forward. Industry scuttlebutt suggests that Apple will use lidar to improve the Pro cameras’ autofocus features and depth-based processing effects, such as Portrait Mode, which artificially blurs photo backgrounds to create a DSLR-like “bokeh” effect. Since lidar’s invisible lasers work in pitch black rooms — and quickly — they could serve as a better low-light autofocus system than current techniques that rely on minute differences measured by an optical camera sensor. Faux bokeh and other visual effects could and likely will be applicable to video recordings, as well. Developers such as Niantic could also use the hardware to improve Pokémon Go for a subset of iPhones, and given the massive size of its user base, that could be a win for AR gamers.
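The depth-based bokeh idea above can be illustrated with a toy formula: blur grows with distance from the focal plane and with a wider aperture (smaller f-number). This is not Apple’s algorithm; the function, the scaling constant, and the example distances are all assumptions for illustration.

```python
# Synthetic background blur driven by per-pixel depth data.
def blur_radius_px(depth_m: float, focus_m: float, f_number: float,
                   strength: float = 40.0) -> float:
    """Approximate blur radius in pixels for a synthetic bokeh effect."""
    return strength * abs(depth_m - focus_m) / (depth_m * f_number)

# At f/1.4 a background 3 m behind a 1.5 m subject blurs strongly...
wide = blur_radius_px(depth_m=4.5, focus_m=1.5, f_number=1.4)
# ...while f/16 leaves it nearly sharp.
narrow = blur_radius_px(depth_m=4.5, focus_m=1.5, f_number=16.0)
print(wide > narrow)  # True
```

This mirrors the f/1.4-to-f/16 adjustment shown in the Portrait Mode screenshot: the smaller the f-number, the stronger the simulated blur.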

Apple won’t be the first company to offer a rear depth sensor in a phone. Samsung introduced a similar technology in the Galaxy S10 series last year, adding it to subsequent Note 10 and S20 models, but a lack of killer apps and performance issues reportedly led the company to drop the feature from the Note 20 and next year’s S series. While Samsung is apparently redesigning its depth sensor to better rival the Sony-developed Lidar Scanner Apple uses in its devices, finding killer apps for the technology may remain challenging.

Though consumer and developer interest in depth sensing technologies may have (temporarily) plateaued, there’s been no shortage of demand for higher-resolution smartphone cameras. Virtually every Android phone maker leaped forward in sensor technology this year, such that even midrange phones now commonly include at least one camera with 4 to 10 times the resolution of Apple’s iPhone sensors. Relying on lidar alone won’t help Apple bridge the resolution gap, but it may further its prior claims that it’s doing the most with its smaller number of pixels.

Ultimately, the problems with Apple-owned innovations such as 3D Touch, Force Touch, and Siri haven’t come down to whether the technologies are inherently good or bad, but whether they’ve been widely adopted by developers and users. As augmented reality hardware continues to advance — and demand fast, room-scale depth scanning for everything from object placement to gesture control tracking — there’s every reason to believe that lidar is going to be either a fundamental technology or a preferred solution. But Apple is going to need to make a better case for lidar in the iPhone than it has on the iPad, and soon, lest the technology wind up forgotten and abandoned rather than core to the next generation of mobile computing.


This post by Jeremy Horwitz originally appeared on VentureBeat.

Place Lamborghini’s new Huracan EVO RWD Spyder in Your Driveway Using AR

Lamborghini Huracan Evo

Carmakers the world over have utilised immersive technology from the design stages all the way through to marketing their latest models. Illustrious sports car manufacturer Lamborghini used virtual reality (VR) in 2015 to promote the Huracán, and now in 2020, the new V10 Huracan EVO RWD Spyder is getting the augmented reality (AR) treatment.


To see the new model, Lamborghini fans need to head to its official website on a compatible iPhone or iPad (AR requires iOS 11 and an A9 processor or later) using Apple’s AR Quick Look. On the site, Apple users can simply tap “See in AR” to view the new open-top rear-wheel-drive model anywhere they like, from their living rooms to their driveways.

The AR experience enables the Huracan EVO RWD Spyder to be rotated and scaled as required, all the way up to 1:1 scale, so viewers can freely examine its exterior and interior details up close. The model is rendered with a high level of photorealism, and viewers can also take pictures of the car.

This is the first time Lamborghini has used AR to promote a new model, arriving as countries battle to contain the coronavirus (COVID-19) pandemic through lockdown measures. Lamborghini will soon expand this functionality to its entire range.


“At a time of major business challenges, Lamborghini is innovating once again and exploring new methods of communication. New technologies have accelerated rapidly during this time of global emergency, and Lamborghini is pioneering exciting new possibilities. Starting tomorrow, Lamborghini can be in everyone’s home thanks to Apple’s AR technology, which is available on hundreds of millions of AR-enabled devices around the world,” said Stefano Domenicali, Chairman and Chief Executive Officer of Automobili Lamborghini in a statement.

The AR model will likely be the closest most of us come to the V10 Huracan EVO RWD Spyder, not least because of the pandemic. With a suggested retail price of £151,100.00 (approximately $186,000 USD), the car offers owners 0-62 mph acceleration in just 3.5 seconds and a top speed of 201 mph. Its soft-top roof reportedly stows away within 17 seconds, at speeds of up to 31 mph.

As further examples of carmakers using VR and AR arise, VRFocus will keep you updated.

New iPad Pro Gets LiDAR Scanner for Improved AR

LiDAR, the light detection and ranging technology, is usually found in commercial and industrial equipment for things like 3D scanning buildings, land, and other objects. Now, Apple announced that its latest iPad Pro includes LiDAR hardware, making it what the company calls “the world’s best device for augmented reality.”

LiDAR is able to create a depth map by measuring how long it takes light to reach an object and reflect back. To that end, Apple’s new custom-designed LiDAR scanner is said to operate at “nano-second speeds” and work up to five meters (~16.5 ft) from an object, something the company says it can do both indoors and out. This essentially gives it a higher-fidelity way of mapping a room for more accurate AR scenarios.
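A quick back-of-the-envelope check shows why “nano-second speeds” are required at the scanner’s stated five-meter range. This worked example is ours, not Apple’s specification.

```python
# Round-trip time for light to reach a surface and return to the sensor.
C = 299_792_458.0  # speed of light in m/s

def round_trip_ns(distance_m: float) -> float:
    """Time in nanoseconds for light to travel out and back."""
    return 2.0 * distance_m / C * 1e9

# At the five-meter maximum range, the echo returns in ~33 nanoseconds:
print(round(round_trip_ns(5.0), 1))  # ≈ 33.4 ns
```

Resolving depth differences of a few centimeters therefore means timing light to within a fraction of a nanosecond.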

Image courtesy Apple

To do this, Apple says its new depth frameworks in iPadOS combine depth points measured by the LiDAR scanner, data from both cameras and motion sensors, and on-device computer vision algorithms running on its A12Z Bionic chip.

The company has also integrated the new scanner to plug into its existing ARKit framework, giving all of the platform’s AR apps improved motion capture and people occlusion, Apple says. This close integration likely points to other Apple devices getting LiDAR for improved AR in the future, with the most likely suspect being the next flagship iPhone.


“Using the latest update to ARKit with a new Scene Geometry API, developers can harness the power of the new LiDAR Scanner to unleash scenarios never before possible,” the company says. “The LiDAR Scanner improves the Measure app, making it faster and easier to automatically calculate someone’s height, while helpful vertical and edge guides automatically appear to let users more quickly and accurately measure objects. The Measure app also now comes with Ruler View for more granular measurements and allows users to save a list of all measurements, complete with screenshots for future use.”

Priced at $800 for the 11-inch version and $1,000 for the 12.9-inch, the new iPad Pro’s LiDAR scanner comes alongside a list of other “pro” features, including new cameras, motion sensors, “pro” performance & audio, and a Liquid Retina display. Can’t forget that new Magic Keyboard, which Apple hopes will entice more users to finally make the switch from PC laptops to iPads. Check out the new iPad Pro here.

The post New iPad Pro Gets LiDAR Scanner for Improved AR appeared first on Road to VR.