I tried the Apple Vision Pro and saw the future, but don’t buy it yet

Maria Korolov at the Apple store in Holyoke, Massachusetts.

This past weekend, I went to the local Apple store and got a demo of the new Apple Vision Pro headset — the one where you’ll spend a minimum of $3,500 and more likely $4,000 if you buy it.

A few years ago, I traded my iPhone for an Android because Samsung released the Gear VR headset and Apple didn’t have anything similar in the pipeline.

I still miss my iPhone, but all the phone-based VR action has been on the Android side. However, Samsung dropped its Gear VR project, and Google stopped developing its Cardboard and Daydream View platforms.

So I’m open to going back to the Apple ecosystem, if there’s something worth switching for.

Is the Apple Vision Pro the reason to switch? No.

Was the demo educational? Yes, and I’m going to tell you what I learned.

And, at the end of this article, I’ll explain who should buy the headset now, and who should wait for two or three generations.

But first, why I’m not going to switch to the iPhone and buy an Apple Vision Pro, even though I cover tech and could deduct it as a business expense.

I can’t do my work on it

Even putting aside the fact that my work computers are all Windows, and the Vision Pro only pairs with Apple computers — and late-model computers at that — the headset itself isn’t optimal for prolonged use.

It’s heavy, so you don’t want to wear it for hours. There’s no usable virtual keyboard — you’d need a physical keyboard anyway. It’s hard to drink coffee in it. And you can’t attend Zoom meetings in it. Yes, you can FaceTime — but only as a cartoon avatar.

And, of course, it doesn’t replace a computer. It’s an add-on to a computer. It’s basically a single external monitor for my computer. I already have two giant monitors, and the prices for monitors are ridiculously cheap now, anyway. If I wanted to upgrade a monitor, I’d just upgrade the monitor itself and not switch to a VR headset.

 

Maria wearing the Apple Vision Pro.

Still, the graphics are amazing. I enjoyed the almost-completely-realistic resolution of the display.

There’s no killer app

I didn’t see anything during the demo that I absolutely had to have, and would use all the time.

If there were a killer app, then maybe I’d get the headset, and then use the other stuff too, because I’d have the headset on all the time anyway. Might as well do everything in VR.

That’s what happened with smartphones. We got the smartphones because we needed a phone anyway. You can’t live without a phone. And once you have the phone, might as well use it as a camera, as a GPS, as an ebook reader, as a music player, as a note-taking app, as a calendar, and as a casual gaming device. Not to mention all the thousands of other apps you can use on a smartphone.

Will that happen with VR? No. Nobody is going to spend their life inside a VR headset.

It will happen with AR. I’m still totally convinced that AR glasses are the future. They will replace our phones, and, since we have the AR glasses on anyway, we’ll use them for work, we’ll use them for music and movies and games and social media and everything else.

But right now, we’re not there. The Apple Vision Pro wants to be there — its pass-through camera makes the device usable for augmented reality. But it’s not an always-on, always-with-you device. And until it is, we’ll still use all the other stuff.

It’s too expensive for just fun and games

Sure, there are a handful of games for the Vision Pro. And you can watch movies on a giant personal screen.

But there are far, far cheaper ways to play VR games, with much bigger selections. And there are already very cheap and lightweight glasses that let you watch movies if you want that kind of thing. Or you can just buy a slightly bigger TV. TVs are getting ridiculously cheap these days.

Also — I already own a big TV set. And I can watch my TV with other people. I can’t watch movies on the Vision Pro with other people.

Now, let’s talk about what I learned about the future from getting this demo.

Seeing an overlay over reality is awesome

Yes, the Meta Quest has a pass-through camera, but the video quality is lousy.

The Apple Vision Pro’s video quality is awesome. It’s almost like looking through a pane of glass. Not exactly glass — it fakes it with video — but close enough. I was extremely impressed.

And, Samsung showed off a transparent TV earlier this year.

So the idea of a transparent pair of glasses that can turn into AR or VR glasses on demand — it’s within reach. And these transparent glasses are going to be awesome for augmented reality. And we can replace our phones with these glasses.

If you want to see what that world will be like, go to the nearest Apple store and get your own Vision Pro demo.

The interface of the future will be gesture-based

Remember how, in Minority Report, Tom Cruise moved images around with his hands?

That’s what the Vision Pro interface is like. The headset has cameras that can see your hands. In fact, even when my hand was hanging down by my side, it still registered my pinching gesture.

No controllers necessary.

This will be great in the future when we wear smart glasses all the time because we won’t have to carry controllers around. The fewer things we have to carry around, the better.

But the big advance Apple made with the interface is the eye-tracking. To click on something, you just look at it and pinch your fingers. That means you don’t have to hold your hands up in the air in front of you all the time. That would get tiring. I mean, how long can Tom Cruise stand there, waving his arms around? No matter how fit you are, that’s going to wear you out.

And, like I said, you don’t need to raise your arm to make the pinching gesture. You can keep your hand down on your lap, or by your side, or on your desk.

(Photo by Terrence Smith.)

You still need to raise your arms to resize windows, or to drag them around, but how often do you need to resize a window, anyway?
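For the curious, here’s what that control scheme looks like from a developer’s side. This is a minimal sketch, assuming a standard SwiftUI app on visionOS, where the system’s default tap gesture is delivered by gaze plus pinch; the view and names below are illustrative, not Apple sample code.

```swift
import SwiftUI

// Minimal sketch (SwiftUI on visionOS assumed): the system's standard tap
// is triggered by looking at a control and pinching, so a plain Button
// needs no controller or explicit hand-tracking code.
struct PinchDemoView: View {
    @State private var pinchCount = 0

    var body: some View {
        VStack(spacing: 24) {
            Text("Pinched \(pinchCount) times")
            Button("Look at me and pinch") {
                // Fires when the user gazes at the button and pinches,
                // even with the hand resting in a lap or on a desk.
                pinchCount += 1
            }
        }
        .padding(40)
    }
}
```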

Who should buy the headset today

If you’re building an AR or VR platform for the future, you should definitely check out the Vision Pro and see what possibilities are offered by the pass-through camera and the eye-tracking-and-pinching control system.

But, unless your company is paying for the device, return it within the two-week period.

The only reason to keep it is if you are currently developing apps for the Apple Vision Pro. Then, you need the device to test your apps.

If you’re anyone else, buy a larger TV, a computer monitor, and a PlayStation VR or a Quest to play games on, and you’ll still come out around $3,000 ahead.

But do go and get a demo. It’s free, and it might give you some ideas for apps or business opportunities for a few years down the line.

Apple and the bane of VR gentrification

Apple Vision Pro. (Image courtesy Apple.)

I recently read a CNN article on Tim Cook and the risk he’s taking with Apple Vision Pro. The gist of it is this: The Vision Pro will be Apple’s riskiest launch in years and could end up being the product that defines Tim Cook’s legacy.

What struck me is the point of view. The main concern is the fate of one person, Tim Cook. And fair enough, his legacy at Apple might indeed be affected by the success or failure of the Vision Pro. But I can’t help thinking about how there’s high-visibility attention on a single wealthy-for-life individual — but far less talk about the societal impact of this latest Apple product, which, to my mind, conjures up the phrase “gentrification of extended reality technologies.”

Have you seen the Apple Vision Pro movie? It’s a statement not only about the product, but also about the affluence of the “neighborhood” that Apple associates with its target demographic.

The thing about gentrification of physical neighborhoods is that it implicitly demotes the preceding locals and the context of their lives, despite marketing claims to the contrary. I’ve watched this happen firsthand in the so-called Arts District in Los Angeles. Artists and artists’ lofts have, with few exceptions, given way to expensive upscale condominiums and trendy food and drink spots.

The first impression the Vision Pro movie makes is that Apple’s target demographic lives in immaculate upscale dwellings, ostensibly in an upscale neighborhood. Of course this complements the marketing of an AR-forward technology that involves seeing the physical environment while the projected interactions are displayed as a visual overlay.

However, a cluttered wall or a messy pile of clothes is also going to be part of the Vision Pro experience, and, I think, a major distraction. The minute I see highly staged and perfected environments in the marketing, I suspect that gentrification – in this case, gentrification of our dwellings – is in play.

Apple Vision Pro. (Image courtesy Apple.)

If that were the end of it, we could excuse it as marketing pretension to flatter the product. But then there’s the retail cost of $3,500 for the privilege of Vision Pro ownership, and the case for gentrification becomes unavoidable.

When VR and AR, component technologies of XR, were still emerging from niche implementations, it was interesting that Google created viewers out of cardboard, to take advantage of the ubiquitous technology of the cell phone and provide some form of XR experience to virtually anyone, anywhere. The cost of entry was exceedingly low, although we understandably bemoaned the lack of apps and motion sickness.

Subsequently, however, the push for a superior stand-alone headset has seen rising costs while still not achieving widespread adoption. Consumers have balked at the increasing retail price of the Meta Quest headset, which has doubled between the two latest versions. Still, I suppose it’s something to say that it comes in under $1,000, similar to the cost of a well-configured recent model iPhone.

Now with the Vision Pro, however, Apple has really upped the ante and set its sights on a privileged few. At $3,500, it costs five times as much as the Meta Quest 3 and ten times as much as the Meta Quest 2. It’s priced like a very well configured MacBook Pro, but without the corresponding breadth of software ecosystem to power it. 

Ideally, the cost of anything is, first and foremost, a reflection of its relative value.

Well, as shown in the Vision Pro movie, the primary functions of the Vision Pro are watching visuals and entertainment and making video chats. So, your friend can appear to be hovering over your bed as you chat and walk about the room. Amazing? Sure, but who needs this?

Perhaps truly absurd is the person packing a suitcase while wearing the headset and then taking a video call. It’s already a challenge to wear a headset and tethered battery at all, but to wear them while doing a real-world chore, just in case a call comes through? No one… literally no one with an ounce of practicality is going to do that.

Yet the implication is that if you want to stay connected, you should want to do that, at all times. Ironically it also suggests that your cell phone, which easily slips out of the way into a pocket or waits, also out of the way, patiently on a counter, has become just so… passé, so… inadequate. Gentrification of your phone calls never looked so sci-fi, yet so pointless.

Apple Vision Pro. (Image courtesy Apple.)

Consequently, my concern is that this whole class of technology still won’t become ubiquitous like the cell phone. The potential benefits of XR’s components, VR and AR, could be enormous for everyone. 

But like gentrification of a neighborhood, people will be priced out of the Apple XR privilege in droves. There will be fewer customers, but with necessarily greater economic means. Their needs and desires will take over the paradigm and be the influence for most content.

And consider this: the Apple marketing movie shows movie watching with the Vision Pro. Are you in a family of, let’s say, four? Well, that’s $14,000 in headsets for everyone to take part together.

Yet Apple touts an inclusive paradigm of the Vision Pro by displaying an uncanny valley version of your face on the headset to people who look at you. But rather than inclusivity, the implicit message is, “I live in a world you can’t experience without affluence.”

Apple Vision Pro. (Image courtesy Apple.)

I’m skeptical that Apple has cracked the code for selling the world on XR, but we may nonetheless be witnessing the gentrification of a technology.

Of course, it’s not so much that Apple is trying to gentrify this domain. Solving the challenges of this technology has been expensive, and the devices we’d be happy with would inevitably be expensive, at least at first. I just hope XR doesn’t remain a vanity project for Apple with usefulness based on deep pockets and superficial ideas of what we need to lead meaningful lives.

Update: I’ve been on Apple’s Vision Pro sales page to see what options are available for the Vision Pro. To my surprise, the first step you’re compelled to complete is a scan of your head, which Apple needs to determine the correct Light Seal and head band sizes. You’ll need an iPhone or iPad with Face ID to find the right size. If you’re on a desktop computer, you’ll also scan a circular Apple code on the screen that links their site to your captured head dimensions.

After you look left, right, up and down, twice, your dimensions are submitted to Apple. The next step is to select options for your vision: whether you have a prescription, contacts, or readers. You won’t need precise prescription information because the inserts are generalized and accommodate most prescriptions. The optical inserts run between $99 and $149.

After the sizing procedure, you can select a storage size, from 256GB to 1TB. The 1TB option is $3,899.

My final point is this: If there was any doubt that this is a vanity device, the custom fit and optical inserts tell you that each Vision Pro is tailored for just one person. Since the optical inserts attach magnetically, you could swap them out with another user, but how practical is that, really? And what about the Light Seal and head bands, also sized to fit?

Of course, the real measurements that count are product sales and paradigm adoption rates.

Do you want to buy an HP Reverb G2 VR headset? I’m also giving away three free VR headsets.

Hey there, Hypergrid Business readers. It’s the new year, and I’m moving my office and cleaning up, and have a few VR headsets sitting around that I’d like to get rid of.

They work, are hardly used, and one is even in its original — UNOPENED — box.

If you’re in the western Massachusetts area, and want to meet up, I can give you a free VR headset. Or if you’re anywhere in the world, and can pay for shipping, you can buy the brand-new one.

Here’s the one I’m selling

HP Reverb G2 VR headset

I don’t have my own picture of the headset itself because I haven’t opened the box. Yup, I bought it a year ago and never even opened it. It’s been sitting on a shelf in my office, and I realized that if I haven’t opened it yet, I’m never going to.

It runs for $599 on the HP website, currently on sale for $469, but it’s out of stock as I write this. I’m selling it for $400.

The box is unopened, so I can’t say for certain what’s inside, but I bought it directly from HP and I’m reasonably sure that they put in everything it’s supposed to have.

Here’s the official picture of the headset itself:

HP Reverb G2

It’s a fancy, high-end headset that comes with two controllers, offers six degrees of freedom, and is compatible with SteamVR and Windows Mixed Reality. It plugs into your computer, so there’s a cable attached to your head while you use it. So, unless you’ve got one of those computers that fits in a backpack, you’d probably be using this headset sitting down, or, at least, standing in one place close to your PC.

Here’s a picture of some guy using it, with the chair positioned just right so you can’t see the cable running from his head to the laptop:

HP Reverb G2 VR headset. (Image courtesy HP.)

Are you interested? Email me at maria@hypergridbusiness.com. I’m charging $400 plus shipping, so if you’re not too far away, it might be a good deal.

If nobody here is interested, I’ll put it up on eBay.

And here are the three free VR headsets I’m giving away:

HTC Vive

HTC Vive

Comes with a couple of controllers plus a faceguard thing. It’s an all-in-one headset that you recharge with a USB cord. I think it’s the HTC Vive Focus Plus. It’s currently $449 on the official website, down from a regular price of $629. I’ve opened it and played with it, and no longer have the original packaging, so I’m just giving it away.

You don’t need a phone or a PC to use it, so it’s completely wireless. You do need a WiFi connection, though, to download apps and stuff.

If you’re around Western Massachusetts, we can meet up in some local coffee shop, and you can just have it. Or you can pay for shipping and I can box it up and send it to you. But, like I said, I don’t have the original packaging so I’ll have to bubble wrap it.

Google Daydream View

Google Daydream

This is one of those headsets that you put a phone into. It’s the Google Daydream View and Google has stopped supporting it, but there are still Daydream-compatible apps up in the app store.

The controller has a little hidey-place inside the headset:

Daydream View headset from Google.

That’s also how you put your phone in it. For a list of compatible phones, see this official list from Google.

It can also run regular Google Cardboard apps, but then the controller won’t work.

Generic Google “Cardboard” headset

Some off-brand Google Cardboard-compatible headset.

 

This is one of those cheap generic $10 headsets you can buy at Walmart that you put your phone into. It can run any Cardboard-compatible app.

I use it with my Android phone, but there’s even support for iPhones. There’s no controller with Cardboard, and no six degrees of freedom. You can turn your head, but you can’t move it side to side or forward and backward, so if you’re not careful with how you use it, you can get dizzy quite easily. But you can use it to watch YouTube’s 360-degree videos in VR, and there’s a bunch of roller-coaster-type rides, some simple games, and, of course, porn.

If nobody here wants any of these free ones, I’ll give them away on Nextdoor or Craigslist, but I figured I’d give you guys first crack at them.

I’ll be presenting at OSCC tomorrow

Checking out the podium at this year’s conference. (Image by Maria Korolov.)

I’m giving two presentations tomorrow at the OpenSim Community Conference.

First, I’m giving my usual state of the hypergrid talk at 3:30 p.m. Pacific time. I’ll be doing a roundup of this year’s top news and OpenSim statistics.

Then, at 4 p.m. Pacific, I’ll be talking about how generative AI will change content creation and coding.

I have been covering AI quite a bit lately, especially for CIO magazine. You can see all my latest AI articles here. As part of that, I’ve been talking to CEOs, CIOs and other senior executives at companies around the world, as well as leading experts on AI and the vendors building the technology. It doesn’t hurt that I have a degree in mathematics and can read the research papers. My own undergraduate research, funded by the NSF, was about a dynamical systems approach to differential equations. If you want more AI, and want to see me in the physical world, I’ll be the keynote speaker at the 2024 Data and AI Summit in March.

About the conference

It’s the eleventh annual OpenSimulator Community Conference, celebrating the community and development of the OpenSimulator open source software. It will feature over 70 speakers leading presentations, workshops, panel sessions, and social events across the diversity of the OpenSimulator user base.

This year’s conference kicked off yesterday with networking events, and today there will be art tours and music performances. The conference then features two days of dynamic presentations on Saturday and Sunday, including a hypergrid shopping tour and a closing-night party on Sunday. There will also be more community events and tours following the conference weekend.

Register for the conference here. See the full schedule here.

Attending the conference is free, but those wishing to financially support the conference can still sponsor or participate in its crowdfunding campaign when registering. Participants in the crowdfunding campaign will receive a variety of thank-you gifts depending upon their level of participation, including conference VIP seating and the ability to have a virtual expo booth at the event. Conference sponsorships and crowdfunder contributions are tax-deductible to the extent allowable by law for US residents.

You can also choose to register to have an avatar account created for you locally on the OSCC conference grid server or hypergrid to OSCC via your home grid avatar.

Apple Takes on Meta in Race to Make VR Mainstream

Meta has not seen the results it hoped for from its investments in the VR space, with an operating loss of $31 billion. However, Apple’s announcement of the Vision Pro last June is set to reinvigorate the marketplace.

Rolf Illenberger

“I and the entire industry was waiting for Apple for a decade,” said Rolf Illenberger, CEO of VRDirect, which provides software that companies use to build their own VR projects. Its clients include Porsche and T-Mobile.

The Apple Vision Pro is branded by Apple as a spatial computer, which allows users to clearly see their surroundings while apps are projected as if they were appearing within the physical space.

Several companies are currently involved in the virtual reality and augmented reality space, including “Meta, Apple, HTC . . . [and] Lenovo,” Illenberger told Hypergrid Business.

Finance website Insider Monkey ranks Apple at number one on its list of the largest VR/AR companies, with a market cap of $2.8 trillion — though Apple doesn’t actually have an AR or VR product on the market yet. It ranks Meta at number six, with a market cap of $787 billion.

With Apple set to release the Vision Pro, VR is about to go mainstream, according to Illenberger.

“So I think now that Apple has joined the group of companies pushing this technology,” he continued, “it’s obvious that this is the next big thing. And it’s also obvious that these companies will, you know, continue investing billions in this technology, not only in the technology but also in the kind of adoption of this technology in the market out there.”

One major hurdle that the Apple Vision Pro will face is its price tag of $3,500, which may turn off many consumers.

“You have to see that the whole, let’s say, metaverse, technologies, VR and AR, we’re still that’s still technologies, very infant technologies,” said Illenberger on the accessibility of VR technology. “I would even argue they’re not yet in a state that it’s a mass market b2c thing at this point in time, you know, look at the look at the Apple headset and the price point, but also looking at the other available VR headsets. We’re not talking about devices that are tailored towards a mass market audience at this point in time.”

In a separate interview with Laptop, Illenberger highlighted that the initial goal of the first Vision Pro model is not to sell units but rather to create buzz for the product.

“A fair comparison might be HDTV, in say, 2006 or 2007. The motivation to announce Vision Pro now was to stimulate and nurture the ecosystem of app developers and content creators to invest in the new device, which was already happening once rumors about the device started to emerge several months ago. Hence, Vision Pro is already a great success for Apple.”

Meta Quest 3. (Image courtesy Meta.)

Meta officially launched the Meta Quest 3 during its Meta Connect event last month. Meta is also experimenting with Flamera, a VR headset that utilizes a new passthrough technology that is supposed to eliminate external feed distortion and artifacts.

Army orders more AR goggles post-pukegate

(Image courtesy Microsoft.)

Remember those mixed reality combat goggles that Microsoft was building for the Army that made soldiers nauseous? (See our previous story here.) Well, they’re back.

Microsoft got its hands slapped and had to go back to the drawing board after reports came out last year that its new AR goggles were making soldiers literally sick to their stomachs. But it seems they’ve fixed the issues, and are ready for round two.

The Army just placed a new order for the Integrated Visual Augmentation System or IVAS goggles, Bloomberg reports.

Microsoft sent the Army some new prototype headsets this summer. The company apparently fixed the issues that caused headaches, nausea and pain.

An Army spokesperson said the new headsets showed “improvements in reliability, low light sensor performance, and form factor.”

I’m sure Microsoft was sweating bullets about this contract, since its consumer AR efforts seem to be dying on the vine. Apple has been grabbing all the hype with its upcoming mixed reality headset — but at $3,500 a pop, I don’t know how much traction those headsets are going to get, either. Which just leaves Meta and the Quest 2.

Meanwhile, Microsoft laid off a bunch of the HoloLens team earlier this year.

The next steps for IVAS include adding in cloud computing, the Army Times reports. This will let soldiers download apps for specific mission needs.

The Army wants to avoid overloading IVAS by offloading apps to the cloud instead of the device. During testing, soldiers used the goggles for assault planning, mission practice, targeting, and more.

IVAS lets them ditch the sand table to quickly scout and rehearse missions virtually.

“Rather than an MRE-box sand table, a unit could virtually ‘see’ the terrain in their heads-up display and rehearse a mission in their patrol base before leaving the wire,” Brig. Gen. Christopher Schneider told Army Times.

“Now we have to make this system producible and affordable,” he added.

Earlier issues around night vision, size, and weight are getting fixed bit by bit. The goal is to nail down cost and manufacturing in 18 months.

If all goes well, IVAS could start hitting units by 2025. Of course, that’s assuming the cloud tech actually works as advertised. And that Congress keeps funding the project.

How it started

Microsoft started working with the Army in 2018 on mixed reality headsets using its HoloLens tech. The goal was to help soldiers train, plan missions and operate better in the field, the company said in a long article about the project two years ago.

IVAS has night vision, heat sensors, 3D mapping and other HoloLens features. It’s meant to give soldiers more awareness by layering digital info onto the real world, the company said.

To get input, Microsoft engineers did mock bootcamps in 2019, where they learned skills like navigating at night. This helped them design IVAS to handle tough conditions soldiers face.

After soldiers tested IVAS for around 80,000 hours by early 2021, Microsoft had a headset ready for combat use.

I’m not sure why they missed the whole nausea thing the first time around. Maybe the engineers had been using the headset so much themselves during the development process that they were used to it? Or maybe it was so much better than the early iterations that the nausea didn’t even register as a problem anymore?

New data shows VR interest continues to fall

(Image by Maria Korolov via Midjourney.)

New research reports and surveys released this month show that interest in virtual and augmented reality is continuing to drop.

According to an EY Consulting survey released earlier this month, only 24 percent of people said their company has started using VR and AR technologies, putting it in last place among all technologies people were asked about. Those other technologies included cloud and edge computing, IoT, digital twins, quantum computing, biometrics, blockchain, and generative AI. And that’s with quantum computers not even available on the market yet and generative AI only really becoming accessible to the world late last year.

Similarly, even among people who were familiar with VR technology, only 15 percent said that they wear a VR headset at work, only 17 percent said they attend meetings in the metaverse, and only 18 percent use VR for onboarding or training. The survey didn’t ask if they did those things on a regular basis, or tried them once and stopped.

According to a report by research firm IDC, global shipments of AR and VR headsets dropped sharply this year — a decline of 54 percent compared to the same time in 2022.

The only growth was in augmented reality displays — the kind with transparent lenses, where you can see the real world with a holographic overlay on top of it. The top example of this is the Air AR glasses from Xreal. They can project a TV screen or computer monitor into the air in front of you, and they look just like sunglasses. You connect them via a USB-C port to your smartphone or tablet, or to a PC or gaming console. You can even order them with prescription lenses. And they cost just $379 — still a little on the high side, but about as much as you’d pay for a second monitor, and only about a tenth of the price of Apple’s yet-to-be-released Vision Pro headset.

Xreal Air AR glasses. (Image courtesy Xreal.)

Right now they only come in black and look a little clunky. But, in general, this is what I expected the Apple headset to be — a replacement display for the iPhone screen that was as easy to use as a pair of sunglasses. Add a few design options, a Bluetooth connection, and a case that doubles as a charger and I’m sold — especially if the resolution of the display is good enough to read text.

Oh, and I want the kind of lenses that automatically turn lighter or darker when you want them to. If they could replace my regular glasses, I could just wear them all the time.

Some companies are continuing to invest in AR and VR. J. Crew, for example, launched a virtual store earlier this month.

J. Crew virtual store.

When you go inside the house, it’s basically a typical real estate or museum tour — click on the arrows to teleport around, then look in various directions. For the most part, it’s walls with pictures of items from the J. Crew catalog. A bit less pleasant to experience than the paper version, and a lot less convenient than its regular online shopping experience.

Frankly, it reminded me a bit of in-world stores in Second Life and OpenSim.

And I’m not the only one who wasn’t impressed.

According to a YouGov survey, 45 percent of J. Crew customers don’t see any practical applications for augmented or virtual reality.

The survey also shows that 67 percent of the retailer’s customers think that augmented or virtual reality allows people to experience products and services before they buy them. Since that statement is technically true — augmented and virtual reality does allow that, for a certain definition of the word “experience” — I’m surprised that the answer wasn’t 100 percent.

The thing is, many retailers are already adding virtual models to their websites, so you can see what the clothes might look like on you, or on a model that’s shaped like you. No AR or VR required.

In other words, AR and VR have all the inconveniences of physical stores — limited selection, hard to find what you’re looking for — with none of the benefits like, say, being able to feel the fabric, checking that the shoes or clothes don’t pinch or itch, or buying a Cinnabon in the food court after you’ve finished shopping as a reward for surviving the ordeal.

I was very disappointed in Apple this week

In April, I wrote that I had high expectations for Apple’s new augmented reality headset — and that I was looking forward to switching back to the iPhone if it was what I hoped for.

I was very much disappointed by the actual announcement on Monday.

You can watch the headset part of the presentation below:

The price tag. OMG, the price tag

Really? $3,500? Really? I’m a writer, so I’ve owned cars that cost less than that.

If we assume that prices will drop in half every year, it will take four years to get down to what I would consider a reasonable price — around $200. Unless, of course, it can work as a phone replacement, in which case, it might be down to a reasonable $875 in two years.

Not available until next year

Did I say two years? I meant three years — because this thing won’t be available until 2024.

Which means if it’s an add-on, and not a phone replacement, it will take five years to get down to a reasonable $200.
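Here’s a tiny sketch of that back-of-the-envelope math, using nothing but the halving assumption and the prices mentioned in this article (the function name and numbers are illustrative, not a forecast):

```swift
import Foundation

// Back-of-the-envelope sketch: assume the retail price halves once a year.
// The $3,500 starting point and the target prices are just this article's numbers.
func price(afterHalvings n: Int, startingAt start: Double) -> Double {
    start / pow(2.0, Double(n))
}

let launchPrice = 3_500.0
print(price(afterHalvings: 2, startingAt: launchPrice)) // $875: the phone-replacement threshold
print(price(afterHalvings: 4, startingAt: launchPrice)) // about $219: roughly the "around $200" add-on price
print(price(afterHalvings: 5, startingAt: launchPrice)) // about $109: with an extra year for the 2024 launch delay
```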

The size. Look at the size.

This thing is huge. I want my augmented reality headset to be a pair of sunglasses. Especially since this thing is connected by a cord to a battery pack you wear. If it’s connected by a cord to something anyway, might as well connect it to a phone — or a pack that has the processor in it, so that the headset itself can be a lot smaller.

Also, this feels like it’s meant to be a work productivity tool. That means that the cord could be connecting it to a computer. Again, you can move the chips out of the headset itself and make it lighter.

The creepy eye display.

When you look at someone wearing one of these headsets, you’ll be able to see their eyes and where they are looking. Not because the display is transparent — but because the entire outside surface is a display screen that shows you a video of the person’s eyes.

It is super creepy. To me, at least.

(Image courtesy Apple.)

Also, it seems like such a waste of processing and display just to show a pair of eyes. If you want to have a headset where you can see the user’s eyes, just have a clear-glass headset where you can see the user’s eyes.

The pass-through camera.

And clear glass goes the other way, too.

As far as I’m concerned, in order to use an augmented reality headset for anything, you need to have clear lenses. That way, you can see your surroundings, with the augmented reality stuff as an overlay on top.

This is what I thought the Apple headset was going to look like:

(Image by Maria Korolov via Midjourney.)

Plus, when a clear-glass headset is turned off, you can still use it as a regular pair of glasses or sunglasses. I currently wear glasses. Being able to have them do double duty as a phone display screen would be excellent.

Instead, Apple decided to make the headset opaque, and to use cameras to try to trick you into thinking that you’re seeing out of them to the room around you.

Now, according to Mike Rockwell, head of Apple’s AR/VR team, their headset chips are so good that it “virtually eliminates lag.”

It’s that “virtually” that gets to me. Virtually? So there will be lag between what happens around me, and what I’m seeing in the headset? That’s the kind of fake augmented reality I hate in the headsets I already own, like Meta’s Quest.

I really wanted to have transparent lenses — like the old Google Glass headset, but better looking, and with better functionality and usability.

Google Glass. (Image by Mikepanhu via Wikimedia Commons.)

 

Now, maybe Apple will be able to fully eliminate lag by the time people start to actually use the headset three to five years from now, but, today, I’m disappointed.

Better yet, maybe by the time I actually buy this thing, they will have figured out a way to use clear glass. The technology is there already — glass that can programmatically go from opaque to transparent and show projected images on it.

Or they’ll have made the cameras so responsive that any lag is completely eliminated, not just “virtually.”

Not a phone replacement

Because this is a bulky headset with short battery life and a cord — and not a pair of sunglasses you can whip on and off — this is not a headset that is going to replace your phone screen.

And if it doesn’t replace my phone, then I’m not going to be using it all the time. Which means I’ll be using it the way I use my VR headsets today — infrequently. And when I use things infrequently, I forget how they work between sessions.

This means that I’m reluctant to use the headset instead of, say, just a regular Zoom meeting. Which means I use it even less often, until, eventually, it just sits there gathering dust on my shelf. I’m not about to pay $3,500 for a paperweight.

I’ve come around to AR. OpenSim might not be the way to get there — but Apple might be

I used to think that the path to the metaverse started with screen-based virtual worlds then expanded to virtual reality. At some point, I thought, we’d all be doing everything in the metaverse. The same way that the Internet made information instantly accessible to everyone everywhere, the metaverse would do the same with experiences and human interactions.

I spent over a decade trying to make it happen for myself and my team. We had an in-world office. I started a group for hypergrid entrepreneurs that met in OpenSim. I am on the OpenSim Community Conference organizing team, and our early meetings are in OpenSim. I even figured out a way to get my desktop and many of my apps into OpenSim, so that I could work in my virtual office.

Spoiler: I did not, in fact, ever do any significant amount of work in my virtual office.

Here I am at my desk in my old virtual office.

I still think it’s possible. Well, theoretically possible, at least.

Eventually. But not in the immediate future, and not with the technology we have today.

First, until the resolution of a virtual world is as good as real life, there will be an advantage to working the old-fashioned way, especially when you’re in a graphics-heavy profession. I’m not an artist, but I do create graphics to go with blog articles and social media posts. And I’m the one responsible for web design for several outlets, including Hypergrid Business, MetaStellar, Writer vs AI, and Women in VR. That’s hard to do on a screen in a virtual world.

And don’t even get me started on trying to work in virtual reality. Even typing is hard if you can’t see the keyboard. I touch type, but sometimes I have to type special characters. I never remember where any of them are. In addition, I multi-task. I have several windows open at once and am cutting-and-pasting between them, looking things up, using calculators and other tools, and, of course, checking my phone. I can’t do most of that in virtual reality, even with a pass-through camera. And if I’m just going to be sitting at my computer, typing, why am I in a virtual reality headset, anyway?

But AR — augmented reality — well, that’s something entirely different. Instead of replacing the entire world around you with a virtual one, augmented reality just adds a little bit of the virtual to the actual world around you. Instead of looking out at the world through a distorting pass-through camera, you see the world as it is.

What this means is that, instead of a Zoom call, I can see the person I’m talking to sitting in front of me as basically a hologram. Well, they’d be projected on my glasses, but to me it would look as if they were actually there. Instead of a screen full of little Zoom faces, I could see people sitting around a conference table. I’d have to rearrange my home office so that my layout would work for this, but I had to rearrange my office anyway, so that it would look good on Zoom.

The thing is, we’re probably going to get to AR through our phones. Instead of wearing a smart watch, we’d wear smart glasses and just keep our phones in our pockets. Until the phones got so small, of course, that they’d fit completely inside the glasses frames.

The home screen of my phone is currently — blessedly! — ad-free, so I don’t expect to see pop-up ads just showing up willy-nilly in augmented reality, either. If they did, nobody would use the platform. Instead, we’d probably see ads the same places we currently see them — when we play free games and scroll through news feeds.

I can see some very interesting things happening when we get AR glasses. We’d use virtual keyboards instead of physical ones, and probably dictate quite a bit more, too.

I do like the physical feel of a tactile keyboard, but we already have Bluetooth-enabled keyboards that sync to our mobile devices, so I can easily see continuing to use one, if I prefer.

But I can also see myself dictating more.

Speech recognition is getting more accurate all the time. In fact, I’m already dictating most of my text messages because it’s so convenient. And instead of physical screens, we’d get virtual screens that float at arm’s length in front of us, and we could position them where we want them, and have as many screens as we want, of any size. I only have a couple of apps I use that don’t run on a phone — GIMP and FileMaker. Everything else I do, including word processing, is browser-based, so I can already do it on a phone.

An AR phone will make my desktop PC, monitors and keyboards and backup laptop obsolete. Well, I’d still keep my laptop, just in case, but my other hardware will go the way of all the other devices that smartphones relegated to the trash heap of history. And, also, to literal trash heaps. And Windows. I hate Windows, and will be happy to never use it again.

Since these AR smart glasses will be so convenient, everyone will be using them for everything. We’ll be living in a world that has a continuous virtual overlay on it, a magical plane that gives us superpowers.

(Image by Maria Korolov via Midjourney.)

Oh, and our AI-powered virtual assistants who are as smart as we are, or even smarter, will live inside this virtual overlay.

All the pieces are already there — including the intelligent AI. All it will take is for someone to put them together into an actually useful device.

I’m guessing that this will be Apple. When it happens, I’ll be switching back from Android the first chance I get. I originally had an iPhone, but switched to Samsung when Gear VR came out because Apple didn’t support VR. Then I switched to the Pixel because I hated Samsung so much, and because I liked Google’s Daydream VR platform.

Both Gear VR and Daydream are now gone, though Google Cardboard remains. I still see between 3,000 and 4,000 pageviews a month on my Google Cardboard headset QR Codes page. These are the codes that people use to calibrate their Google Cardboard-compatible headsets. The headsets are ridiculously bad, and have limited motion tracking, but as phone screens have gotten better, the image quality has become pretty good — good enough to watch movies on a virtual screen, and, of course, for VR porn. Ya gotta admit, porn does drive technology adoption. I’ve heard.

But the phone-screen-based approach seems to be hitting a dead end, since few people want a huge phone screen strapped to the front of their face.

I’ve been waiting for years for Apple to do something in this space.

This might now be happening.

Here’s a quote from Apple CEO Tim Cook, in a recent interview with GQ:

“If you think about the technology itself with augmented reality, just to take one side of the AR/VR piece, the idea that you could overlay the physical world with things from the digital world could greatly enhance people’s communication, people’s connection,” Cook says. “It could empower people to achieve things they couldn’t achieve before. We might be able to collaborate on something much easier if we were sitting here brainstorming about it and all of a sudden we could pull up something digitally and both see it and begin to collaborate on it and create with it. And so it’s the idea that there is this environment that may be even better than just the real world—to overlay the virtual world on top of it might be an even better world. And so this is exciting. If it could accelerate creativity, if it could just help you do things that you do all day long and you didn’t really think about doing them in a different way.”

He didn’t confirm or deny the release of an Apple AR headset.

But, yesterday, Bloomberg reported that Apple is getting ready to unveil its augmented reality headset this June, and is already working on dedicated apps, including sports, gaming, wellness, and collaboration.

Here’s what an AI thinks that the new Apple headset might look like:

(Image by Maria Korolov via Midjourney.)

I do love my Pixel, and I’d have to replace all my Android apps with new iPhone ones if I switched, but if we’re about to hit an iPhone moment with augmented reality, I want to be first in line.

Teens slow to adopt VR and more bad news for the metaverse

(Image by Maria Korolov via Midjourney.)

Virtual reality hasn’t caught on with American teens, according to a new survey from Piper Sandler released on Tuesday.

While 29 percent of teens polled owned a VR device — versus 87 percent who own iPhones — only 4 percent of headset owners used it daily, and just 14 percent used it weekly.

Teenagers didn’t seem that interested in buying forthcoming VR headsets, either. Only 7 percent said they planned to purchase a headset, while 52 percent were unsure or uninterested.

That’s not the only bad news for VR that’s come out recently.

Bloomberg has reported that Sony’s new PlayStation VR2 headset is projected to have sold 270,000 units by the end of March, based on data from IDC. Sony had originally planned to sell 2 million units in the same time period, Bloomberg reported last fall.

In fact, VR headset numbers in general are down.

According to IDC, headset shipments declined 21 percent last year to 8.8 million units.

“This survey only further exemplifies that the current state of VR is very business-focused,” said Rolf Illenberger, managing director of VRdirect, a company that provides enterprise software solutions for the metaverse and virtual reality.

“The pandemic further accelerated progress for VR and AR usability in the office, while the release of new devices will mean more for developers building practical use cases than they will for teenagers seeking entertainment,” he told Hypergrid Business.

But that might be wishful thinking.

According to IDC, both consumer and enterprise interest in virtual reality fell last year.

Earlier this year, I wrote about how Microsoft and other companies have pulled back on their VR and AR plans. And the bad news has continued to come in.

In mid-March, Google announced the end of Google Glass Enterprise. And, last week, the Wall Street Journal reported that Disney shut down its metaverse team, and the Truth in Advertising nonprofit advocacy group reported that Walmart had shut down its Roblox virtual experience.

Even Meta’s Mark Zuckerberg seems to have soured on the metaverse. In his March letter announcing a “year of efficiency” and layoffs of 10,000 people, Zuckerberg said that the company was now going to focus on AI.

“Our single largest investment is in advancing AI and building it into every one of our products,” he wrote. So much for the metaverse being Meta’s biggest investment. In 2021 and 2022, Reality Labs — its metaverse division — reported a total loss of nearly $24 billion.

Given the explosion of interest in AI since ChatGPT was released late last year, and its clear and obvious business benefits, I have serious doubts that anyone is going to be investing much in enterprise VR this year.

After all, generative AI is clearly poised to solve a multitude of business challenges, starting with improved efficiencies in marketing, customer service, and software development. And virtual reality continues to be a technology in search of a problem to solve.

I’m a huge, huge fan of OpenSim. But, other than giving a presentation at the OpenSim Community Conference in December, I can’t remember the last time I went in-world for a meeting. It’s all Zoom, Zoom, Zoom, and occasionally Microsoft Teams.

Oh, and here’s another downer. I watched the Game Developers Conference presentations from Nvidia, Unreal Engine, and Unity. I don’t play video games much, other than on my phone, so I hadn’t noticed just how amazing graphics, environments and characters have become. I originally watched for the AI announcements, which were insane, but the realism of the visuals just blew me away. I’m feeling the urge to run out and buy a gaming console.

(Image courtesy Unreal Engine.)

Now, general purpose platforms like OpenSim don’t have to have the same level of graphics to be successful. The early web, for example, had very poor graphics compared to what was available from commercial add-ons like Flash. And look at Minecraft — you can’t get any worse than that, graphics-wise.

So while the graphics were awesome, that’s not why I was most concerned. No, I was looking at the AI-powered environment generation features. And it’s not just Unreal and Unity. There are a bunch of AI-powered startups out there making it super easy to create immersive environments, interactive characters, and everything else needed to populate a virtual world.

With the basic Unreal and Unity plans available for free, is it even worth it for developers to try to add these AI features to OpenSim? It might feel like putting a jet engine on a horse-drawn buggy. I mean, you could try, but the buggy would probably explode into splinters the minute you turned it on.

Am I wrong?

Will we be able to step into OpenSim and say, “I want a forest over there,” and see a forest spring up in front of us? Will we be able to have AI-powered NPCs we can talk to in real time? And will we be able to create interactive, actually playable in-world experiences beyond just dance-and-chat and slot machines?

There’s good news, though.

AI tools are helping to accelerate everything, including software development and documentation. With the big players pulling back from enterprise VR, this gives an opportunity for open source platforms like OpenSim to use those tools, grab this window of opportunity, and catch up. Maybe even take the lead in the future hyperlinked, open source, interconnected metaverse.