NFT crash wipes out nearly all collections, and the metaverse won’t save them

Well, the NFT craze certainly crashed and burned quickly.

According to a new report by DappGambl, 95 percent of NFT collections now have zero value. That’s not a typo. Zero.

Out of over 73,000 collections, only around 3,500 still have any market cap at all. And only a tiny fraction of those are worth more than a few bucks.

(Image courtesy DappGambl.)

Remember when NFTs of random pixel art were selling for millions? When Jack Dorsey sold his first tweet for $2.9 million? When Melania Trump thought people would pay real money for NFTs of her eyes? When Alex Pomposelli thought people would pay for a random in-world screenshot?

Yeah, those days are long gone.

Not that, for some people, they were ever there. Of the 22 NFTs in Pomposelli’s AviTron A Metaverse World collection, none have seen any recorded purchases.

One of the AviTron NFTs, “A gentleman in a metaverse.” (Image courtesy AviTron.)

But he’s not alone.

According to the new report, 79 percent of NFT collections had no sales.

But people still spent a fortune minting them, wasting enough electricity to power 2,048 homes for a year, the Guardian calculated, and emitting as much carbon as 4,061 round-trip flights from London to New Zealand. For nothing.

I love how the report points out that a lot of these NFT collections still have insanely high list prices that are completely detached from reality. As if anyone would actually pay over $13 million for some random NFT named MacContract that has only ever sold for $18.

Weekly NFT trading volume. (Data courtesy Cryptoslam via The Block.com.)

Will NFT values ever recover? Experts say they’d need historical significance like Pokémon cards, artistic merit as true artworks, or real utility.

So no, those randomly-generated cartoon apes aren’t coming back.

And neither are random snapshots of OpenSim builds.

As for utility, some say NFTs can represent virtual land or items in the metaverse. But as we’ve seen with falling VR headset sales, the metaverse hype has faded.

Plus, while NFTs can REPRESENT virtual land or items, they are not, themselves, that land or those items. They are pictures of that land or items. Pictures that you can go and take for free, or grab copies of from the NFT websites themselves. That NFT above, “A gentleman in a metaverse,” does not give you actual ownership of that avatar, his clothes, or the landscape behind him.

If you want to buy land in the metaverse, you can … actually, you can’t. Anybody who’s “selling” metaverse land is not actually selling anything, just renting you some server space. You don’t really get to own that server.

If you want to own a server, you can go to the server store, buy a server, buy the rights to the virtual content from its original creator or copyright holder, and put it on the server. That is the only way, legally, to actually own virtual land. Then you’ll have actual ownership. And only then. If you do this, I recommend using the DreamGrid version of OpenSim, which, as a bonus, happens to be free.

What people are actually “selling” are bragging rights. Like those companies that claim to name a star after you or rent you a square inch of space in Scotland. You don’t actually get a star named after you. You don’t actually buy any Scottish land. You just get a piece of paper with no legal significance. And, with NFTs, you don’t even get a piece of paper.

So, for example, the guy who paid $3.4 million for this picture at a Sotheby’s auction:

By the way, as of this writing, that NFT is listed at $37,532 on OpenSea. And you can just right-click or screenshot, and you’ve got your own copy. I just saved you $37,532. You’re welcome.

The NFTs were just another way for people to gamble online. For a great look at who profited from all these scams, check out the book Easy Money: Cryptocurrency, Casino Capitalism, and the Golden Age of Fraud by Ben McKenzie. I just finished reading it, and it’s a fun read.

And, for future reference, if you want to buy something that you can’t yourself use, then don’t spend more money than you can afford to lose.

Despite his own data, Vlad Hategan, NFT Gaming Specialist at DappGambl, says that NFTs still have a place in the future.

“At DappGambl, we still maintain that once the dust has settled, we will start to see an evolution within NFTs,” he wrote, but added, “To weather market downturns and have lasting value, NFTs need to either be historically relevant (akin to first-edition Pokémon cards), true art, or provide genuine utility.”

I’m not sure how any of those things can be true. First of all, the NFT platforms themselves will probably go away, since it costs money to keep the servers going, and the amount of money the platforms are collecting from sales is steadily falling. At that point, the NFTs will no longer even exist, much less exist with any historical relevance. Though we’ll always have the news articles for historians to look at.

As far as “true art” is concerned, I can see them being stunt art, like that Banksy painting that destroyed itself after it was bought. But there’s a limited number of buyers for self-destructing art. And, with the Banksy painting, it did have resale value — there was a piece of the painting left, plus the frame, and the shreds. With an NFT, there is literally nothing left other than the memory of its existence — and the news articles about the initial sale. Is there some value in being known as a giant idiot who blew a bunch of money on nothing? I guess… but I’m guessing that these idiots will move on to something else that they can waste money on, instead. Something fresh and hot and new. After all, there might be news value in being the first giant idiot, but nobody pays any attention to the thousandth giant idiot or the ten-thousandth idiot.

And genuine utility? Nobody’s come up with any uses for these things yet. They’re inefficient, ridiculously wasteful of resources, and astoundingly insecure. If someone steals your virtual wallet, there’s no FDIC insurance to get your money back.

I, personally, am glad that NFTs are dying a quick death. But I am a little worried about what’s going to come next.

I was very disappointed in Apple this week

In April, I wrote that I had high expectations for Apple’s new augmented reality headset — and that I was looking forward to switching back to the iPhone if it was what I hoped for.

I was very much disappointed by the actual announcement on Monday.

You can watch the headset part of the presentation below:

The price tag. OMG, the price tag

Really? $3,500? Really? I’m a writer, so I’ve owned cars that cost less than that.

If we assume that prices will drop by half every year, it will take four years to get down to what I would consider a reasonable price — around $200. Unless, of course, it can work as a phone replacement, in which case, it might be down to a reasonable $875 in two years.
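That back-of-the-envelope projection is just compound halving. Here’s a quick sketch in Python — the halving rate is my assumption, not a forecast, and the function name is mine:

```python
def projected_price(initial, years, yearly_drop=0.5):
    """Project a price that falls by a fixed fraction each year.

    This is a rough rule of thumb for gadget prices, not a market
    forecast -- the yearly_drop of 50% is purely an assumption.
    """
    return initial * ((1 - yearly_drop) ** years)


# $3,500 headset, halving every year:
print(projected_price(3500, 2))  # 875.0  -- "reasonable" as a phone replacement
print(projected_price(3500, 4))  # 218.75 -- close to the $200 target
```

Change `yearly_drop` to see how sensitive the timeline is: at a 30 percent yearly drop instead, four years only gets you down to about $840.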

Not available until next year

Did I say two years? I meant three years — because this thing won’t be available until 2024.

Which means if it’s an add-on, and not a phone replacement, it will take five years to get down to a reasonable $200.

The size. Look at the size.

This thing is huge. I want my augmented reality headset to be a pair of sunglasses. Especially since this thing is connected by a cord to a battery pack you wear. If it’s connected by a cord to something anyway, might as well connect it to a phone — or a pack that has the processor in it, so that the headset itself can be a lot smaller.

Also, this feels like it’s meant to be a work productivity tool. That means that the cord could be connecting it to a computer. Again, you can move the chips out of the headset itself and make it lighter.

The creepy eye display.

When you look at someone wearing one of these headsets, you’ll be able to see their eyes and where they are looking. Not because the display is transparent — but because the entire outside surface is a display screen that shows you a video of the person’s eyes.

It is super creepy. To me, at least.

(Image courtesy Apple.)

Also, it seems like such a waste of processing and display just to show a pair of eyes. If you want to have a headset where you can see the user’s eyes, just have a clear-glass headset where you can see the user’s eyes.

The pass-through camera.

And clear glass goes the other way, too.

As far as I’m concerned, in order to use an augmented reality headset for anything, you need to have clear lenses. That way, you can see your surroundings, with the augmented reality stuff as an overlay on top.

This is what I thought the Apple headset was going to look like:

(Image by Maria Korolov via Midjourney.)

Plus, when a clear-glass headset is turned off, you can still use it as a regular pair of glasses or sunglasses. I currently wear glasses. Being able to have them do double duty as a phone display screen would be excellent.

Instead, Apple decided to make the headset opaque, and to use cameras to try to trick you into thinking that you’re seeing out of them to the room around you.

Now, according to Mike Rockwell, head of Apple’s AR/VR team, their headset chips are so good that it “virtually eliminates lag.”

It’s that “virtually” that gets to me. Virtually? So there will be lag between what happens around me, and what I’m seeing in the headset? That’s the kind of fake augmented reality I hate in the headsets I already own, like Meta’s Quest.

I really wanted to have transparent lenses — like the old Google Glass headset, but better looking, and with better functionality and usability.

Google Glass. (Image by Mikepanhu via Wikimedia Commons.)

Now, maybe Apple will be able to fully eliminate lag by the time people start to actually use the headset three to five years from now, but, today, I’m disappointed.

Even better, maybe by the time I actually buy this thing, they will have figured out a way to use clear glass. The technology is there already — glass that can programmatically go from opaque to transparent and show projected images on it.

Or they’ll have made the cameras so responsive that any lag is completely eliminated, not just “virtually.”

Not a phone replacement

Because this is a bulky headset with short battery life and a cord — and not a pair of sunglasses you can whip on and off — this is not a headset that is going to replace your phone screen.

And if it doesn’t replace my phone, then I’m not going  to be using it all the time. Which means I’ll be using it the way I use my VR headsets today — infrequently. And when I use things infrequently, I forget how they work between sessions.

This means that I’m reluctant to use the headset instead of, say, just a regular Zoom meeting. Which means I use it even less often, until, eventually, it just sits there gathering dust on my shelf. I’m not about to pay $3,500 for a paperweight.

Teens slow to adopt VR and more bad news for the metaverse

(Image by Maria Korolov via Midjourney.)

Virtual reality hasn’t caught on with American teens, according to a new survey from Piper Sandler released on Tuesday.

While 29 percent of teens polled owned a VR device — versus 87 percent who own iPhones — only 4 percent of headset owners used it daily, and just 14 percent used them weekly.

Teenagers didn’t seem that interested in buying forthcoming VR headsets, either. Only 7 percent said they planned to purchase a headset, versus 52 percent of teens polled who were unsure or uninterested.

That’s not the only bad news for VR that’s come out recently.

Bloomberg has reported that Sony’s new PlayStation VR2 headset is projected to sell 270,000 units through the end of March, based on data from IDC. Sony had originally planned to sell 2 million units in the same period, Bloomberg reported last fall.

In fact, VR headset numbers in general are down.

According to IDC, headset shipments declined 21 percent last year to 8.8 million units.

“This survey only further exemplifies that the current state of VR is very business-focused,” said Rolf Illenberger, managing director of VRdirect, a company that provides enterprise software solutions for the metaverse and virtual reality.

“The pandemic further accelerated progress for VR and AR usability in the office, while the release of new devices will mean more for developers building practical use cases than they will for teenagers seeking entertainment,” he told Hypergrid Business.

But that might be wishful thinking.

According to IDC, both consumer and enterprise interest in virtual reality fell last year.

Earlier this year, I wrote about how Microsoft and other companies have pulled back on their VR and AR plans. And the bad news has continued to come in.

In mid March, Google announced the end of Google Glass Enterprise. And, last week, the Wall Street Journal reported that Disney shut down its metaverse team and the Truth in Advertising nonprofit advocacy group reported that Walmart had shut down its Roblox virtual experience.

Even Meta’s Mark Zuckerberg seems to have soured on the metaverse. In his March letter announcing a “year of efficiency” and layoffs of 10,000 people, Zuckerberg said that the company was now going to focus on AI.

“Our single largest investment is in advancing AI and building it into every one of our products,” he wrote. So much for the metaverse being Meta’s biggest investment. In 2021 and 2022, Reality Labs — its metaverse division — reported a total loss of nearly $24 billion.

Given the explosion of interest in AI since ChatGPT was released late last year, and its clear and obvious business benefits, I have serious doubts that anyone is going to be investing much in enterprise VR this year.

After all, generative AI is clearly poised to solve a multitude of business challenges, starting with improved efficiencies in marketing, customer service, and software development. And virtual reality continues to be a technology in search of a problem to solve.

I’m a huge, huge fan of OpenSim. But, other than giving a presentation at the OpenSim Community Conference in December, I can’t remember the last time I went in-world for a meeting. It’s all Zoom, Zoom, Zoom, and occasionally Microsoft Teams.

Oh, and here’s another downer. I watched the Game Developers Conference presentations from Nvidia, Unreal Engine, and Unity. I don’t play video games much, other than on my phone, so I hadn’t noticed just how amazing graphics, environments and characters have become. I originally watched for the AI announcements, which were insane, but the realism of the visuals just blew me away. I’m feeling the urge to run out and buy a gaming console.

(Image courtesy Unreal Engine.)

Now, general purpose platforms like OpenSim don’t have to have the same level of graphics to be successful. The early web, for example, had very poor graphics compared to what was available from commercial add-ons like Flash. And look at Minecraft — you can’t get any worse than that, graphics-wise.

So while the graphics were awesome, that’s not why I was most concerned. No, I was looking at the AI-powered environment generation features. And it’s not just Unreal and Unity. There are a bunch of AI-powered startups out there making it super easy to create immersive environments, interactive characters, and everything else needed to populate a virtual world.

With the basic Unreal and Unity plans available for free, is it even worth it for developers to try to add these AI features to OpenSim? It might feel like putting a jet engine on a horse-drawn buggy. I mean, you could try, but the buggy would probably explode into splinters the minute you turned it on.

Am I wrong?

Will we be able to step into OpenSim and say, “I want a forest over there,” and see a forest spring up in front of us? Will we be able to have AI-powered NPCs we can talk to in real time? And will we be able to create interactive, actually playable in-world experiences beyond just dance-and-chat and slot machines?

There’s good news, though.

AI tools are helping to accelerate everything, including software development and documentation. With the big players pulling back from enterprise VR, open source platforms like OpenSim have a window of opportunity to use those tools and catch up. Maybe even take the lead in the future hyperlinked, open source, interconnected metaverse.

How to create seamless tileable textures in Midjourney (plus free alternatives)

I love building in OpenSim, but I don’t love the process of finding or creating textures.

It’s hard to find just the right one, and even harder to figure out its license terms. All those texture packs floating around — are they legitimate? Or are they ripped off from Second Life?

For a while, anytime I went anywhere in real life I’d take pictures of rock walls, brick walls, asphalt, tree bark — anything that I could turn into a texture. Then I’d use Gimp to process the image, in a painstaking and labor-intensive process.

Turns out, my favorite AI image generator — Midjourney — has a tile function that will generate free, unique, seamless textures on demand.

Just type a command like the following:

/imagine flower texture --tile --v 3

The “--tile” command is what makes it seamless, and the “--v 3” command indicates that you want version 3 of the AI model. The latest version, version 4, doesn’t yet support tiles. Though, by the time you read this, it well might.

Here’s one of the results:

(Image by Maria Korolov via Midjourney.)

Here is what it looks like tiled:

(Image by Maria Korolov via Midjourney.)

You really can’t tell where the seams are. The pattern is perfect.

Here is one of the results for the prompt “tree bark texture --tile --v 3”:

(Image by Maria Korolov via Midjourney.)

And here it is again, so you can see what it looks like tiled:

(Image by Maria Korolov via Midjourney.)

Here’s one of the results for the prompt “black white and gray marble --tile --v 3”:

(Image by Maria Korolov via Midjourney.)

And here it is again, tiled:

(Image by Maria Korolov via Midjourney.)

Here’s one of the results for the prompt “bricks --tile --v 3”:

(Image by Maria Korolov via Midjourney.)

And here it is again, tiled:

(Image by Maria Korolov via Midjourney.)

I had a harder time getting it to come up with a water texture. It kept wanting to give me a horizontal view of the water, from the side, instead of a top-down view. It also tried giving me waves instead of tiny ripples.

I tried asking for “water,” “water surface,” “water top down view,” “water ripples clear ocean top down view,” “tiny water ripples on surface of ocean, top-down view,” and “barely visible water ripples on surface of still tropical ocean, top-down view.” No go.

I finally asked for “light blue seamless water texture for second life --tile --v 3.” That worked:

Here it is, tiled:

(Image by Maria Korolov via Midjourney.)

Midjourney offers 25 free images to start with, then the basic plan is $10 a month. I’m on the $30 a month plan, which gives you unlimited images as long as you switch to “relax” mode.

The free, open source alternative: Stable Diffusion

I’m a fan of Midjourney because it’s relatively easy to get something you want that looks great.

You can see how the model improved over the past year in this image progression:

Evolution of Midjourney, from version 1 to version 4. (Images by Maria Korolov via Midjourney.)

But there’s also a free, open source alternative — Stable Diffusion — and it also has a tile function.

You can download the model and run it on your home computer or in Google Colab, but that requires technical skills and, for running it locally, a powerful computer with an Nvidia graphics card.

Fortunately, there are plenty of sites that provide simple front ends for Stable Diffusion-based images. Many offer free plans, with additional features available to those who upgrade.

Creative Fabrica has a free online seamless repeating pattern generator. Here’s the result for the prompt “light blue second life water texture”:

(Image by Maria Korolov via Creative Fabrica.)

The images are 1,024 by 1,024 pixels, which is a very good size for textures. They are easy to generate and easy to download, but it’s not exactly clear what the usage terms are for the free content. You can see all the seamless textures that their users have generated on their community showcase.

If you mouse over the images, you can see the prompts used to create them, and you can then use the same prompts to create your own, unique, versions of the same textures.

For example, I saw a pretty rose quartz texture on the community page, grabbed the prompt — “Realistic Quartz Feminine Rose Gold Marble Texture, hyper realistic, intricate detail, painting, illustration, photograph” — and generated my own version:

(Image by Maria Korolov via Creative Fabrica.)

Here it is, tiled:

(Image by Maria Korolov via Creative Fabrica.)

In fact, many of the low-cost and free online apps built on top of Stable Diffusion can produce seamless textures, no technical skills required. Kind of. I don’t think this is the official tiling function, because the results aren’t always seamless. In my testing, they came out seamless about 90 percent of the time.
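If you’re generating a lot of candidates, you can screen for obvious seams programmatically instead of by eye: a truly tileable image has near-identical opposite edges. Here’s a minimal sketch using Pillow and NumPy (both assumed installed; the function name and threshold are mine, not part of any of these tools):

```python
import numpy as np
from PIL import Image


def seam_score(img: Image.Image) -> float:
    """Mean per-pixel difference between opposite edges of an image.

    Near zero for a seamless texture; large values mean the left/right
    or top/bottom edges won't line up when tiled.
    """
    a = np.asarray(img.convert("RGB"), dtype=float)
    left_right = np.abs(a[:, 0] - a[:, -1]).mean()  # left vs right column
    top_bottom = np.abs(a[0, :] - a[-1, :]).mean()  # top vs bottom row
    return (left_right + top_bottom) / 2


# Usage sketch: reject candidates above some tuned threshold, e.g.
# if seam_score(Image.open("candidate.png")) > 10: skip it
```

This only catches hard edge mismatches, not subtle pattern discontinuities, so it’s a pre-filter rather than a replacement for looking at the tiled result.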

For example, here’s the Lexica Art app, and the prompt “water seamless pattern”:

(Image by Maria Korolov via Lexica Art.)

It’s not the water texture I wanted but is, in fact, seamless:

(Image by Maria Korolov via Lexica Art.)

I think it would make for cute wallpaper for a kid’s bedroom. Plus, Lexica Art also gives you very large images — 2,560 pixels square. These are excellent textures.

If you haven’t tried out Lexica Art yet, it’s a lot of fun. You get 100 free images a month, and if you need more, you can upgrade to 1,000 for $8 per month.

Another option is Mage.Space, which gives you an unlimited number of free images. Just add “seamless pattern” to the end of your prompt and ask for square images.

Here’s the image for the prompt “flowers seamless pattern”:

(Image by Maria Korolov via Mage.Space.)

And here it is, tiled, to show that it is, in fact, seamless:

(Image by Maria Korolov via Mage.Space.)

Weirdly, Mage.Space kept giving me warnings for many of my prompts, telling me that “NSFW content is only available for premium members.” Why is a rose quartz texture considered NSFW? Who knows! But rerolling the same prompt did give me results, though it took quite a few tries to get something similar to what Creative Fabrica generated.

Other AI texture platforms

Stable Diffusion isn’t the only way to go.

One free site is Poly, which seems to offer unlimited AI-powered texture generation in a very simple interface. If you sign up for a free account, it will also let you save the images you create. But you don’t need to sign up to start creating textures.

The interface is bare-bones, and the texture generation is very fast.

In the screenshot below, I asked for a “colorful stone path” texture:

Another easy-to-use and free AI texture generator is Polycam.

Simplified also offers a free texture generator among its many AI tools, with 1GB of storage. You do have to sign up for an account before using it, but you get unlimited, fast images, using a choice of Stable Diffusion or Dall-E 2 as the AI engine.

Free AI texture sites

You can also use textures that other people have generated.

Free Textures for 3D has a giant collection of hundreds of textures, all free and distributed under the Creative Commons 0 license, a public-domain dedication, meaning that the textures can be used in any way, including commercially.

Another collection of free AI-generated textures is available at Pixela.AI. Most are seamless, but some clearly aren’t, so tile them and check before using.
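Tiling an image for that kind of check takes only a few lines with Pillow (assumed installed; the function name and file names are placeholders):

```python
from PIL import Image


def tile_preview(texture: Image.Image, reps: int = 3) -> Image.Image:
    """Repeat a texture in a reps x reps grid so any seams jump out visually."""
    w, h = texture.size
    grid = Image.new(texture.mode, (w * reps, h * reps))
    for x in range(reps):
        for y in range(reps):
            grid.paste(texture, (x * w, y * h))
    return grid


# Usage sketch:
# tile_preview(Image.open("bark.png")).save("bark_tiled.png")
```

A 3x3 grid is usually enough — if a texture has a visible seam, it will show up at every interior edge of the preview.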

Stable Diffusion resources for hard-core nerds

If you want to run Stable Diffusion yourself, with all the raw functionality the model offers, I recommend this tutorial. That’s the one I followed to get it running.

I used the Google Colab option because my computer is optimized for word processing. In other words, I only use it to write, so I bought the cheapest piece of junk I could find. (The days of me running OpenSim servers on my home computer are far, far in the past.)

Google Colab is, from what I could tell, a service that Google offers that lets people run software on its cloud for free. It’s specifically designed for running machine learning and AI code written in Python. Check out the official Google Colab FAQ for more info, including usage restrictions.

It took me about five minutes from start to finish to get it running and create my first texture:

Note the “Tiling” checkbox at the bottom of the screen. The image it produced for the prompt “second life water texture” was a little on the green side, so I reran it with “light blue second life water texture.”

Here’s the first result I got:

(Image by Maria Korolov via Stable Diffusion 1.5.)

And here it is, tiled:

(Image by Maria Korolov via Stable Diffusion 1.5.)

That isn’t bad at all. And it didn’t take any time to create.

The default size is 512 by 512, but you can change the sliders all the way up to 2048 by 2048.

Stable Diffusion requires more work than Midjourney to get a good image out of it. Here’s a good resource for some great Stable Diffusion prompts to get you started.

Just remember to uncheck the “Tiling” option once you’re no longer looking for seamless textures. Otherwise, your results will look super weird.

I’m currently using Stable Diffusion version 1.5, which is the “classic” version. But there are newer versions coming out all the time, some optimized to look more like Midjourney.

Then, a couple of hours after I did this experiment, my Google Colab version of Stable Diffusion stopped working. Do I care enough about this to troubleshoot? No, no, I don’t — not when there are plenty of easy, free sites out there I can use.

Is this the end of the metaverse?

(Image by Maria Korolov via Midjourney.)

Last Friday, AltspaceVR announced that it will be shutting down its virtual environment in March.

AltspaceVR was a consumer-focused immersive space that could be accessed either via the desktop or through a virtual reality headset. Since its launch in 2016, the platform has hosted big names and quickly became the go-to platform for virtual concerts, stand-up specials, and other metaverse happenings. It was the platform I recommended to people looking to socialize in virtual reality. For me, its biggest downside was that you couldn’t really use it to build your own world. Plus, it was a proprietary system, owned by Microsoft, and didn’t play well with other metaverse platforms.

In its blog post, the company said that it will be focusing on other metaverse-related efforts, such as the launch of Microsoft Mesh, an enterprise-focused metaverse platform.

Except.

In an SEC filing last Wednesday, Microsoft said it will lay off 10,000 employees. According to media reports, the layoffs include its entire Mixed Reality Toolkit group and the AltspaceVR team, effectively killing Microsoft’s own HoloLens project.

It shouldn’t come as too much of a surprise — nobody’s been able to make any money with the metaverse yet.

In fact, just last week, Congress denied the US Army’s request for $400 million to buy more HoloLens headsets. The reason? More than 80 percent of soldiers reported that they didn’t like using the headset, citing “mission-affecting physical impairments” such as headaches, eyestrain, and nausea.

In a report last spring, the Department of Defense’s Office of the Inspector General warned that buying augmented reality systems without user acceptance “could result in wasting up to $21.88 billion in taxpayer funds to field a system that soldiers may not want to use or use as intended.”

Instead of continuing to invest heavily in AR, VR and the metaverse, Microsoft seems to be going all-in on artificial intelligence.

The rise of AI

On Monday, Microsoft announced a “multi billion dollar” investment in OpenAI, the company behind ChatGPT and Dall-E 2. Microsoft didn’t give the exact number, but news reports from the New York Times and Bloomberg have put it at $10 billion.

On yesterday’s earnings call with investors, Microsoft CEO Satya Nadella said, “The age of AI is upon us.”

And it’s making money. Microsoft’s AI platform, Azure ML, has seen its revenue more than double for five quarters in a row, he said.

(Image by Maria Korolov via Midjourney.)

Last week, Microsoft made its Azure OpenAI service broadly available, he added.

“And already over 200 customers – from KPMG to Al Jazeera – are using it,” he said. “We will soon add support for ChatGPT, enabling customers to use it in their own applications for the first time.”

Microsoft will be the exclusive cloud provider for OpenAI, deploying their models across Microsoft’s consumer and enterprise products.

Microsoft is also behind GitHub Copilot, an AI-powered coding assistant.

“More than one million people have used Copilot to date,” said Nadella. “This quarter, we brought Copilot to businesses, and we’ve seen strong interest and early adoption from companies including Duolingo, Lemonade, and Volkswagen’s CARIAD software group.”

“We fundamentally believe that the next big platform wave, as I said, is going to be AI,” Nadella added.

What about Meta, Google and Apple?

Meanwhile, Meta is continuing to bleed money on its metaverse investments. In its most recent earnings call in October, CFO Dave Wehner said that Reality Labs revenue was down 49 percent due to lower Quest 2 sales. And expenses were up 24 percent.

As a result, Reality Labs revenue was just $285 million — on expenses of $4 billion.

That’s a loss of $3.7 billion. In just three months. That’s a loss of more than a billion dollars per month.

As of last October, Meta had lost a grand total of $30.7 billion betting on the metaverse. And that number is just going to keep going up.

“We continue to anticipate that Reality Labs operating losses in 2023 will grow significantly year-over-year,” the company said in an SEC filing on Nov. 9.

Meta’s metaverse — its Horizons platform in particular — “obviously has a long way to go before it’s going to be what we aspire for it to be,” said Meta CEO Mark Zuckerberg on the October earnings call.

Google scaled back its VR ambitions a couple of years ago, ending support for its Daydream platform in 2020. Apple still hasn’t released its long-awaited AR headset.

Overall, global sales of virtual and augmented reality headsets shrank 12 percent in 2022, according to research from CCS Insights, from 11 million units sold in 2021 to 9.6 million in 2022.

In addition to the long-awaited Apple headset, which might come this year, another upcoming bright spot is Sony’s PlayStation VR, though that’s a closed, proprietary gaming system, not really a virtual reality metaverse play.

Sony will release the new headset in February, and Omdia predicts it will sell 1.6 million units in its first year. By comparison, the original PlayStation VR headset sold 1.9 million units in its first year after its 2016 release.

What does this mean for OpenSim?

I think this is a good news, bad news kind of situation for OpenSim, a free, open source platform for immersive environments.

Today, more than 300 public worlds run on OpenSim, plus thousands of private worlds.

However, most users access OpenSim via desktop software. It does have virtual reality support, to a very limited extent. The server software is optimized for desktops, meaning that it favors rendering graphics as much as possible, even if the frame rate drops a little bit.

For users, this means that they get to see more of the world faster. But for VR headset wearers, it means that when they turn their heads, the image they see doesn’t keep up.

When your vision doesn’t match what your body is doing, the natural physical reaction is to think that there’s something wrong with your brain. Maybe you ate something poisonous. Maybe you should throw up before it gets worse.

Any disconnect between your body’s physical motion and what your eyes are telling you can cause nausea. It’s hard to grow a user base when your product literally makes your customers throw up.
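To make that frame-rate tradeoff concrete, here’s a minimal sketch in Python, with illustrative numbers of my own choosing (the function names are mine, not anything from the OpenSim codebase): a desktop viewer can tolerate roughly 30 frames per second, while a typical VR headset refreshes at around 90 Hz, leaving only about 11 milliseconds to draw each frame.

```python
# Illustrative sketch: the frame-time budget a renderer must meet.
# A desktop viewer can tolerate ~30 fps; a VR headset needs ~90 fps,
# or the displayed image lags behind the wearer's head motion.

def frame_budget_ms(refresh_hz: float) -> float:
    """Maximum time allowed to render one frame, in milliseconds."""
    return 1000.0 / refresh_hz

def meets_budget(render_time_ms: float, refresh_hz: float) -> bool:
    """True if a frame rendered this slowly still keeps up with the display."""
    return render_time_ms <= frame_budget_ms(refresh_hz)

desktop_hz = 30.0   # a tolerable desktop frame rate
headset_hz = 90.0   # a typical VR headset refresh rate

# A detailed scene that takes 25 ms to render is fine on a desktop...
print(meets_budget(25.0, desktop_hz))  # True
# ...but blows through the ~11.1 ms VR budget, producing the
# head-motion lag described above.
print(meets_budget(25.0, headset_hz))  # False
```

The point of the sketch: a renderer tuned to maximize scene detail at 30 fps has nearly three times the per-frame budget of one targeting a 90 Hz headset, which is why a desktop-first design tends to feel fine on a monitor and sickening in VR.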

So what’s the good news?

The good news is that companies are continuing to invest in hardware. Apple, Sony, Meta — they’re still pouring in the cash.

But the shuttering of AltspaceVR, combined with Meta’s continuing struggle to get any kind of traction for its own virtual world platform, creates a window of opportunity for OpenSim.

I don’t know if any developers are reading this, but I do believe that an open source, interconnected platform will, in the long term, be the future of the metaverse.

(Image by Maria Korolov via Midjourney.)

And OpenSim is a great test bed for how it can work. That’s because nearly all OpenSim worlds are interconnected. Avatars can teleport from one world to another with their appearance, belongings, and friend lists intact, even when those worlds are hosted on different servers and run by different companies. It’s a marvelous bit of engineering that more people should know about.

And OpenSim also has a pool of companies and virtual world operators and content creators — and users — who are committed to an open metaverse.

Can the software itself be adapted to run on virtual reality headsets? Maybe. It would be great if that was the case, and I’d love to see it.

RezMela is a CMS for OpenSim, and it’s going free and open source

Empowering users in the metaverse: an open source approach

Empowering users should always be the primary focus of the metaverse and virtual worlds. Our open-source project aims to develop a user-generated content platform that allows users to easily create, modify, and share high-quality content to support applications that have real-world impact. We believe that empowerment is the next level challenge in achieving maximum engagement in any application or environment.

Currently, entertainment applications dominate virtual world experiences, mostly because of the well-understood Skinner-box design strategy, which industry loves because it provides a simple way to measure traction. In an attention economy where ethics have often taken a back seat, users are incentivized to return to an experience even after its novelty has faded, often through reward systems unrelated to the virtual experience itself. Whether the experience increases well-being or is addictive is not a particular concern in a profit-making enterprise. There is certainly a need for more applications that enhance cognition and productivity, to create a balanced ecosystem of use cases for the metaverse.

We propose a multi-disciplinary open-source strategy to advance the metaverse from a collaborative design perspective. Our strategy focuses on empowering users by providing them with the tools, resources, and support they need to take control of their own experiences and to make meaningful and positive contributions to the virtual and real world. We believe that this approach will resonate with end users who are looking for applications that facilitate their work or daily lives, as well as with educators who are seeking new pedagogical approaches to explore and advance.

RezMela-generated scenes. (Image courtesy RezMela.)

Open source resilience

As an open-source effort, we face an uphill battle against well-funded corporations that can invest billions of dollars in goals that, on the surface, are hard to distinguish from ours. However, one of our key advantages is that the OpenSim platform was designed to be decentralized from the very beginning. This decentralized design provides numerous benefits and is a major strength of our platform: it distributes power and eliminates the need for a single, centralized authority.

We believe that open-source projects like ours can be more resilient in the long run because they are not dependent on a single company or organization. When a company that supports a proprietary system goes out of business or changes its focus, the users of that system are often left stranded with no support or options. In contrast, open-source projects are supported by a community of users and developers who are invested in their success and are committed to maintaining and improving the system over time.

RezMela: a much needed user-facing layer for OpenSim

Our goal is to carefully plan and execute this process to ensure that our RezMela platform code is properly documented and supported. To facilitate collaboration, we plan to set up a Discord server and a forum for asynchronous communication. Our existing website provides information about our initial set of applications. Some of these applications are currently available on the Kitely Market, but since we want our offerings to run on any grid, we will be making them freely available during the first quarter of next year.

The RezMela Composer building tool is available for $1 on the Kitely Market.

We are working with the makers of Outworldz DreamGrid to integrate the RezMela platform into their offering. This will allow end users not only to set up their own regions with the ease that DreamGrid makes possible, but soon to create advanced virtual experiences as well.

We believe that the philosophy of design is important in setting the context for our work. That is why we have written a book to explain our philosophy and road map for the future. This book is based on our collective experience in implementing multi-disciplinary design for software applications that have real-world impact.

Our current platform allows users to organize content into modules that interact with applications that can be deployed on any piece of virtual land. These apps can have a range of functions that the end user can easily customize. The content of these modules can be as simple as a rock or as advanced as a tree that changes its appearance at the click of a button to match the season, or even an advanced AI NPC powered by the IBM Watson conversation engine.

We also have apps that are integrated with Google Street View, allowing users to create virtual tours of any location on Earth. The apps allow users to create scenes consisting of objects sourced from these modules, which can be saved and shared with others. The point-and-click functionality of our apps allows for rapid and efficient creation of virtual museums and campuses.

More recently, we developed a user-facing scripting language to control the behavior of OpenSim NPCs, or bots. Our current system supports a wide range of applications, which are described on our website. As we open-source these applications, we plan to focus on documenting and growing our user base to help build a knowledge base and make support easier for all.

In terms of future functionality, we plan to integrate APIs for more powerful AI engines, such as the GPT series from OpenAI, into our modules.
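As a rough sketch of what such an integration might look like, an NPC module could assemble a chat request for each visitor interaction. Everything below is illustrative rather than actual RezMela code: the function name is hypothetical, the payload shape follows OpenAI’s public chat-completions REST API, and the model name is only an example.

```python
# Hypothetical sketch of wiring a virtual-world NPC to a chat API.
# This is not RezMela code; the payload shape mirrors OpenAI's
# chat-completions API, and the model name is only an example.

def build_npc_request(npc_persona: str, visitor_message: str) -> dict:
    """Build the JSON payload an NPC module might POST to the AI engine."""
    return {
        "model": "gpt-3.5-turbo",  # illustrative model name
        "messages": [
            # The persona keeps the bot in character for its role in-world.
            {"role": "system", "content": npc_persona},
            # The visitor's chat line becomes the user turn.
            {"role": "user", "content": visitor_message},
        ],
    }

payload = build_npc_request(
    "You are a friendly museum guide in a virtual world.",
    "What is this exhibit about?",
)
print(payload["messages"][0]["role"])  # system
```

The NPC scripting layer would then relay the engine’s reply back into local chat, so the AI service stays entirely server-side and invisible to the visitor.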

We have also written a book to share the body of knowledge on which our work is based, in order to inspire and guide new members of our community in pushing the envelope effectively.

Current team and forthcoming book

Our current team is small, consisting of only five members: Ramesh Ramloll, Andrew Hellershencks, Anabel Nowak, Ludovic Lotoah, and Sue Claxton. We will also be collaborating with Fred Beckhusen, the well-known developer of Outworldz DreamGrid.

However, our team is designed to be multi-disciplinary in nature, as we believe this is necessary for our open source design efforts.

We will be publishing a book in the first quarter of next year that explains the need for a multi-disciplinary approach and welcomes participation from individuals with a wide range of expertise, including coders, artists, 3D modelers, cognitive scientists, neuroergonomics experts, storytellers, and video production makers. We believe that everyone can contribute something valuable to our project.

We are well aware that this project is ambitious. Building a new layer of technology on top of the only decentralized, open-source metaverse platform, OpenSimulator, is not a small undertaking. While this article provides an overview of why this layer could be important, our forthcoming book, available during the first quarter of next year, will do a better job of explaining the various streams of thought that led to this effort.

Financial concerns

As with any project, financial sustainability is a concern. In the past, I made the mistake of not paying attention to the amount of money I was investing and became too focused on the urgency of the mission. I never paid myself for the years the project ran, and while I tried my best to remunerate contributors, I always felt I could do more.

Funding will always be a priority, and I will try to use my experience to raise funding through grants. However, we also hope that the traction our products gain will open up donation opportunities. We plan to open an online merchandise store selling items that could help spread the word about our efforts. We are open to suggestions and support in figuring out the best ways to sustain this project.

As with any open-source effort, we recognize that this is a community project and will continue to hold meetings where anyone can join to learn about the status of our effort, meet other users, and get help as needed.

Invitation to participate

As you can see, we have a lot to contribute to the OpenSimulator community. We are working hard to make this possible and welcome your interest in our progress. If you would like to stay updated on our activities, please visit our sign-up page and share your contact information with us. We look forward to keeping you informed and involving you in our efforts.

The VA’s Transformation — Evading the Frustration

Gordon Dodson

A combat veteran suffering from PTSD shares the challenges he faced as the VA struggled to keep up with technology. These institutional failures led to the founding of 2B3D, a first-of-its-kind metaverse for delivering live, free mental health services to vets via web3.

The facts are not new. The number of veteran suicides per year speaks for itself. We are living in the age of ever-evolving information, and yet we struggle with providing seamless services for our veterans to access.

As a 30-year IT professional and a retired Army officer, I personally understand the challenges that come with fielding a technical solution on an enterprise level. Nevertheless, it seems amiss to allow such challenges to be pushed down for veterans to resolve on their own, and navigating these complex systems can trigger a veteran’s anxiety, yielding the exact opposite desired effect of post-service care.

For too many veterans, navigating our transitional systems has become a living nightmare, and the only possible solution is to unite and fight this beast head-on, together with the Veterans Administration’s — or VA’s — help.

It’s not that the government doesn’t have good intent — it’s the magnitude of the problem that’s the challenge. There are literally hundreds of different systems and associations trying to work together, and the biggest roadblocks are interface challenges that can actually make routine actions more difficult and overly complicated.

When I retired in 2012, I felt confident I could work through any transition without much of a problem. In the military, we moved every couple of years and got deployed routinely, so what’s another transition?

However, it quickly became apparent to me that the system was not prepared to adequately support my needs, and I would be looking out for myself.

The hardest part of this lookout assignment is that, like other veterans, I am not equipped to identify, understand, and completely care for my own needs. To say this has been a challenge is the understatement of the century, and I just cannot imagine how difficult it is for homeless or disadvantaged veterans, whom I see everywhere and who very obviously have limited access to VA resources.

I have been diagnosed primarily with Post Traumatic Stress Disorder — or PTSD — and trying to navigate the system under those conditions has been all-consumingly depressing. My treatment to date has been primarily facilitated by the VA and their “mental health professionals,” who the VA contracts out to provide this service. Contracts primarily go to the lowest bidders who check all bare minimum requirements, and those professionals are repeatedly shown to have little to no experience in dealing with the type of traumatic stress veterans experience.

On one particular visit, I was left in a secure, windowless room for over three hours while being informed the nurse was on her way, only to learn there was a shift change, and I was not part of that transfer of responsibility.

At that point, I really started to wonder what I was doing wrong and if this problem and these conditions were specific to me or endemic to the entire organization. From the outside, everything looks fully operational and streamlined, but when you are in the weeds and going through the process, it is failing to provide the care veterans deserve.

Sometimes I wonder if I would be better cared for by the VA and its staff if I had physical injuries that were glaringly more apparent, and then I ask myself why I would even dare to think such a terrible thought.

We need to talk about and understand the absurd reasons that I, a prior-enlisted soldier to lieutenant colonel in the US Army with years of service to my country, would ever begin to feel this way. The worst part is that I am not alone, as I’m finding many veteran groups and forums discussing the same concerns and upset.

So what’s the solution? The answer can and will be technology.

Combating veteran suicide and PTSD is not a new requirement or mission, and because of evolving technology and boots-on-the-ground capability, we now have an opportunity for a 2B3D veteran team to combat this destroyer head-on.

The vision is to provide a free means of delivering essential life-saving services 24/7/365 to all veterans, regardless of their location or financial status. With the launch of 5G and Starlink networks, internet connectivity and bandwidth limitations are becoming a thing of the past.

I became hooked on their vision when I realized the veterans on the team had already created the baseline to launch a virtual reality metaverse designed to provide veterans with medical care. 2B3D’s Virtual Reality Medical Environment — or VRME — is sometimes referred to as our veteran’s virtual medical campus and is designed to look and feel like a video game but with real interactions between people.

It combines people and technology, rather than just replacing people with technology. The VRME has several verticals that will directly address a variety of medical challenges spanning multiple disciplines, and we are not alone, as we engage other leading-edge companies in partnership and integration opportunities, all designed to improve the medical support system.

The VRME solution, and others like it, needs to be embraced by the Veterans Administration. Moving critical services into a VA-driven VRME will address the direct concerns veterans personally encounter as they continue to negotiate the VA’s ever-evolving programs supporting veteran suicide and PTSD challenges.

Taking the frustration out of the equation will be a key component of our deliverable. The benefits of immersive VR technology are great and continually improving, and include those currently being realized by successful telehealth and telemedicine programs.

Developing key community partnerships that enable spreading the word to the entire military community is why 2B3D and Military.com are working together to highlight this new initiative to combat veteran suicide and PTSD on different terrain.

Getting the word out to the entire military community is our key objective. It’s important for veterans to keep abreast of evolving technologies that directly support the military community. The VA’s telehealth network is one such community whose next technical evolutionary step is real-time VR immersion to prevent or resolve existing challenges that are associated with mental health.

2B3D’s military and veterans VRME will offer more than just crisis response services and will include a variety of other essential services for the veterans and their family members to utilize. Such services will include 2B3D’s VRx NFT prescription service and blockchain technology to set the conditions needed for a veteran to securely access and manage their healthcare data.

2B3D is taking the lead in setting the conditions for veterans to easily transition into its VRx environment, and has plans to create active-duty and veteran-specific NFTs to ensure the military community isn’t left behind in this rapidly changing time, but instead leads the way into this new realm of possibilities. Veterans deserve to be thought of first, and carving out a portion of the metaverse for them now is the best possible solution we can offer.

ABOUT 2B3D

2B3D is a decentralized metaverse with active and developing communities in the crypto-verse. The 2B3D metaverse includes several core projects:

  • VRx | Virtual healthcare with NFT prescriptions and real-life professionals.
  • NFTy150 | NFT marketplace, minting option, and showroom.
  • Topher’s Inferno | Connecting enthusiastic gamers with ambitious developers.
  • RestXP | B2B meeting rooms with a resting crypto reward.
  • So Many Gods | A play-to-earn, space-themed sci-fi looter shooter.

To learn more about 2B3D, visit www.2B3D.com or follow us on Twitter at @2B3Dinc.

Snoots Dwagon’s OpenSim Fest 2022 exhibits

(Image courtesy OpenSim Fest 2022.)

I’m Snoots Dwagon, head of greeters, exhibitor, and a performer at this year’s OpenSim Fest.

It’s my first OpenSim festival, and I’ve got two parcels you might want to check out, Rascal Flats and Replicant City.

In the case of both exhibits, the entire purpose is to provide some fun for visitors.  I don’t sell things in OpenSim, we don’t charge rent or other fees, and all my lands are open 24/7/365 to visitors, with self-guided tours and freebies all over.

The Replicant City exhibit is designed to be fully interactive and to give visitors some fun things to do.  It contains a re-imagined replica of the Firefly spaceship, explorable with lots of buttons to press.

The Rascal Flats exhibit has a working Ferris wheel, a Godzilla battle, a beautiful dance floor, and a house with free Wee avatars – including a small robot, four colors of koala bears, kitties, and a tiny penguin.

Let me tell you a little more about the event.

Snoots Dwagon

Officially abbreviated OSFest, this annual expo of OpenSim creators is being held July 8 to July 25, 2022.

This mega-event can be accessed by registering and creating an event avatar or by using your existing avatar and visiting via the hypergrid.

The hypergrid address is grid.opensimfest.com:8022.
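A hypergrid address like that is just a hostname and port, sometimes with a region name tacked on. As a minimal sketch, here’s how such an address might be taken apart; the helper function is mine for illustration, not part of OpenSim or any viewer, and the default port is an assumption.

```python
# Minimal sketch: split a hypergrid-style address into its parts.
# Format assumed here: host[:port][:RegionName]. The function name
# and the default port are illustrative, not OpenSim code.

def parse_hg_address(address: str):
    """Return (host, port, region) from a hypergrid-style address."""
    parts = address.split(":")
    host = parts[0]
    port = int(parts[1]) if len(parts) > 1 else 80  # assumed default
    region = parts[2] if len(parts) > 2 else None   # optional region name
    return host, port, region

print(parse_hg_address("grid.opensimfest.com:8022"))
# → ('grid.opensimfest.com', 8022, None)
```

This is the same kind of address you paste into a viewer’s map search to make a hypergrid jump from your home grid.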

OSFest offers many attractions:

  • Dozens of live performances and dances during the entire event
  • Displays showing off the talents of OpenSim creators
  • Market regions with OpenSim merchandise
  • Freebie areas for new and old members alike

Find out more at the official event website. In addition, you can locate individual OSFest areas here.

This is THE OpenSim Festival

If you’re an OpenSim user, or a Second Life resident who has wondered what OpenSim is all about… come to OSFest ’22. It’s a great way to immerse yourself in all things OpenSim!

Snoots Dwagon’s exhibit area – Replicant City (Image courtesy Snoots Dwagon.)

The history of Replicant City

I started building Replicant City in 2010 on InWorldz grid and spent a great deal of the next 12 years adding things to it a bit at a time.

Most of the creations are mine, but occasionally other creators would donate something to the City if it applied to the science fiction theme.

When InWorldz took a nose-dive into the pavement, I was very fortunate to be able to salvage the entire Replicant City build in an OAR file and recreate it in the high-sky ElvenSong VAR on OSgrid — this provided a unique arrangement of elven fantasy at ground level and a full science fiction city in high sky.

Replicant City is now so large and detailed that it literally takes days to fully explore everything that’s there, taking it an hour or two at a time. It is truly huge, extremely detailed, totally interactive throughout, and can be very addictive when explored.

Snoots Dwagon’s exhibit area – Rascal Flats. (Image courtesy Snoots Dwagon.)

The history of Rascal Flats

Rascal Flats is a “tiny town” on the VARMINTS region of the DigiWorldz grid. It’s part of the Wellspring group and is provided by them for enjoyment by the Wee community — Tinies, Dinkies, and other Smalls — free of charge. There are free avatars, free homes, and free activities.

“Wee is free” is the motto. Rascal Flats is an affiliate land of the ancient Elf Clan group, which means that Rascal is G-rated, family-friendly, and safe for all visitors.

Here’s how Rascal Flats got its start.

The Wellspring Group originally built the Sendalonde Community Library on InWorldz.

It was widely known for its readable library, live play theater presentations, and regular large exhibits with numerous creative talents. For example, one year they held a sci-fi exhibit, with creators from all over InWorldz contributing to the event. It was truly fun to explore.

I was one of the contributing creators to the Library and was invited to the Library Board of Directors, which is how I came to know the owners Alexina Proctor and Prax Marijaxz. I hope my memory is serving well. This was several years ago.

As with Replicant City, when InWorldz died, they managed to barely save some major components of Sendalonde, including most of the library.

They re-established their lands on OpenSim, setting up on a 64-region mega-server.  Their rebuilding task was considerable since their OAR files did not come across as fully as Replicant’s.

They announced that they were rebuilding the Sendalonde Community Library, which is what caught my attention.

I had worked with them on the original Library by building a nice theater in the back.

So I contacted them and asked if they would like me to rebuild the theater for them as that was lost in the transfer.

Their response was three-fold:

1. Yes, they would very much like to have the Theater rebuilt
2. Would I like to be on the Board of Directors again and be their scripting and technical consultant?
3. (and this was the clincher)  Would I like to build a tiny town where tinies could live and play for free?

Now, what Dwagon could resist an invitation to build an entire town for Wees, Tinies, and Dinkies?   They knew the bait, and I chomped.

I built the theater, which turned out nicely, if I do say so myself. I built a town that was later expanded to two-and-a-half times its originally intended size, and renamed it Rascal Flats, a name that properly suits its residents.

We added a huge sandbox, a rather large Giant Spider Race Track, a 50’s-style Eats Diner with a good-size dance floor among the booths and custom jukeboxes, and a Pirate Cove with several sailing ships.

Rascal Flats contains numerous homes which are free to Wees who care to settle there, as well as a Central Park with several attractions.

Today in 2022, Rascal Flats is pretty much finished and as large as we want it to be: two-and-a-half regions of tiny mayhem.

We are friends and close associates of Weelandia, another land of Tinies on the same grid.  We often announce and attend each other’s events, and work together on projects.

For example, we fixed the widely-distributed but often-broken mesh Koala avatar and sent it out to be distributed among the Wee society. A copy of this avatar can be picked up at the Rascal Flats exhibit at OSFest.

We’d like to invite readers to visit Rascal Flats and the Sendalonde Community Library. It’s always good to see old friends… and meet new visitors!

We’ve been added to “the list for destruction”

An OSgrid shop titled “From Ukraine with Love” that sells items in support of Ukraine on OSgrid’s Big Easy region. (Image courtesy The Big Easy.)

So I just got a warning in my email inbox from a “touche@1studi.ru.” The domain name is associated with a company that runs a couple of Russia-based OpenSim grids.

Apparently, the flag of Ukraine is now a “Nazi” flag. Now, I mean sure, there are Nazis in every country — the United States and Russia are both filled with crazy right-wingers — but these guys seem to be referring to Ukraine’s leadership.

Of course, Ukraine president Volodymyr Zelenskyy is himself Jewish, his grandparents fought the Nazis, and his great-grandparents were killed by the Nazis when their village was set on fire. There isn’t even a whiff of a connection between him and any neo-Nazis. And the country’s last prime minister, Volodymyr Groysman, was Jewish as well.

It feels like the Russians are making a desperate attempt to connect the war they’re fighting now in Ukraine with World War II, the last war in which they fought against a clear evil.

I believe they’re following the “big lie” principle of propaganda — if you tell a giant lie often enough and loudly enough, then some people will start to believe you, no matter how obviously ridiculous it may be.

It’s a little disturbing that they’re using an official company email account for this. There doesn’t seem to be any concern about repercussions for pushing the propaganda.

I doubt their accounts were hacked. I don’t think that Russia’s state-sponsored cyber warfare teams would care much about OpenSim.

Here’s what they said:

You should be more careful in making decisions about your journal.

This is not a threat, but a warning. If you continue to distribute Nazi
symbols (the flag of Ukraine, etc.), publish calls to “help” Ukraine
financially, publish links to donations, your magazine will be included
in the list, along with “opensimworld” for destruction.

So let me provide some information about how to help Ukraine financially and publish some links to donation sites:

  • Global Giving’s page for Ukraine lists dozens of projects, from help for abandoned babies to medications for children’s oncological units. They currently have a top rating from Charity Navigator.
  • UnitedHelpUkraine.org has projects such as providing medical supplies, humanitarian aid and help for wounded warriors. They also have a high rating from Charity Navigator.
  • Save the Children provides families with immediate aid such as food, water, hygiene kits, and cash assistance. According to their site, $100 helps supply a month’s worth of nutritious food to a Ukrainian family in crisis. The organization has a top rating from Charity Navigator.
  • Direct Relief provides medical aid, emergency response packs for first responders, oxygen concentrators, critical care medicine, and more to the Ukrainian people. The organization has a top rating from Charity Navigator.
  • Doctors Without Borders provides Ukrainian hospitals with medical supplies. The organization has a top rating from Charity Navigator.
  • World Central Kitchen is supplying thousands of fresh meals to families fleeing their homes as well as meals to those people who are staying in Ukraine. The organization has a top rating from Charity Navigator.
  • The Humane Society International is providing support and emergency funds to groups that are working to help with relief efforts, care, and providing for animals. They have a high rating from Charity Navigator.
  • Support Airbnb. They’ve waived guest and host fees for bookings to help funnel money directly to Ukrainian hosts and also have an initiative to provide housing to Ukrainian refugees.
  • Help Ukrainians with disabilities by supporting Fight For Right’s GoFundMe campaign. This is a Ukrainian organization that has almost reached its 400,000 Euro goal.
  • And here’s the full list of top-rated charities recommended by Charity Navigator that are focusing on helping Ukraine. These are vetted organizations, so you can be sure that your money is making the biggest possible impact.

And since Hypergrid Business is a media organization, I’d like to throw in a pitch to support independent media in Ukraine.

  • Kyiv Independent’s GoFundMe campaign has surpassed its 1.2 million Euro goal. This new publication is an English-language outlet run by a team of experienced journalists who were formerly with the Kyiv Post before that newspaper was bought and shut down by a businessman who didn’t like its independent voice.
  • Support other Ukrainian media, including Ukrainska Pravda and Zaborona, by donating to a separate GoFundMe that the Kyiv Independent is running for other media outlets. This campaign has already surpassed its 800,000 Euro goal, but I’m sure they can use more money.

And if you want to know more about what OpenSim grids are doing to help Ukraine, check out our previous article, Grids stand with Ukraine following Russian invasion.

But let’s get back to the email I just got. Here’s the rest of it:

Again, this is not a threat, but a warning. Please don’t forget who you are dealing with. Look at Ukrainian and Polish government websites, intelligence, finance and law enforcement websites.

If you continue your activity, your journal will be added to the list for destruction.

This is not a threat, but a warning.

Think twice. We don’t repeat twice.

Oh, noes, I’m so scared. Also, I don’t think they know the difference between a “threat” and a “warning.” It kind of sounds to me like they’re making a threat.

In any case, I would be super duper stoked to be added to a “list for destruction.”

I’ve been feeling a little guilty that I haven’t already jumped on a plane to cover the war in Ukraine. Twenty years ago, I would have.

In fact, after the fall of the Soviet Union, I actually went to Russia and the former Soviet republics and covered the wars there for Reuters, United Press International, the Moscow Tribune, and other publications. I reported from Chechnya, from Georgia, and from Tajikistan. I was taken prisoner twice, I had a death squad show up at my door. I was shot at. I was shelled. I walked through mined villages and slept in trenches while bullets flew overhead. I caught rides on tanks and helicopters. I was on the ground in Abkhazia with a UN delegation when Russian planes flew overhead in a bombing run. I helped surgeons operate on a little girl who had fragments from a shrapnel bomb through her body, and who died on the operating table. I visited prisoners of war and interviewed refugees and crossed borders and front lines.

I saw first-hand what happens when former Soviet republics try to leave Moscow’s orbit.

The Ukrainians saw all this, too. And they knew that the West didn’t step in and help, that Russia was allowed to bomb civilians in Chechnya, in Georgia, and in other republics with few repercussions. Even when Russia invaded Crimea in 2014, all Russia suffered was some relatively minor financial sanctions.

Ukraine is not part of NATO, not part of the European Union. But it’s fighting anyway.

I personally think it’s an inconceivably brave stand to take.

So I’m donating to the charities, and I hope that our readers will, as well.

How science fiction precedes science fact and what it means for the metaverse

Some days, it feels like we’re all living in a science fiction novel, though more dystopian than utopian. Between a global pandemic, Russian saber-rattling on the Ukraine border, and climate change challenges, you’d be forgiven for despairing about the state of the world. And yet, there is more peace in the world than war, extreme poverty is on the wane in most parts of the planet, and we’ve made incredible progress against childhood mortality.

While we still don’t have the flying cars, time machines, and holographic movies that sci-fi has long offered us, that’s not to say the genre hasn’t delivered on some promises. The Star Trek communicator became the Motorola flip phone. Gently humming electric cars are now selling like hot cakes in the form of Teslas and others.

Jules Verne-inspired space tourism is now a thing, video calls are de rigueur, and we all walk around with powerful computers in our pockets that connect to an infinite web of data and information — and yet we use this incredible computing power to play diverting games like Candy Crush and Wordle.

One subset of science fiction, virtual fiction, or what I like to call ViFi, is having its moment in the spotlight as the tech world leans into a future it believes will be dominated by the metaverse. While there is no single unifying definition of what the metaverse actually is, the general consensus is that it will involve a more immersive version of the current internet — likely an enhanced digital reality that enables users to connect and communicate in a virtual space, which William Gibson called cyberspace, perhaps using interface tools such as goggles or headsets to offer a more immersive experience than today’s flat-screen, 2D internet. The term itself was coined by Neal Stephenson in his 1992 novel Snow Crash and popularized by the big-screen adaptation of Ernest Cline’s 2011 novel Ready Player One.

Still from the 2018 film Ready Player One.

I wrote my own ViFi novel, MetaWars, in 2010 after the Icelandic ash cloud grounded flights across Europe. That event inspired a story world where everyone interfaces digitally in a global metasphere using a brain-computer interface and is even able to upload their consciousness to the internet to achieve a type of digital immortality.

And while it’s easy to dismiss today’s hardware as clunky and uncomfortable — I still loathe headsets and get nauseous about sixty seconds into any virtual reality experience — it’s notable that, in a way, we’re already living in the bunny-hill equivalent of the metaverse. Anyone who spends their days working from home, on what feels like non-stop Zoom meetings or Microsoft Teams sessions, is living a type of virtual reality. We each project a version of ourselves, a type of avatar, to present our digital selves in a constructed way. My personal metaverse high-water mark was getting a good score on Room Rater, but no one ever sees the laundry hanging on the rack just out of view.

We’re all willingly living in the Matrix, eschewing parts of our real lives to give ourselves over to Big Tech’s tools of virtual interconnectivity. And it shows no sign of abating. Last week’s announcement of Microsoft’s takeover of Activision Blizzard was couched in terms of capitalizing on the metaverse. And last year, Facebook nailed its colors to the mast, rebranding itself as Meta and declaring its intention to be a metaverse-first company.

All this was foretold in ViFi. Technology tends towards monopoly, or at least oligopoly, and the metaverse will likely be no different. The Pareto Principle applies here, in that there will be a few — or even one — big winners, and a lot of also-rans. Just look at search, mobile operating systems, social media, and virtual real estate micro-leasing, to name a few verticals that are dominated by three or fewer players.

(Image courtesy julientromeur via Pixabay.)

The great filmmaker David Cronenberg made an underappreciated film called eXistenZ, which portrayed a world dominated by a fully immersive computer game that required a bio-connection to play and two dominant companies fighting for the attention of the world’s gamers. I wonder if that’s the direction of travel for Microsoft. It’s not such a huge leap given that some ridiculous number of humans believe that Bill Gates has put microchips into vaccines. Maybe the next big game from Activision will require bio-connectivity — real life imitating Cronenberg’s art.

Of course, the virtual metaverse interacts with the real world in very concrete ways. The tap of an app can bring restaurant-quality food to your door in a matter of minutes. Calling an Uber sure feels like waving a magic wand, summoning a chariot in the form of a Toyota Prius. And the fact that I can snap a photo of my kids and share it instantly with their grandparents reminds me that when I was a boy — in what my children teasingly call the nineteen hundreds — sharing photos involved two trips to the drug store to drop off film and pick up prints. These daily parts of our modern lives would have read like science fiction just twenty years ago.

Twenty years from now, will we look back at various ViFi stories as the road map for the way we live? Or, will the metaverse be the flying car, more dream than reality?