The metaverse: Where we are and where we’re headed


The coming metaverse has provoked hype, confusion, and misinformation.

For technophiles, the metaverse represents a nirvana: a place to immerse yourself in any digital surrounding and participate in any physical reality, at any time – and to see and feel anything, even if you are thousands of miles away from that physical place.

In a future state, electromyography (EMG) and neural interfaces – triggered by only slight finger movements – will allow you to control devices, communicate, and collaborate with others almost as simply as thinking. Your eyes will rely on glasses that use complex sensors to see both your own reality and virtual ones.

Some aspects of this apparent science fiction world are closer than we realize. Matthew Ball, a venture capitalist who has studied the metaverse closely, last year wrote a series of articles about where things are headed in the next decade. Ball breaks down the various technologies and protocols that need to come together to create the metaverse. 

Ball categorizes the metaverse into eight core features, which can be thought of as a stack.

Matthew Ball’s eight categories of the metaverse.
SOURCE: Matthew L. Ball

While Ball’s vision extends to the next decade, this article focuses on where the metaverse is headed in the next two or three years. It aims to review what most enterprise decision makers need to know, whether they’re in the gaming industry – where the metaverse has had its most immersive form so far – or in the enterprise, where things are only now starting to take shape. 

It’s clear that there will be a first big wave of innovation over the next 12 to 24 months, where “mixed reality” hardware produces breakthroughs for immersive experiences. 

A second big wave is then likely somewhere in the next three or more years, when fully immersive augmented reality (AR) glasses hit the market in a bigger way. This hardware is important, because it’s the gateway to the metaverse.

What we’ve also learned is that key aspects of the metaverse – like allowing your personal avatar to show up as a hologram in someone else’s physical reality – may be pie in the sky as a practical technology right now. But perhaps not within the next three years.

Facebook demos a life-like avatar image that shows pores. 
Source: Facebook

Experts agree there is a revolution afoot, pushed forward by the convergence of several technologies and social forces – the onset of 5G networks, the need for more intense virtual collaboration accelerated by COVID-19, the rise of edge computing (that allows for more ambient intelligence), and advances in AI, AR, and VR. Add in blockchain and NFTs (non-fungible tokens) and it’s clear the metaverse is the biggest technology revolution since the emergence of smartphones 15 years ago. 

The underpinnings of the metaverse have already taken the gaming industry by storm, because gaming is where virtual experiences have been the most immersive. In fact, there’s almost a separate conversation happening in gaming, where virtual interaction and things like NFTs and cryptocurrencies are spawning a creator and gamer economy that hasn’t yet impacted the enterprise. 

Bitter rivalries exist in gaming, as evidenced by the legal battle between Apple, which wants to charge 30% for access to its app store, and game maker Epic, which needs access to the iPhone because it’s such a compelling format for gaming but refuses to pay that tax. Many gamers dream of a connected network of always-on 3D virtual worlds where you can port your gaming profile anywhere. But that’s not going to happen anytime soon, given that virtual spaces are owned by different companies. Besides, a cross-gaming metaverse doesn’t encapsulate the metaverse’s full potential – the potential to transform just about every industry.

While forecasting the exact form of the coming metaverse is impossible, the seeds are being sown today. To see where the metaverse for enterprise applications is headed in 2022-2025, it’s best to look at the industry players that have been working on building the metaverse the longest. And there, it makes sense to start with the giants, who have the most resources. In this case, it’s companies like Google, Meta (aka, Facebook), Microsoft, and Apple, but also critical infrastructure companies like Nvidia and Qualcomm, and the fastest-growing virtual native companies like Epic, Unity, and Roblox. 

Google’s metaverse: Glasses

Google, a company known for moonshot ideas, was the first to unveil augmented reality glasses, in 2012. The glasses presented a small screen on the right side of your vision, designed to carry web or other information streamed via a Wi-Fi or Bluetooth connection to your phone. A host of complications caused Google’s initial effort to fail.

Glass Enterprise Edition 2.
Source: Google

But Google hasn’t given up. It has been working on an enterprise version. Three years ago, it released the second version of its enterprise glasses, Glass Enterprise Edition 2, designed mostly to let workers go hands-free. It has gotten good traction in a handful of industries heavy in logistics, manufacturing, or collaboration – where a worker can stream what they are seeing via their glasses to get advice from someone watching along. Health care (Sutter Health), transportation (DHL), and agriculture (AGCO) companies are among those using Google Glass.

In this vision, shared by many other leading tech companies, the glasses can funnel information from the web – including virtual images of your friends or colleagues, or anything else – so that you can enjoy an alternative reality that is layered over your physical reality. Most recently, Google made it possible to hold Google Meet on the glasses, and also partnered with Verizon’s Bluejeans to allow conference calls that way, too. 

Google is tight-lipped on its ultimate plans and vision for the metaverse, and declined to comment for this story.

But reports have emerged that Google is working on a new AR headset, codenamed Project Iris. The head-mounted display will begin shipping in 2024, according to two anonymous sources cited by The Verge. It will carry outward-facing cameras and blend augmented reality into a video feed of the user’s real world.

This follows other signs that Google is serious about the metaverse. Last year, it acquired North, an AR startup that had purchased assets behind a smart-glasses project that originated at Intel. Last May, it unveiled Project Starline, an impressive light field display technology that allows 3D, life-like holograms of people to appear in front of you – although the intricacy of its setup and its cost (in the tens of thousands of dollars) limits its ability to scale to popular use for now. And in November, Sundar Pichai led a reorganization that placed all projects related to the metaverse into Google Labs – led by Clay Bavor, reporting directly to Pichai.

AR glasses are no silver bullet for the metaverse – yet

Still, as promising as AR glasses may be for the enterprise metaverse, they face numerous complications. For example, getting lighting just right is tough, because the AR information or images need to stay in front of your pupil – but your pupil moves depending on what it’s looking at. Depth and focus are hard to get right, too. Also, the augmenting screen has to be squeezed in a narrow field of your vision. And your eyes can do only so much: you still have to control inputs, and doing that with your phone is clunky, and doing it with your hands is even clunkier.

Nikhil Balram, who oversaw AR hardware at Google, left at the end of 2019 to join augmented-reality glasses startup EyeWay Vision. He published a presentation that summarizes the challenges for AR, but also identifies the biggest obstacle on the way to what he calls the “holy grail” of fully realized AR glasses: power consumption.

AR glasses will burn power to run what is essentially a supercomputer, Balram says. People don’t want to walk around with a giant battery pack on their head or near their face. If any company can pull it off, it’s likely to be Apple, Balram says: leveraging its prowess in design, Apple could devise something like a snap-on, snap-off power source that wins consumer appeal.

The upshot is that a full AR experience is at least three years away.

Meta (formerly Facebook) pushes for maximalist vision

If Google appears careful and focused on use cases around glasses, Facebook represents the opposite. It’s investing heavily ($10 billion in 2021 alone), and publicly, in not just a metaverse around AR, but also VR, and across devices. Zuckerberg even renamed the company as part of this new focus. The company’s new north star, he said, is to “bring the metaverse to life.”

If Matthew Ball’s series represents a foundational text for the metaverse, Zuckerberg and Meta’s team bring it to life with what is essentially the movie version: an engaging one-hour, 17-minute video presentation articulating the technologies Meta sees as part of the metaverse. These happen to be the same experiences that the other giants – Google, Microsoft, and Apple, for example – all want to master as well.

Meta emphasizes that it wants to put you directly into other experiences, not just augment your existing reality. Zuckerberg described this idea of “embodiment” as a key principle of the metaverse. “Instead of just viewing content — you are in it,” he explains. “You feel present with other people as if you were in other places… So that can be 3D — it doesn’t have to be. You might be able to jump into an experience, like a 3D concert or something, from your phone, so you can get elements that are 2D or elements that are 3D.”

As a result, Facebook is busy creating all kinds of connective technology, which represents covering its bets across possibilities. It has the Quest headset for VR. It has Horizon, a social VR platform it launched in October that includes various forms: Horizon Home (for your default personal digital space), Horizon Workrooms (for your work environment), Horizon Venues (for events), and Horizon Worlds (which lets you hang out with up to 20 people at a time in a virtual space, and write code to build things like games in a Minecraft-like environment).

Meta is also developing a mixed reality (MR) headset, called Project Cambria, for release later this year. It is called mixed because it will allow sensors to pick up your surroundings to inject them into your VR experience. 

A prototype of Facebook’s full AR glasses.
Source: Facebook

And ultimately, it is developing a pair of AR glasses, named Project Nazare, that will include “hologram displays, projectors, batteries, radios, custom silicon chips, cameras, speakers, sensors to map the world around you, and more.” Zuckerberg said it will require “fitting a super computer into a pair of glasses.”

So while Meta started with VR, it has plans for an MR headset in the near term, with a vision for full-fledged augmented reality glasses after that. Facebook is thus joining Google and others in a race for the same goal: perfectly immersive AR glasses. While Google shies away from VR, Facebook’s Zuckerberg says VR is important because it “delivers the clearest form of presence.”

One day you’ll be able to use your nerves (EMG) and contextual AI to communicate.
Source: Facebook

While Meta is all in, it has not managed to score any significant breakthroughs of its own. Its Oculus VR play is a leader among many VR headset products. Its Horizon products aren’t dissimilar to what gaming companies like Epic, with Fortnite, have already done. That said, Facebook’s future as a player in the metaverse looks secure. It can use its massive scale as a social network to drive traffic to new metaverse products, even if it isn’t first. It was built on the premise of connecting people digitally. It is also a newer, younger company: it was fast to go native in mobile, and it has done the most to rally itself around what is coming next.

VR’s Achilles’ heel

Meta’s Quest leads the market among VR headsets, in front of Sony’s PlayStation VR and HTC’s headsets. But the VR market has taken off slower than the original hyped projections called for. It’s clear something is not quite right with the format. 

The advantage of VR is that it offers the most immersive experience, with a wider field of vision for the user and great visual fidelity. But it will never be able to own the metaverse by itself. That’s because with VR, you relinquish your own reality, and enter the one in your headset that surrounds your eyes, which creates two mental frameworks that can be disorienting.

You can’t see your real world. Good luck taking a sip of the coffee on the desk in front of you without spilling it. VR forces your brain to build and maintain two separate models of your world – one for your real surroundings, where you are perhaps sitting down or standing in place facing one direction, and one for the virtual world presented in your headset, which may involve movement or interaction that is not natural. Louis Rosenberg, CEO of Unanimous AI, writes:

“When I tell people this, they often push back, forgetting that regardless of what’s happening in their headset, their brain still maintains a model of their body sitting on their chair, facing a particular direction in a particular room, with their feet touching the floor (etc.). Because of this perceptual inconsistency, your brain is forced to maintain two mental models. There are ways to reduce the effect, but it’s only when you merge real and virtual worlds into a single consistent experience (i.e. foster a unified mental model) that this truly gets solved.”

There’s also the form factor: “Wearing a scuba mask is not pleasant for most people, making you feel cut off from your surroundings in a way that’s just not natural,” says Rosenberg. It turns out, visual fidelity isn’t the key factor that will drive adoption of the metaverse. Rather, technology that offers the most natural experience to our perceptual system will. And the most natural way to present digital content to the human perceptual system is by integrating it directly into our physical surroundings. Thus, AR, which can be layered on top of your existing reality.

So while VR that completely covers your vision may offer an amazing virtual experience and will work for games and certain other tasks, “it is not something for the general public,” says Nikhil Balram, the former Google executive.

Mixed reality: The bridge until fully fledged AR arrives

So AR glasses and VR headsets both have limitations. But what if they were to merge in some way? One protagonist that emerges here is Qualcomm, a company that doesn’t get much attention in conversations about the metaverse. It’s true that convergence across a whole host of technologies, standards, protocols, and payment systems is required before a full-fledged metaverse can be realized, as Matthew Ball has pointed out. And Qualcomm plays in just one major area here: chipsets for headsets. But it’s an increasingly important one.

Google, Microsoft, Motorola, Lenovo, Oppo, Xiaomi, and many more are working on AR glasses. And what’s notable is that they’re almost all using a similar chipset to base them on: Qualcomm’s XR platform, which includes a CPU, a 5G radio chip, and an AI engine for the glasses. Qualcomm has launched more than 50 devices with its partners.

Qualcomm has also been building a way to bridge AR devices with phones, which many believe will play an important role – at least in the near term – as an edge server for wearables like AR devices. Qualcomm’s platform is centered around Snapdragon Spaces, which lets developers use the phone’s processing power and cellular connection to drive the AR experience. Qualcomm recently acquired mapping company Augmented Pixels to help build out Snapdragon Spaces; terms were undisclosed.

Snapdragon Spaces gives developers access to spatial mapping and meshing.
Source: Qualcomm

The use of Qualcomm’s technology by many of the big players implies a continued convergence around basic AR infrastructure, but consolidation won’t lead to a single platform or architecture dominating the metaverse just yet. There’s plenty of pressure by the big tech companies to go it alone to try to grab their share of key parts of the value chain. For example, Google is developing its own chipset for glasses, even though it has partnered with Qualcomm until now.

Qualcomm’s neutral role has also allowed it to see how formats like glasses and headsets are converging. Hugo Swartz, senior VP of engineering, points to the convergence by big players on mixed reality. Like a hybrid strain of corn, it crosses two inferior models to produce a superior one: you start with a VR headset, but allow a camera and other sensors to pass through as much of the real world as possible into your field of vision. And it’s not only the tech giants who are building these. A European startup called Lynx is doing this with a Qualcomm chipset, though it is still early and has limitations (see a demo of this). Other types of MR hybrids exist, too, including the Magic Leap 1 and Snap, with its AR Spectacles.

What’s clear is that many companies see it as the best approach for the next two or three years. And so they are working on realizing the best they can from MR, with plans to move on toward fully-fledged AR once more complications with glasses are ironed out.

Microsoft HoloLens 2: The best immersive MR experience so far

With the HoloLens 2, Microsoft has built what is arguably the most advanced immersive MR experience. If Meta’s Zuckerberg has described the future most eloquently, Microsoft has done the most to implement the state of the art, as this demo shows. The HoloLens uses Microsoft Edge to let you do things like open virtual browser tabs in front of you, which you can reach out and touch and scroll with your fingers (see min 4:40 of the above video). And Microsoft’s Mesh platform allows you to bring in other augmented content – for example, avatars of other people you can collaborate with. The HoloLens 2 also “solved the comfort problem,” says EyeWay Vision’s Balram. ”Comfort on your head overrides everything else.”

Microsoft HoloLens allows you to pull up virtual browsers and screens.
(Source: Microsoft)

The powerful vision of the future – evoked in sci-fi movies like Iron Man or District 9, where characters conjure virtual computer screens or holograms in front of them, almost out of thin air – is becoming more tangible.

Pulling something out of the air like that may seem fantastical, but the underlying plumbing is close to what we have now: 5G, where the internet and other media ride on a powerful backbone – invisible to the eye – that carries gigabits of information per second. The challenge is fitting that information in front of the human eye comfortably, reliably, securely, and without latency, so that it is actionable at any angle.

The HoloLens 2 highlights the remaining technology challenges. Its field of view captures only a small portion of the area around you: 43 degrees horizontal by 29 degrees vertical. It also faces significant occlusion (depth perception) problems with hand control. Typing in the air with ease will take at least a couple more years to solve. That said, like Google Glass, the HoloLens is already used in specialized enterprise use cases.

Manufacturing, healthcare, and education companies have been early adopters of the HoloLens. Boeing has committed to designing its next-generation aircraft within the metaverse, using Microsoft HoloLens along with something called “digital twin” technology. Digital twin tech simply refers to creating a virtual twin of a physical original. You can then run simulations on the virtual version, which can be hugely cost-effective and easier for collaboration among remote workers. Microsoft imports this digital twin technology through its Mesh platform, but several other players offer digital twin simulation technology, too.
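The digital twin pattern is simple enough to sketch in a few lines of code. The model below is purely illustrative – the class, names, and toy physics are all hypothetical, not any vendor’s API – but it shows the core idea: keep a virtual copy in sync with sensor data, then run what-if simulations on the copy instead of on the physical asset.

```python
# Illustrative digital twin sketch: a virtual model mirrors a physical
# machine's state, and simulations run on the copy, not the real thing.
# All names and the "physics" here are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class MachineState:
    temperature_c: float   # last reading from the physical machine
    load_pct: float        # utilization, 0-100

class DigitalTwin:
    """Virtual counterpart of one physical machine."""

    def __init__(self, state: MachineState):
        self.state = state

    def sync(self, sensor_reading: MachineState) -> None:
        # In a real deployment this would be fed by live telemetry.
        self.state = sensor_reading

    def simulate_load_increase(self, extra_load_pct: float, steps: int) -> float:
        # Toy physics: each step at higher load adds heat proportional to
        # the load, while the machine cools toward a 20 C ambient. Returns
        # the projected temperature without touching the physical machine.
        temp = self.state.temperature_c
        load = min(100.0, self.state.load_pct + extra_load_pct)
        for _ in range(steps):
            temp += 0.05 * load            # heating from work
            temp -= 0.02 * (temp - 20.0)   # cooling toward ambient
        return temp

twin = DigitalTwin(MachineState(temperature_c=45.0, load_pct=60.0))
projected = twin.simulate_load_increase(extra_load_pct=30.0, steps=10)
print(f"projected temperature: {projected:.1f} C")
```

The payoff is the same one Boeing is chasing at much larger scale: you can ask “what happens if we push this machine 30% harder?” as many times as you like, at the cost of a computation rather than a damaged aircraft part.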

When Microsoft CEO Satya Nadella announced Mesh in March last year, he gushed about the technology’s capability in a way that likely sounded like hyped gobbledygook to an uninformed viewer: “Mesh allows you to interact with others holographically with true presence in a natural way,” he said. But the HoloLens 2 shows just why Nadella is right to be excited. Hologram avatars that look very similar to you – down to seeing your skin pores – are not that far away.  

Apple: The metaverse antagonist

Apple is the world’s largest company, valued at $3 trillion and built on the most robust phone and app infrastructure in the world. In theory, it has the most to gain as the metaverse takes off. But it also has the most to lose if its business model is disrupted.

It’s working on an MR headset. However, Apple downplays the idea of a metaverse: “Here’s one word I’d be shocked to hear on stage when Apple announces its headset: metaverse,” said Mark Gurman in his Jan. 9 newsletter. The Bloomberg reporter is considered well-sourced at Apple. “I’ve been told pretty directly that the idea of a completely virtual world where users can escape to — like they can in Meta Platforms/Facebook’s vision of the future — is off limits from Apple.”

Still, don’t let that fool you. CEO Tim Cook has called AR “critically important” and one of “very few profound technologies.” Apple is working as feverishly as any company to build for the future. And it’s following a similar strategy to others: an MR headset for the short term, expected for release next year – but only as a precursor to full-fledged AR glasses when that technology is ready, which, again, is not expected for several years.

Source: The Information

Apple watchers say the coming MR headset will include “some of its most advanced and powerful chips,” building on Apple’s existing M1 chip. Like other companies’ headsets, it will require cameras to capture the outside world and feed it back to you. According to various reports, there will be up to 15 cameras and lidar sensors mounted on the device. Apple has already incorporated some of these technologies into devices like the iPhone 12 Pro for augmented reality processing. Some analysts say the company might preview the headset as early as this year, with full delivery most likely in 2023.

Apple’s strategy requires it to offer technology for developers to leverage, too, to keep its app store business thriving. Indeed, that app store has become a major source of friction, since Apple continues to charge a 30% tax on revenues made from apps – much higher than developers want.

This, and other efforts by Apple to protect its turf, is why it is perceived as a resistor to an open metaverse. It has the power to slow things down considerably. The initial web was built on open standards, but Apple’s growing market share has made it, according to some, a “de facto regulator for the internet.” The iPhone does not allow fully alternative browsers. And Apple’s own browser, Safari, doesn’t support much of WebGL, a JavaScript API used elsewhere on the open web that allows browser-based 2D and 3D rendering without plug-ins. Apple’s numerous rules, control over APIs, and policy changes have strongly impacted promising games companies that have been developing elements of the metaverse – for example, Roblox and Epic (which owns Fortnite).

The one metaverse versus many metaverses debate

This constant jockeying among the bigger players – and emerging disrupters – means the metaverse will likely not, as some skeptics worry, be owned by a single player. There will remain serious fragmentation. As such, there won’t be harmonious, seamless movement of your profile and all its appendages – avatar, account information, status, presence, and so on – between games and other virtual worlds anytime soon. Power dynamics aside, the technical challenges of doing so are significant, too.

We can look back at the previous computing revolutions – the PC, web, mobile – for the clues on how the metaverse will unfold. There will be a first period of massive investment, as excitement and hype around the metaverse peaks. This is the phase we are in now. Here, existing large companies stake out their bets, but private capital also pours into startups taking advantage of it. Then, in a second phase, there will be consolidation, as end-users seek to make sense of the fragmentation and bet on perceived winners. And in the end, an oligopoly of leaders is likely. In today’s paradigm – with mobile the most recent revolution – there are at least five major winners: Google, Apple, Facebook, Amazon and Microsoft (GAFAM).

The virtual natives are coming: Epic, Unity and Roblox

All of those, except for Amazon, are in the race for hardware hegemony for the metaverse. Hardware is important because it’s a gateway to the metaverse, but it can only take you so far.

Game engine companies Epic and Unity are coming on strong as contenders in the metaverse because they are helping thousands of developers build games that are fundamentally virtual from the get-go. They also make games themselves, with Epic’s Fortnite – the most-played game in 2021 – being an example of a breakthrough success. 

A virtual concert held by singer Travis Scott within Fortnite in 2020 marked for many an inflection point heralding the emerging dominance of the virtual medium. Writes Ball:

“Nearly 30 million people spent nine minutes fully immersed in his music. This included die-hard and casual fans, non-fans and people who didn’t even know he existed. There is no other experience on earth — including the Super Bowl half-time show — that can deliver this degree of reach and attention.”

Travis Scott performs within Fortnite.
Source: Travis Scott

This makes Fortnite’s owner, Epic, a big draw for other companies. Disney, the NFL, the NBA, Netflix, Ferrari, and fashion company Balenciaga have all partnered with the company, offering digital goods like avatars and clothing that play across its games.

The history of previous computing revolutions shows that some companies from the previous era hang on and thrive, but also that altogether new companies emerge, most of them unpredictable. And no one is in a better position to thrive in a virtual metaverse than companies that already offer virtual worlds.

Another fast-growing gaming company, Roblox, is a big contender in the metaverse era, says Ball. Roblox has thrived because it lets users build and generate the content – games and other experiences – and connects players together in virtual online spaces. There, players can create virtual items – things like clothing for characters – that can be sold for virtual currency, which players buy with real currency. It is the most valuable video game company in the U.S. And where people spend time and money, brands show up too: Gucci created a virtual garden, and Ralph Lauren a virtual ski store.

Like Epic and Unity games, Roblox supports its product across devices, and its success threatens the dominance of the iPhone ecosystem in the coming metaverse paradigm. “Roblox is disintermediating iOS from the majority of virtual world time, and from virtual world developers,” says Ball. “Developers are using Roblox’s engine, and users are discovering content directly through Roblox.”  

It’s no surprise, then, that Microsoft, which already has a leading game asset in the Xbox console, is being more aggressive in games of late. It just acquired Activision, the world’s largest gaming network, for $68.7 billion – the biggest deal in gaming history. With the HoloLens, Microsoft was already arguably the best-poised company of any for the enterprise metaverse, and this deal will make it stronger overall, giving it access to strong mobile games, where it had almost none. Microsoft also owns Minecraft, a top-ten game with 141 million users that is cross-platform with user-generated content. And its Azure cloud services offering works irrespective of device and is second in scale only to Amazon’s.

Comments by Activision CEO Bobby Kotick about the logic behind the sale to Microsoft are eye-opening. Kotick said Activision, even as the biggest gaming network, couldn’t compete in attracting the graphics engineering and AI/ML talent required to keep up. That even Activision had to join a tech giant to compete in the metaverse points to significant consolidation down the road elsewhere.

It’s worth mentioning that there’s a whole different story to be written about how payments will develop in the metaverse. Blockchain technologies for virtual currencies in games and elsewhere offer huge advantages, but they also still face plenty of challenges  – an area that has been well covered elsewhere. Writes Ball:

“You can’t access the Metaverse except through hardware, and every hardware player is fighting to be the (or at least a) payment gateway to the Metaverse. This is why Facebook, which lacks a major operating system, is investing so heavily in Oculus. And why Snap is developing its own AR hardware, while defending Apple’s 30% take.”

Similarly, the emergence of NFTs – giving digital creators ownership over their work and the ability to buy and sell digital goods – is a trend that will play out over the coming years. Almost every week, it seems, a new brand is jumping into NFTs, from Adidas to Coca-Cola. Tim O’Reilly has done some work trying to dissect how much blockchain and crypto are part of a coming revolution. It shows that big questions remain unanswered.

The metaverse’s enterprise killer apps: collaboration and simulation

So what does all this mean for enterprise decision-makers, who are not with gaming companies or one of the GAFAM? Will the metaverse be like the cloud, which was talked about beginning in the 1990s, but took years to really come about? 

Not really. The fact is, use cases for the metaverse are already beginning to pop up, as evidenced by early enterprise users of the HoloLens, Google Glass, and Magic Leap, and by the number of companies now launching digital twin simulations with game engines like Unity’s and Epic’s, and elsewhere. Most of these use cases center on the need for more natural ways for remote workers to collaborate, or for simulating products and environments. Collaboration and simulation will soon impact a variety of functions, from product management to engineering, as we’ve reported here.

The game engine companies Unity and Epic are fascinating players in simulation. Industrial, film, automotive, and other enterprise companies are using them. Hummer’s dashboard UI is now based on [Epic’s] Unreal Engine and can simulate the vehicle live. And Hong Kong International Airport picked Unity to build a digital twin for simulation: Unity was able to render the not-yet-real environment and stress-test it for fire, flood, power outages, backed-up runways, and the flow of people.

Another notable player here is Nvidia, a company that has rocketed in value over the past few years building graphics processing units (GPUs), which sparked a revolution in the performance of gaming apps and deep learning apps for AI. Nvidia’s hard-charging CEO Jensen Huang has been talking about the metaverse for years. Last year, the company launched Omniverse, an open-standards platform for creators, designers, and engineers to collaboratively build virtual worlds, and to connect them. What’s also notable about Nvidia is that its offering embraces third-party standards, which fits the metaverse’s need to avoid lock-in. Omniverse is based on Pixar’s widely adopted Universal Scene Description (USD), the leading format for universal interchange between 3D applications. Basically, Omniverse connects 3D software tools that typically don’t talk well to each other.
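It helps to see what USD interchange actually looks like. The sketch below writes out the human-readable .usda form of a tiny scene using only Python’s standard library – real pipelines would use Pixar’s pxr API instead, and the file name and scene contents here are hypothetical – but the `#usda 1.0` header and prim syntax follow USD’s published text format, which any USD-aware tool can read.

```python
# Minimal sketch of a USD scene in its human-readable .usda form.
# Written with the standard library for illustration; production code
# would use Pixar's pxr (Usd/UsdGeom) Python API.

from pathlib import Path

scene = """\
#usda 1.0
(
    doc = "Toy interchange scene: a sphere under a transform"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 2.0
    }
}
"""

path = Path("toy_scene.usda")
path.write_text(scene)
print(path.read_text().splitlines()[0])  # prints the header: #usda 1.0
```

Because the format is an open, documented standard rather than one vendor’s binary blob, a scene authored in one tool can be layered, referenced, and edited in another – which is exactly the interoperability bet Omniverse is making.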

An early notable user of Omniverse is BMW, which is using it to build state-of-the-art factories. The platform also allows customers to access real-time photorealistic rendering, physics, materials, and interactive workflows between industry-leading 3D software products. 


Notably, Nvidia also announced in November plans to build a digital twin of the entire earth, called Earth 2, revealing that simulation of our reality knows no bounds. This simulation is not just for fun. It will be used to simulate and predict climate change, and model other improvements in the real world. “Omniverse is different from a gaming engine,” said Huang during the announcement. “Omniverse is built to be data center scale and hopefully, eventually, planetary scale,” he said.

Nvidia is building a simulation of the earth.
Source: Nvidia

This isn’t a modest conclusion. The metaverse will piggyback on a fully simulated earth. It will also take you to space (virtually). You’ll be able to talk with life-like avatars of friends and workers as if they’re in front of you, even though they’re thousands of miles away. And you’ll be doing it through fully immersive AR glasses, where you can mix your reality and other realities as you choose.

The metaverse will impact every industry. Just as in previous revolutions, the most excitement around the metaverse will start on the consumer side. Enterprise companies will have some more time to prepare themselves for what’s coming, but it will start with the areas closest to consumers, for example in communications and collaboration. But those companies that move quickly will also be the ones who are best positioned to leverage that future. Read on in our series to learn more about what it means for you.
