Why AR, not VR, will be the heart of the metaverse

This article was contributed by Louis Rosenberg, CEO and chief scientist at Unanimous AI.

My first experience in a virtual world was in 1991 as a PhD student working in a virtual reality lab at NASA. I was using a variety of early VR systems to model interocular distance (i.e. the distance between your eyes) and optimize depth perception in software. Despite being a true believer in the potential of virtual reality, I found the experience somewhat miserable. Not because of the low fidelity, as I knew that would steadily improve, but because it felt confining and claustrophobic to have a scuba mask strapped to my face for any extended period.
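
(A purely illustrative aside: the textbook relationship behind that kind of calibration work is simple enough to sketch in a few lines of Python. The eye separation and focal length below are made-up values, not anything from the NASA software; the point is only that perceived depth scales with eye separation and inversely with on-screen disparity.)

# Simplified pinhole-stereo model: illustrative only, not the NASA code.
# Depth scales with eye separation (IPD) and inversely with image disparity.
def depth_from_disparity(disparity_px, ipd_m=0.063, focal_px=1400.0):
    """Approximate depth (meters) of a point with the given left/right disparity in pixels."""
    return (ipd_m * focal_px) / disparity_px
# Small changes in disparity correspond to large changes in distance far away --
# one reason rendering software wants the viewer's true IPD to present depth convincingly.
for disparity in (100.0, 20.0, 5.0):
    print(f"disparity {disparity:5.1f} px -> depth {depth_from_disparity(disparity):6.2f} m")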

Even when I used early 3D glasses (i.e. shutter glasses for viewing 3D on flat monitors), the sense of confinement didn’t go away. I still had to keep my gaze forward, as if wearing blinders to the real world. There was nothing I wanted more than to take the blinders off and allow the power of virtual reality to be splattered across my real physical surroundings.

This sent me down a path to develop the Virtual Fixtures system for the U.S. Air Force, a platform that enabled users to manually interact with virtual objects that were accurately integrated into their perception of a real environment. This was before phrases like “augmented reality” or “mixed reality” had been coined. But even in those early days, watching users enthusiastically experience the prototype system, I was convinced the future of computing would be a seamless merger of real and virtual content displayed all around us.

Video: Augmented Reality Research, 1992 (USAF / L. Rosenberg)

Cut to 30 years later, and the phrase “metaverse” has suddenly become all the rage. At the same time, the hardware for virtual reality is significantly cheaper, smaller, and lighter, and offers much higher fidelity. And yet, the same problems I experienced three decades ago still exist. Like it or not, wearing a scuba mask is not pleasant for most people, making you feel cut off from your surroundings in a way that’s just not natural.

This is why the metaverse, when broadly adopted, will be an augmented reality environment accessed using see-through lenses. This will hold true even though full virtual reality hardware will offer significantly higher fidelity. The fact is, visual fidelity is not the factor that will govern broad adoption. Instead, adoption will be driven by which technology offers the most natural experience to our perceptual system. And the most natural way to present digital content to the human perceptual system is by integrating it directly into our physical surroundings.

Of course, a minimum level of fidelity is required, but what’s far more important is perceptual consistency. By this, I mean that all sensory signals (i.e. sight, sound, touch, and motion) feed a single mental model of the world within your brain. With augmented reality, this can be achieved with relatively low visual fidelity, as long as virtual elements are spatially and temporally registered to your surroundings in a convincing way. And because our sense of distance (i.e. depth perception) is relatively coarse, it’s not hard for this to be convincing.
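
To make “spatially registered” concrete, here is a small, hypothetical Python sketch (the pose values and function name are mine, purely for illustration): a virtual marker keeps a fixed position in world coordinates and is simply re-expressed in the headset’s camera frame each time the tracked headset pose changes. It is the frame-to-frame consistency of that transform, far more than raw pixel fidelity, that makes the marker feel like part of the room.

import numpy as np
# Hypothetical sketch: a virtual object stays "registered" to the real world by
# holding a fixed world-space position and being re-expressed in the headset's
# camera frame whenever the tracked headset pose changes.
def world_to_camera(point_world, headset_pose_world):
    """Transform a world-space point into the headset's camera frame.
    headset_pose_world is a 4x4 matrix: the headset's pose in world coordinates."""
    world_to_cam = np.linalg.inv(headset_pose_world)
    p_h = np.append(point_world, 1.0)          # homogeneous coordinates
    return (world_to_cam @ p_h)[:3]
# A virtual marker anchored 2 m in front of the world origin...
anchor = np.array([0.0, 0.0, -2.0])
# ...and a headset that has slid 0.5 m to its right since the anchor was placed.
headset_pose = np.eye(4)
headset_pose[0, 3] = 0.5
print(world_to_camera(anchor, headset_pose))   # the anchor shifts accordingly in camera space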

But for virtual reality, providing a unified sensory model of the world is much harder. This might sound surprising because it’s far easier for VR hardware to provide high-fidelity visuals without lag or distortion. But unless you’re using elaborate and impractical hardware, your body will be sitting or standing still while most virtual experiences involve motion. This inconsistency forces your brain to build and maintain two separate models of your world — one for your real surroundings and one for the virtual world that is presented in your headset.

When I tell people this, they often push back, forgetting that regardless of what’s happening in their headset, their brain still maintains a model of their body sitting in their chair, facing a particular direction in a particular room, with their feet touching the floor, and so on. Because of this perceptual inconsistency, your brain is forced to maintain two mental models. There are ways to reduce the effect, but it’s only when you merge real and virtual worlds into a single consistent experience (i.e. foster a unified mental model) that this truly gets solved.

This is why augmented reality will inherit the earth. It will not only overshadow virtual reality as our primary gateway to the metaverse but will also replace the current ecosystem of phones and desktops as our primary interface to digital content. After all, walking down the street with your neck bent, staring at a phone in your hand, is not the most natural way to present content to the human perceptual system. Augmented reality is, which is why I firmly believe that within 10 years, AR hardware and software will become dominant, overshadowing phones and desktops in our lives.

This will unleash amazing opportunities for artists and designers, entertainers, and educators, as they are suddenly able to embellish our world in ways that defy constraint (see Metaverse 2030 for examples). Augmented reality will also give us superpowers, enabling each of us to alter our world with the flick of a finger or the blink of an eye. And it will feel deeply real, as long as designers focus on consistent perceptual signals feeding our brains and worry less about absolute fidelity. This principle was such an important revelation to me as I worked on AR and VR back in the early ’90s that I gave it a name: perceptual design.

As for what the future holds, the vision currently portrayed by large platform providers of a metaverse filled with cartoonish avatars is misleading. Yes, virtual worlds for socializing will become increasingly popular, but they will not be the means through which immersive media transforms society. The true metaverse — the one that becomes the central platform of our lives — will be an augmented world. And by 2030 it will be everywhere.

Louis Rosenberg is CEO & Chief Scientist at Unanimous AI.
