Exploring XR Tech That Powers A Wireless Metaverse With A Top Qualcomm Inventor

The “metaverse” is a rather worn-out term these days that, much like “AI” a few years back, is being tossed around casually because it carries cachet. Grand visions of humans sharing virtual experiences that are indistinguishable from reality are being cast about for everything from enterprise collaboration platforms to the new virtual social network that Meta (Facebook) has been selling hard recently. However, actually enabling or constructing a metaverse of any kind takes an extremely sophisticated hardware and software ecosystem, preferably one without wires that would otherwise keep humans tethered to a system or wall outlet. Wires undoubtedly break the immersion for most users, making the metaverse less real, less convincing or less useful, depending on the application.

In short, to get augmented reality and virtual reality (collectively, XR, or Extended Reality) right, wireless headsets and glasses are the better path to the metaverse. However, wireless XR systems present a very difficult set of problems to solve, requiring cutting-edge mobile technologies, from positional sensors to ultra-low-power visual processing engines, and of course extremely low-latency wireless communication between the content source(s) and the user. As such, it makes perfect sense that mobile processing and communications powerhouse Qualcomm has been working on solving these problems for over a decade.

A Fireside Chat With A Metaverse Inventor

My analyst firm partner Marco Chiappetta and I had a chance to sit down with Qualcomm Technologies’ Senior Director of XR Technology, Martin Renschler – one of the company’s earliest engineers and, you could argue, one of the founding fathers of the metaverse – to discuss the company’s role in developing wireless AR, VR and MR technologies, and where we’re headed in the future. The following are some of the highlights from our Q&A-style interview with Martin that present a fascinating perspective on how all this virtual and mixed-reality wireless magic comes together, and how the metaverse as we know it will continue to take shape.

When did Qualcomm decide that XR would be an area of focus and a market opportunity worth pursuing? Were there any early development platforms you can speak about, that Qualcomm participated in, that the public may find interesting?

[MR] Back in November 2011, the XR R&D team was a part of our “impossible tasks” group – set up by the Office of the Chief Scientist to push forward what might be possible with mobile technology. Our goal: to use Qualcomm’s mobile technology expertise to develop a Snapdragon-based AR device. By CES 2014, we had a running 720p, 80-fps prototype that already included eye tracking, voice control and ML-based hand gesture detection. In April 2015, BMW showed its Augmented Vision glasses that were based on that design. New in the BMW design was infrared head pose tracking in a moving car, as well as video pass-through of the car’s cameras over Wi-Fi. This design also led to the ODG glasses.

What major legacy and current AR/VR 3rd party platforms utilize Qualcomm XR technologies?

[MR] Over 50 XR devices have been launched that are based on Snapdragon processors. Some use the entire XR stack from Qualcomm. This is in addition to these devices using the bedrock mobile technology foundations driven by Qualcomm – 4G & 5G and Wi-Fi being the most obvious examples.

It seems that Qualcomm’s pervasiveness in smartphones, and managing the wide array of sensors available across devices, has some significant parallels with XR. How do the sensor and sensing requirements for today’s mobile devices differ from an XR platform?

[MR] Smartphones and XR devices do share a lot of similarities, especially in terms of some of the core technology requirements like multiple flavors of connectivity (4G/5G, Wi-Fi, Bluetooth etc.), complex arrays of sensors and perception, and very small power and size envelopes. XR as a technology field is highly dependent on very low latency and the utilization of many concurrent cameras running at high frame rates. We thought early on that we could leverage Snapdragon’s hardware for XR applications, but from day one, we had to design new software stacks for XR that created pipelines between the different hardware blocks without having to go through costly, complex API layers. In later chipsets, we moved more and more XR algorithms into dedicated hardware to further reduce latency and power. Today, we have an entire chipset line specifically dedicated to XR devices.

But the notion of committing significant design resources to bring a new XR-optimized chipset to market is not trivial, especially when you consider a fledgling market opportunity like the metaverse in its current state. So, what did Qualcomm actually have to re-engineer specifically in its Snapdragon mobile silicon arsenal to bring AR, VR and XR to life? “On the VR side, we added support for more concurrent tracking cameras, added new IP for hardened XR-specific algorithms such as 6DOF, added XR-specific features to the GPU (for example for foveated rendering), and to the DPU (hardened chromatic aberration correction for VR lenses). On the AR side, stay tuned on that front…” noted Renschler.
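Foveated rendering, which Renschler mentions as a GPU feature, exploits the fact that the eye only resolves fine detail near the gaze point, so the periphery can be shaded at a fraction of full resolution. Below is a minimal sketch of the idea, not Qualcomm’s implementation: the tile grid, radii and shading rates are illustrative assumptions.

```python
import math

def foveation_shading_rate(tile_center, gaze, inner_radius=0.15, outer_radius=0.45):
    """Return a shading-rate divisor for a screen tile: 1 = full resolution
    near the gaze point, 2 or 4 = coarser shading in the periphery.
    Coordinates are normalized to [0, 1] on both axes."""
    dist = math.hypot(tile_center[0] - gaze[0], tile_center[1] - gaze[1])
    if dist < inner_radius:
        return 1   # fovea: shade every pixel
    if dist < outer_radius:
        return 2   # mid-periphery: shade 1-in-2 pixels per axis
    return 4       # far periphery: shade 1-in-4 pixels per axis

# Fraction of shading work versus full-resolution rendering, over a
# coarse 10x10 grid of tiles with the gaze at screen center.
tiles = [((x + 0.5) / 10, (y + 0.5) / 10) for x in range(10) for y in range(10)]
full = len(tiles)
foveated = sum(1 / foveation_shading_rate(t, (0.5, 0.5)) ** 2 for t in tiles)
print(f"shading cost: {foveated / full:.0%} of full resolution")
```

With eye tracking already on the headset (as in Qualcomm’s 2014 prototype), the gaze point comes for free, which is why the two features pair so naturally in silicon.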

Renschler makes it all sound easy, but it’s clear Qualcomm not only had to invest in significant R&D, it also needed to place specific silicon bets on a market and technology target that was evolving at a rapid pace.

One of Qualcomm’s core competencies is obviously Wireless Connectivity. How do the Wireless Connectivity needs of standalone XR devices differ from those in a smartphone or ACPC?

[MR] Standalone XR devices aren’t that different from phones in their need for multiple different types of connectivity technologies all operating simultaneously within the device (a problem that Qualcomm is great at solving) – but XR in particular has a number of unique requirements necessary for the user experiences it offers. We had to develop specialized, low-latency protocols and codecs for XR devices that operate in a split-rendering mode, where frames are rendered somewhere else and only re-projected on the device to correct for the time elapsed since rendering. Most cellular and Wi-Fi protocols are optimized for throughput, but not for these low-latency, flawless frame-by-frame transmissions of renderings – so this was something we put extra effort into solving. And it’s solutions like these that often go unnoticed, but which provide the backbone of how XR actually works as a technology.
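The re-projection step Renschler describes can be pictured with a toy example: the remote renderer drew a frame for the head pose it knew about, but by the time the frame reaches the display the head has moved, so the headset shifts the image to compensate rather than wait a full round trip for a new frame. The 1-D, yaw-only model below is a deliberately simplified sketch; the field of view, resolution and shift math are illustrative assumptions, not Qualcomm’s codec.

```python
FOV_DEG = 90          # assumed horizontal field of view
WIDTH = 12            # pixels in our toy scanline

def reproject_row(row, render_yaw_deg, display_yaw_deg):
    """Shift a rendered scanline by the head-yaw change since rendering.
    Pixels that rotate out of view become None (shown as black)."""
    pixels_per_deg = WIDTH / FOV_DEG
    shift = round((display_yaw_deg - render_yaw_deg) * pixels_per_deg)
    out = [None] * WIDTH
    for i, px in enumerate(row):
        j = i - shift
        if 0 <= j < WIDTH:
            out[j] = px
    return out

row = list(range(WIDTH))   # stand-in for rendered pixel values
# Head turned 7.5 degrees between render time and display time.
print(reproject_row(row, render_yaw_deg=0.0, display_yaw_deg=7.5))
```

Because only this cheap correction runs on-device, the heavy rendering can live on a phone, PC or edge server, which is exactly why the frame-by-frame wireless link has to be so dependable.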

And with respect to the role that AI (Artificial Intelligence) and Machine Learning might play in all of this, Renschler notes that there are “several XR-specific features that are ML-based – such as head pose prediction, 3D reconstruction, object tracking, and hand tracking – and the Hexagon processor and dedicated AI blocks are a great fit for those tasks. The area provided for NPUs on Snapdragon is increasing significantly with every new generation, and AI innovation is a key goal for us as a team and a company.”
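To make head pose prediction concrete: the renderer needs to know where the head will be when the frame is finally displayed, not where it is now. The snippet below shows the classical constant-velocity baseline that the learned predictors Renschler mentions improve upon; the sample timings and angles are illustrative.

```python
def predict_yaw(samples, horizon_ms):
    """Extrapolate head yaw `horizon_ms` into the future from timestamped
    samples [(t_ms, yaw_deg), ...] using the latest angular velocity.
    Shipping XR stacks use learned predictors; this is the naive baseline."""
    (t0, y0), (t1, y1) = samples[-2], samples[-1]
    velocity = (y1 - y0) / (t1 - t0)       # degrees per millisecond
    return y1 + velocity * horizon_ms

# Head turning steadily at 0.25 deg/ms; predict 20 ms ahead.
history = [(0, 0.0), (10, 2.5)]
print(predict_yaw(history, horizon_ms=20))   # 7.5
```

A pure extrapolator like this overshoots whenever the head reverses direction, which is one reason ML-based predictors, fed by the same IMU and camera data, earn their place on the NPU.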

Although Qualcomm has a long history developing GPU and graphics technologies, and displays are attached to the vast majority of Qualcomm-powered devices, the display requirements for XR applications and use cases are very different. What has Qualcomm done in this area to advance XR display technology or other display related innovations?

[MR] Several features were added for XR: for example, driving color-sequential displays directly and doing chromatic aberration correction in hardware. These display features are designed to provide optimal efficiency and capabilities for XR experiences specifically. This goes back to the heart of what we do at Qualcomm: drive ideas, solve problems, and lay the technology foundations to create the best end-user experiences.

What sort of experiences and capabilities does 5G connectivity afford XR that weren’t previously achievable on LTE or legacy networks? Is Wi-Fi critical for the future of XR as well?

[MR] Generally, cellular networks are based on shared resources. XR, however, requires consistent, guaranteed high bit rates, since new, high-quality information is needed for every frame. What 5G added that helps XR is enhanced beamforming, which allows many users in the same cell to get high throughput without interfering with each other; the narrower, more focused data streams boost the signal-to-noise ratio significantly. Similarly, for Wi-Fi, best-effort transmission protocols over a shared channel don’t work well for XR if heavy background traffic is ongoing at the same time, so optimizations are required.
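The throughput payoff of a better signal-to-noise ratio can be sized with the Shannon capacity bound, which caps what any link can carry over a given bandwidth at a given SNR. The channel width and the before/after SNR figures below are illustrative assumptions, not Qualcomm measurements.

```python
import math

def shannon_capacity_mbps(bandwidth_mhz, snr_db):
    """Shannon upper bound on link throughput: C = B * log2(1 + SNR),
    returned in Mbit/s when bandwidth is given in MHz."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

# Assumed 100 MHz 5G channel: beamforming gain lifts SNR from 10 dB to 20 dB.
before = shannon_capacity_mbps(100, 10)
after = shannon_capacity_mbps(100, 20)
print(f"{before:.0f} -> {after:.0f} Mbit/s")
```

The logarithm is why the focused beams matter: raw power alone buys diminishing returns, but keeping every XR user's SNR high and steady is what sustains the per-frame bit rates Renschler describes.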

Development tools and software are key enablers for fostering new XR technologies. What are the areas of focus at Qualcomm that will draw in and spur developers to create new XR experiences?

[MR] We recently launched Snapdragon Spaces which is an open developer platform that allows for easy development of AR and VR apps. It also supports the conversion of existing games to become immersive when running on our AR glasses. Qualcomm’s XR stack is based on OpenXR and thus compatible with games and game engines that support OpenXR.

What would you say are the top two or three challenges with respect to advancing XR/AR/VR experiences moving forward? Is there a “holy grail” experience, capability or feature that Qualcomm or the industry is striving for?

[MR] XR today works very well for games that put you entirely in a virtual world (think about the classic VR headset that fully covers your eyes). Over the coming years, mixed- or augmented-reality experiences will emerge that place information and animations in your work or home environment. For this to work well, we need to completely understand the environment: basically, we need real-time depth for every pixel, we need to know what kind of object we are seeing, and we need to know where all the light sources in the room are. This will require distributed computing, as the power budget of wearable devices is limited. Wireless technologies that connect you to distributed compute nodes reliably at low latency are the key to this, and Qualcomm is well set up to accomplish this task.

Where We Go From Here, In Virtual Places And Experiences

We asked Martin what he felt Qualcomm’s role was with respect to the metaverse and the promise it brings for human collaboration and communication in the future. He noted, “Qualcomm is fundamentally an enabler – we do the incredibly complex background innovation to solve technology problems, and enable the best possible devices and end-user experiences. We’re a strong supporter of standards (our cellular standards side of the house actually predates our chipset side), as standards enable all kinds of different devices to utilize our platforms and solutions in an easy and compatible way.

The vision of the metaverse includes standardization, so for example, the same virtual item or a virtual outfit can be taken along into the various virtual worlds of different vendors, and can be interacted with in a standardized, pre-defined way. In our traditional role as fundamental technology enabler, we will develop hardware acceleration to provide efficient encoding, encryption, transport, decoding, decryption and rendering of metaverse assets and interactions and drive this standardization forward.”

Renschler brings up a great point here. For the metaverse to take off, we all need to be speaking basically the same language. Engineering the enabling technologies that deliver and render the virtual constructs of the metaverse is obviously critical, but so is allowing different systems and solutions to interact with each other across standardized communications and visual representations. Otherwise, our virtual experiences will be limited to a handful of closed, custom islands, rather than an open virtual world.

Regardless, it’s clear Qualcomm is staking a beachhead in wireless XR with the goal of a tether-free metaverse; one in which we can move about in the real world freely but explore new experiences in collaboration, entertainment, the modern workplace and virtually-connected social interactions.
