Five Ways Qualcomm Enables the OPEN Spatial Computing Industry

Throwback to 2016: After the traction we saw with standalone VR on the first Snapdragon VR reference design, built around the Snapdragon 820, I took the lead of a separate business unit dedicated to XR, with the ambition not only of creating the processors that power headsets, but of propelling the industry forward with technology and ecosystem initiatives that enable the Open Metaverse. To do so, we had to think holistically about the market and its players: how do we remove barriers and foster solid, healthy, interoperable growth of the market?

Looking at the mobile industry to create our vision for the Open Metaverse

As the saying often attributed to Einstein goes, “if you want to know the future, look at the past”. I looked back at my first job in the early 2000s, when I was bringing wireless data to cell phones to enable the mobile internet. I realized that the early mobile internet then and the early spatial internet today share similar challenges: you need great displays, high-performance low-power compute, blazing-fast connectivity, and a large quantity of quality content, all accessible from a light, practical device.

How did the mobile industry succeed in bringing accessible devices and amazing content to users all around the world? Consumer electronics brands, OEMs, component manufacturers, and content developers came together and leveraged each other’s expertise: advancing wireless connectivity from 3G to 4G, improving displays, increasing compute performance without compromising power consumption, and then bringing amazing, accessible content, games, and AI-powered experiences to billions of people all over the world through the mobile internet ecosystem.

Participating in the mobile revolution and witnessing how it changed the way we live, work, socialize, and have fun has shaped my vision for the XR industry. I strongly believe that the Metaverse will realize its full potential through collaboration and interoperability, and Qualcomm has a major role to play. Here is how.

1. Snapdragon XR Platforms

To deliver immersive experiences, you first have to solve high-performance processing at extremely low power in silicon. This is where all the computing and connectivity happen, and where we leverage our mobile expertise in displays, graphics processing and rendering, and on-device AI to build XR-dedicated, optimized processors.

Snapdragon XR Platforms power 80+ devices across the globe

Everything we do, at the core of our BU, aims to remove barriers and accelerate the adoption of spatial computing. And from the beginning of this amazing journey, we got incredible traction. We started from a smartphone processor in 2016, and now offer six dedicated XR platforms that power more than 80 XR headsets across the world. The Meta Quest line is the most popular, but we also power devices from Lenovo, HTC, Pico, Oppo, and more, many of them start-ups. Qualcomm’s processors are made to enable OEMs, from start-ups to the biggest global brands, to go to market.

We recently launched two new XR platforms, the Snapdragon XR2 Gen 2 and the Snapdragon AR1 Gen 1, making their commercial debut in the Meta Quest 3, a mainstream MR/VR device, and in the next-generation Ray-Ban Meta smart glasses collection.

The Meta Quest 3 is the most powerful Quest device yet, and one of the first mainstream MR devices on the market. The Snapdragon XR2 Gen 2, the platform that powers it, is amazingly powerful – it features 8 times the AI performance and more than twice the GPU performance of the previous platform. It’s built to power headsets that are lighter, more comfortable, and don’t require an extra battery pack. It’s also built to make these complex technologies accessible to all.

In augmented reality, we are seeing two categories emerge: sleek, comfortable all-day smart glasses, and immersive AR glasses. Smart glasses feature cameras, photo and video capture, and on-glass AI for real-time translation, object detection, and real-time visual search – advanced technology packed into glasses that look like regular glasses. They can be useful for people with hearing impairments, for example, or simply for capturing memories without pulling your phone out of your pocket. We are excited that the Snapdragon AR1 will enable the new Ray-Ban Meta smart glasses collection.

2. Snapdragon Perception

On top of this hardware layer comes another layer: technologies for spatial computing, also called perception technologies. These are the technologies that understand the user’s inputs – head, hand, and eye tracking – and the user’s environment: locating the coffee table you don’t want to bump into, identifying the floor, and recognizing which objects are in your field of view.

Snapdragon Perception is a suite of AI-powered perception technologies that make immersion and interactions possible in MR, VR and AR.

Qualcomm develops perception technologies in-house so that customers don’t have to re-invent the wheel when they want to accelerate their commercialization. But that is not the only reason we provide these technologies to our customers. If not optimized for the hardware, each of these algorithms can consume 10x the power and force devices to be larger than they should be. In addition, it is extremely complex to run all the algorithms concurrently. By building the technology ourselves, we can both optimize existing silicon and design custom silicon blocks for these perception algorithms. By ‘hardening’ them into the silicon, we significantly reduce the computational load and power consumption.

For example, in our latest Snapdragon XR2 Gen 2 platform, the Engine for Visual Analytics block features custom IP designed to significantly accelerate and improve 6DoF tracking while using less power. Since Qualcomm designs the processors, we are uniquely positioned to deliver this hardware-software optimization to our customers.
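
To make this more concrete, here is a minimal, illustrative sketch of what one of these perception features looks like from an application’s point of view when it is exposed through a standard, cross-vendor OpenXR extension. This is generic OpenXR C code for the hand-tracking extension (XR_EXT_hand_tracking), not Snapdragon-specific API; the instance, session, space, and frame time are assumed to come from an already-running OpenXR application that enabled the extension.

```c
#include <openxr/openxr.h>
#include <stdio.h>

/* Illustrative sketch: reading hand-tracking data through the cross-vendor
 * OpenXR extension XR_EXT_hand_tracking. Error handling is omitted for
 * brevity; a real application should check every XrResult. */
void read_left_index_tip(XrInstance instance, XrSession session,
                         XrSpace baseSpace, XrTime frameTime)
{
    /* Extension entry points are resolved at runtime. */
    PFN_xrCreateHandTrackerEXT createHandTracker = NULL;
    PFN_xrLocateHandJointsEXT  locateHandJoints  = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction *)&createHandTracker);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction *)&locateHandJoints);

    /* Create a tracker for the left hand with the default 26-joint set. */
    XrHandTrackerCreateInfoEXT createInfo = { XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT };
    createInfo.hand = XR_HAND_LEFT_EXT;
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    XrHandTrackerEXT handTracker = XR_NULL_HANDLE;
    createHandTracker(session, &createInfo, &handTracker);

    /* Query all joint poses for the current frame. */
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = { XR_TYPE_HAND_JOINT_LOCATIONS_EXT };
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locateInfo = { XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT };
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = frameTime;
    locateHandJoints(handTracker, &locateInfo, &locations);

    if (locations.isActive) {
        XrPosef tip = joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose;
        printf("Left index fingertip at (%.2f, %.2f, %.2f)\n",
               tip.position.x, tip.position.y, tip.position.z);
    }
}
```

From the application’s point of view, whether the tracking behind this call runs on a hardened silicon block or in software is invisible – which is the point of pushing these features down into the platform.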

3. Removing barriers for OEMs with reference designs

Reference designs allow OEMs and smaller XR ecosystem players to ease their entry into the XR market and accelerate their product development. They also enable Qualcomm to scale and make our products available to more companies. Qualcomm partners with expert original design manufacturers (ODMs) who have the manufacturing capabilities and deep expertise in all the pieces of hardware needed to build an MR/VR headset or AR glasses: displays, cameras, batteries, and more.

Reference designs are a blueprint for evaluation, software development, and testing, in a form factor that is accurate to a commercial product. They provide a great starting point for development while remaining versatile and flexible for manufacturers to build their own product variations. Since they don’t start from scratch, manufacturers and developers accelerate their product design and shorten their commercialization time. This also allows manufacturers without much expertise in the XR ecosystem to enter the market.   

Last year, we announced the Snapdragon AR2 Reference Platform for immersive AR glasses that use a distributed processing architecture; twelve brands are in various stages of development on AR2.

More recently, we partnered with leaders in compact light engines, waveguide optics, and smart glasses manufacturing to develop a new smart glasses reference platform for manufacturers seeking to commercialize smart glasses powered by our latest platform, the Snapdragon AR1 Gen 1; multiple brands are already in various stages of development.

4. Removing barriers for developers with the Snapdragon Spaces developer platform and its newest feature, Dual Render Fusion

Now, we all know that to really unlock the growth of a new computing paradigm, it is not only about the hardware and the device, but about the content and applications as well. We developed our own SDK, Snapdragon Spaces, so developers can start working with Qualcomm hardware even before commercial products are available on the market. Spaces enables developers to create cross-platform, cross-device experiences using powerful features, including interaction technology for AR and VR/MR experiences. The Spaces SDK is based on OpenXR and plugs in seamlessly to Unity AR Foundation and Epic’s Unreal Engine 4 for content development and 3D app portability. Developers and OEMs can use Spaces on compatible devices, including the Lenovo VRX, the Oppo Wheatstone MR developer headset, and the Lenovo ThinkReality A3 with Motorola edge+ smartphone kit, to experiment and build customized experiences faster than before.
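
To illustrate what “based on OpenXR” means in practice, here is a minimal, hypothetical sketch of how an application built on the standard API starts up and requests the capabilities it needs. This is plain OpenXR C code, not Snapdragon Spaces-specific; the application and engine names are placeholders, and whether a given extension is available depends on the runtime and device.

```c
#include <openxr/openxr.h>
#include <stdio.h>
#include <string.h>

/* Illustrative sketch: an OpenXR-based app targets the standard API and asks
 * the runtime for the features it needs, instead of coding against a
 * vendor-specific SDK. Extension availability is runtime/device dependent. */
int main(void)
{
    static const char *const wanted[] = {
        "XR_EXT_hand_tracking",        /* hand tracking  */
        "XR_EXT_eye_gaze_interaction"  /* eye-gaze input */
    };

    XrApplicationInfo appInfo = {0};
    strncpy(appInfo.applicationName, "SpatialDemo", XR_MAX_APPLICATION_NAME_SIZE - 1);
    strncpy(appInfo.engineName, "CustomEngine", XR_MAX_ENGINE_NAME_SIZE - 1);
    appInfo.applicationVersion = 1;
    appInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstanceCreateInfo createInfo = { XR_TYPE_INSTANCE_CREATE_INFO };
    createInfo.applicationInfo = appInfo;
    createInfo.enabledExtensionCount = 2;
    createInfo.enabledExtensionNames = wanted;

    XrInstance instance = XR_NULL_HANDLE;
    XrResult res = xrCreateInstance(&createInfo, &instance);
    if (XR_FAILED(res)) {
        fprintf(stderr, "No OpenXR runtime, or requested extensions unavailable (%d)\n", (int)res);
        return 1;
    }
    printf("OpenXR instance created; the same code targets any conformant runtime.\n");
    xrDestroyInstance(instance);
    return 0;
}
```

In a Unity or Unreal project the engine’s OpenXR plug-in performs this handshake for you; the value of building the SDK on OpenXR is that the same application logic can move between conformant runtimes and devices.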

Spaces developer community

We also collaborate with operators including CMCC, Deutsche Telekom, KDDI Corporation, NTT QONOQ, T-Mobile, Telefonica, and Vodafone so they can leverage the cross-device, open-ecosystem Snapdragon Spaces™ XR Developer Platform and help define Snapdragon Spaces device requirements and compatibility. This is significant for the industry: it gives customers more options to wirelessly tether smartphones and glasses, lets operators distribute these technologies across their networks, and enables innovative regional developer programs that accelerate augmented reality (AR) experiences.

I mentioned above the complex challenges we address through our first pillar, processors, to remove barriers and enable our hardware customers (OEMs) to deliver spatial computing devices. For developers, building a full 3D spatial UX is still complex; many of them do not have experience developing full AR apps or don’t yet see the incentive to invest that much. So we thought about how we could enable them to add a touch of AR to their applications, and that is how we created Dual Render Fusion. It is the newest feature of Snapdragon Spaces, our SDK plug-in for Unreal and Unity, and it makes it possible for mobile developers to extend 2D mobile apps into world-scale 3D spatial experiences with no prior XR experience required. This feature is useful for many use cases, like adding a 3D map to a navigation UI, enhancing a cooking tutorial, as we did with KITTCH, or discovering Virtual Places. This helps lower the barrier for mobile developers creating content for productivity, gaming, culture, and more, so they can start experimenting and accelerate the creation of great content.

5. Accelerating scalability through standardization and interoperability

Similar to the internet and the mobile internet, the Metaverse, or Spatial Internet, has to be well supported across different types of devices to become widely used in our daily lives.

We bring our experience in standards and our expertise in technology innovation and devices to the Metaverse Standards Forum as a board member and contributor, collaborating on open solutions to the challenges of enabling rich and complex experiences on constrained wireless devices and future wearables. We contribute to different working groups to build core, cross-vendor extensions for perception features in the specifications, limiting vendor-specific APIs and thus reducing fragmentation. Qualcomm is a founding member, official promoter, and elected domain group leader of the first domain group created, the Standards Register Working Group, which develops and maintains the Metaverse Standards Register and Glossary.

Beyond the efforts in the Metaverse Standards Forum, Qualcomm is a key participant in open standards work in Khronos, 3GPP, and ISO/IEC. Another example of our contribution to the ecosystem is this white paper, written by Thomas Stockhammer, the chair of the MPEG-I Scene Description group.

As an active member of Khronos’ OpenXR working group since its incubation in 2017, we have been working with multiple companies, including Meta, Microsoft, and Magic Leap, to consolidate and advocate for cross-vendor extensions that reduce fragmentation in the OpenXR ecosystem. We contribute to the design of perception feature APIs and foster the creation of standardized interaction models through our involvement in MRTK as steering committee members alongside Microsoft and Magic Leap. With two of the original MRTK authors now contributing to MRTK from the Qualcomm team, our commitment to standardizing interactions across XR products, and to building meaningful building blocks that help Spaces developers create more immersive content with richer interactions, has only gotten stronger.
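
As a small, hypothetical illustration of why cross-vendor extensions matter, the sketch below asks an OpenXR runtime which extensions it exposes and separates the cross-vendor ones (KHR/EXT prefixes) from vendor-specific ones. The more functionality developers can reach through the former, the less per-vendor code they have to write and maintain.

```c
#include <openxr/openxr.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Illustrative sketch: list the extensions an OpenXR runtime exposes and
 * flag whether each is cross-vendor (XR_KHR_/XR_EXT_) or vendor-specific. */
int main(void)
{
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(NULL, 0, &count, NULL);

    XrExtensionProperties *props = calloc(count, sizeof(*props));
    for (uint32_t i = 0; i < count; i++)
        props[i].type = XR_TYPE_EXTENSION_PROPERTIES;
    xrEnumerateInstanceExtensionProperties(NULL, count, &count, props);

    for (uint32_t i = 0; i < count; i++) {
        const char *name = props[i].extensionName;
        int crossVendor = strncmp(name, "XR_KHR_", 7) == 0 ||
                          strncmp(name, "XR_EXT_", 7) == 0;
        printf("%-15s %s\n", crossVendor ? "cross-vendor" : "vendor-specific", name);
    }

    free(props);
    return 0;
}
```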

To push this even further, we aim to influence the design of WebXR modules and charters so they align strongly with the OpenXR APIs. Web-based XR is the most accessible way to experience the metaverse, so we are working to standardize it along the OpenXR guidelines. We recently collaborated with Pluto and Igalia (maintainers of the Wolvic browser) to enable WebXR using Spaces in the Wolvic browser.

While Qualcomm’s focus is on the standards needed to enable the core technologies and devices, we encourage others to develop standards for broader aspects such as privacy, security, identity, attestation, ownership, and more, which will help guide the development and use of these core technologies and devices in a socially responsible manner.

What’s next?

We’ve made huge progress in technology, processors, devices, content, and standardization over the last couple of years, and I strongly believe this progress will accelerate.

To achieve mainstream adoption, we must keep striving for lighter and sleeker headsets and AR glasses. How? By pushing distributed computing further, a disruptive architecture that offloads heavy compute to nearby devices instead of processing and rendering all workloads locally on headsets or AR glasses.

In parallel, we keep working relentlessly with and for developers to bring them the tools they need to develop greater content, faster. Finally, developers, operators, software vendors, and OEMs must keep persevering in our shared efforts toward standardization and interoperability for smoother developer journeys.

For developers who would like to join forces to build the open metaverse, I invite you to join Snapdragon Spaces. I can’t wait to keep pushing forward the realization of the open metaverse with all of you!


