Immersive Technology: What’s Next?

By Kenneth Jeng, Silicon Foundry

Feb 27, 2021

The COVID-19 pandemic has driven home the urgency of digital transformation, heightening interest in immersive technology as a substitute for face-to-face interaction. It has also accelerated the development of innovative AR/VR solutions designed to enrich both digital and in-person experiences and to make the transfer of information more intuitive: industries from retail to manufacturing are investing in new ways of doing things in anticipation of a post-pandemic new normal.

As we enter 2021, practical virtual reality and augmented reality solutions appear to be on the cusp of a positive inflection point. Over the past few years, advances in underlying technologies such as 5G and mobile computing have enabled practical solutions that are now gaining traction: field service solutions, AR storefronts, VR teleconferencing, and VR-assisted design have all seen accelerating adoption over the last year.

These rapid developments are reflected in AR and VR's graduation from the Gartner Hype Cycle: after last appearing in the framework in 2018, both have moved off the curve, highlighting their status as maturing technologies.

Figure 1 — Augmented reality in the Gartner Hype Cycle from 2005 to 2020.

Source: Wikitude

Given the development of increasingly practical use cases for AR/VR, what’s next for immersive technology? At Silicon Foundry, we believe that three developments are worth keeping track of.

AR Cloud:

The Augmented Reality (AR) Cloud is the concept of a real-time 3D spatial map overlaid onto the real world that allows information, digital objects, and experiences to be persistently tied to specific locations and accessible across different AR-enabled devices. Fully developed, it is the world’s digital twin: a digital copy that can be searched, engaged with, and manipulated in the cloud. The emergence of this new computing paradigm — a broad shift from 2D to 3D digital content — will enable millions of digital devices to interact with physical objects augmented with information and visualize dimensional spaces with virtual data, objects, and logic.
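
To make the idea concrete, the sketch below is a minimal illustration in Python; the class and method names are invented for this example and do not reflect any particular AR Cloud platform. It shows how a piece of digital content might be persistently pinned to a real-world coordinate and later discovered by any nearby AR-enabled device.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class SpatialAnchor:
    """A piece of digital content persistently pinned to a real-world location."""
    anchor_id: str
    latitude: float      # degrees
    longitude: float     # degrees
    altitude_m: float    # metres above ground level
    payload: dict        # e.g. a 3D model reference, text, or experience metadata

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in metres (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

class AnchorCloud:
    """Toy stand-in for a shared AR Cloud backend: any device can publish or query anchors."""
    def __init__(self):
        self._anchors: list[SpatialAnchor] = []

    def publish(self, anchor: SpatialAnchor) -> None:
        self._anchors.append(anchor)

    def nearby(self, lat: float, lon: float, radius_m: float = 50.0) -> list[SpatialAnchor]:
        return [a for a in self._anchors
                if distance_m(lat, lon, a.latitude, a.longitude) <= radius_m]

# A retailer pins a virtual storefront banner; a passing device later discovers it.
cloud = AnchorCloud()
cloud.publish(SpatialAnchor("store-42", 37.7749, -122.4194, 2.0,
                            {"model": "storefront_banner.glb", "promo": "20% off"}))
print(cloud.nearby(37.7750, -122.4195))
```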

In addition to entertainment (AR gaming, concert and stadium experiences), a real-time AR Cloud could drive rich, intuitive integration of digital content into retail, manufacturing, and smart city experiences. A nascent example of the concept can be seen in Google's AR Live View in Google Maps, which offers features such as seamless integration between Live View and Google Maps location sharing. Competitors such as Apple have introduced similar tools for outdoor AR navigation, including Location Anchors, which fix AR models to a specific location in the real world.

XR Streaming:

XR Streaming, the outsourcing of AR/VR rendering to external servers in the cloud, is likely to be a key way of overcoming the hardware limitations that currently put a significant damper on the level of immersion and quality of XR experiences. Much as broadband internet and 4G adoption enabled media streaming over the last decade, the increasing penetration of high-bandwidth, low-latency network infrastructure (i.e., 5G) and the refinement of game streaming services (e.g., Google Stadia, Amazon Luna, NVIDIA GeForce Now) will make XR streaming viable.

By moving intensive processing and rendering to the cloud, mobile devices, which handle sensor input, room detection, and gesture tracking locally, can deliver high-fidelity experiences without the expensive equipment investments that currently act as a significant hurdle to adoption. Rendering in the cloud should also simplify the creation of XR content on a single platform, allowing developers to produce content and applications faster and more cheaply.
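
A rough sketch of that division of labour follows; the function names are hypothetical, and the networking, video encoding, and GPU work are reduced to stand-ins. The point is that the device only sends lightweight pose updates upstream and displays the frames it gets back, while the heavy rendering is assumed to happen on a cloud server.

```python
import time
from dataclasses import dataclass

@dataclass
class Pose:
    """Head pose sampled on the device: position (metres) and orientation (quaternion)."""
    position: tuple[float, float, float]
    orientation: tuple[float, float, float, float]

def capture_pose(t: float) -> Pose:
    """Stand-in for on-device tracking (sensor fusion, room detection, gestures)."""
    return Pose((0.0, 1.6, -0.1 * t), (0.0, 0.0, 0.0, 1.0))

def render_in_cloud(pose: Pose) -> bytes:
    """Stand-in for the remote renderer: receives a pose, returns an encoded frame.
    In a real system this would run on a GPU server and stream back over 5G."""
    return f"frame rendered for pose {pose.position}".encode()

def display(frame: bytes) -> None:
    """Stand-in for the headset or phone compositor."""
    print(frame.decode())

# Simplified client loop: only poses go up, only compressed frames come down,
# so the device never needs a high-end GPU of its own.
for step in range(3):
    t = step / 60.0                  # targeting roughly 60 Hz
    frame = render_in_cloud(capture_pose(t))
    display(frame)
    time.sleep(1 / 60)
```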

Brain-Computer Interface (BCI):

Brain-computer interface technology, which enables direct communication between the human brain and external computing devices, has begun to converge with VR. BCI-enabled VR headsets would dramatically increase the bandwidth of human-VR interaction, allowing for richer and less cumbersome experiences. In its early stages, the technology would monitor users' physical, emotional, and mental states and proactively adapt the VR experience in response. Further development would let users actively engage with the VR interface, issuing commands and entering text without any physical interaction.
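
A minimal sketch of that first, passive stage might look like the following; the signal source, thresholds, and setting names are invented for illustration. The system continuously estimates the user's state from biometric data and nudges the experience accordingly, well before explicit "thought commands" are possible.

```python
import random
from dataclasses import dataclass

@dataclass
class UserState:
    """Coarse estimate derived from biometric signals (e.g. EEG, EDA, PPG)."""
    stress: float       # 0.0 (calm) .. 1.0 (overwhelmed)
    engagement: float   # 0.0 (bored) .. 1.0 (fully absorbed)

def estimate_state() -> UserState:
    """Placeholder for a real signal-processing pipeline over headset sensor data."""
    return UserState(stress=random.random(), engagement=random.random())

def adapt_experience(state: UserState) -> dict:
    """Proactively tune the VR scene based on the inferred user state."""
    settings = {"pace": "normal", "ambient_volume": 0.8, "hint_frequency": "low"}
    if state.stress > 0.7:
        settings.update(pace="slow", ambient_volume=0.4)   # ease off when overwhelmed
    if state.engagement < 0.3:
        settings.update(hint_frequency="high")              # re-engage a bored user
    return settings

for _ in range(3):
    state = estimate_state()
    print(state, "->", adapt_experience(state))
```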

In February 2021, OpenBCI, a company developing open-source BCI solutions, announced the joint development of a new hardware-software platform for AR/VR headsets with Valve, the prominent video game developer behind the Valve Index, a high-end VR headset. The solution, called Galea, aims to include a range of sensors, such as electroencephalography (EEG), electrooculography (EOG), electromyography (EMG), electrodermal activity (EDA), and photoplethysmography (PPG) sensors, to capture biometric data.

Silicon Foundry

Silicon Foundry is an innovation advisory platform that builds bridges between leading multi-national corporations and global startup ecosystems.