
CES 2023: Metaverse Highlights—Exploring Key Innovations in Augmented Reality, Artificial Intelligence and Immersive Technologies


Introduction

The Coresight Research team attended CES (formerly the Consumer Electronics Show), hosted by the Consumer Technology Association (CTA) in Las Vegas from January 3 to January 8, 2023. The event brought together technology vendors, innovators, retailers and attendees from around the world for an early look at the key consumer electronics trends of the new year. The scope of CES has expanded in recent years beyond electronic gadgets to encompass new areas such as healthcare, wearable devices and mobility.

This year, CES had a strong metaverse focus. In this report, we present metaverse-related insights and technologies from the full event.

Metaverse Highlights from CES 2023: Coresight Research Insights

1. AR Will Link the Real and Virtual Worlds

While many software developers anticipate that virtual reality (VR) will be the most immersive portal to the metaverse and Web 3.0, many of the solutions on show concentrated on augmented reality (AR) technologies, which offer comparable degrees of immersion while also powerfully linking the real and virtual worlds.

At CES, we saw many examples of advanced AR solutions and technologies, and we believe that AR will be a practical and simple way for users to gain access to the metaverse. We detail notable exhibits and startups working on AR and immersive display technologies below.

  • TCL and Canon: Following the trend toward lightweight, easy-to-use AR spectacles, electronics manufacturers TCL and Canon both revealed new glasses. TCL’s RayNeo X2 is one of the first binocular, full-color micro-LED optical waveguide AR glasses, designed to display immersive yet non-intrusive projections. Canon’s MREAL mixed-reality visor is still in the research phase; this lightweight solution may have both industrial and consumer use cases as the metaverse and the real world converge.
  • Graffity: Japan-based game developer Graffity uses ultra-lightweight AR spectacles from AR company Nreal to deliver immersive games driven by hand tracking and AI (artificial intelligence) algorithms. Hand tracking is typically associated with VR and fully immersive experiences; however, Graffity demonstrates that it also has AR applications, which may be easier for users to feel immersed in given the current limitations of VR technologies.

Graffity’s Hand-Tracking Sushi Game
Source: Graffity

 

  • Port 6: Port 6 is a technology developer working to bridge the interaction gap between humans and machines. It has developed a wristband for AR that tracks hand movements and gestures, allowing users to navigate devices simply by moving a hand through the air (a simple illustrative sketch of this kind of gesture-to-action mapping follows the image below). The wristband also simulates virtual touch, using AI to provide realistic simulations and building on datasets to continually improve immersiveness and interactions. In the metaverse, it will be crucial for users to control avatars and actions with hand gestures, as they would in the real world.

Port 6 AR hand tracking
Source: Coresight Research
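
Port 6 has not published how its gesture recognition works; the sketch below is a minimal, hypothetical illustration of the underlying idea of mapping wrist-sensor readings to gestures and gestures to device actions. The class, thresholds, sensor values and action map are all invented for illustration, not Port 6’s implementation.

```python
# Illustrative sketch only: not Port 6's algorithm.
# Classifies hypothetical wrist-sensor samples (acceleration magnitude plus a
# simulated muscle-tension value) into coarse gestures with simple thresholds.
from dataclasses import dataclass

@dataclass
class WristSample:
    accel: float    # acceleration magnitude in g (hypothetical)
    tension: float  # normalized tendon/muscle tension, 0..1 (hypothetical)

def classify_gesture(sample: WristSample) -> str:
    """Map one sensor sample to a coarse gesture label (toy heuristic)."""
    if sample.tension > 0.7:
        return "pinch"      # strong tension with little motion -> pinch/tap
    if sample.accel > 1.5:
        return "swipe"      # fast hand movement -> swipe
    return "idle"

# A gesture-to-action map is how "navigating devices by moving a hand
# through the air" could be wired up in practice.
ACTIONS = {"pinch": "select item", "swipe": "next page", "idle": None}

for s in [WristSample(0.2, 0.9), WristSample(2.1, 0.1), WristSample(0.1, 0.1)]:
    gesture = classify_gesture(s)
    print(gesture, "->", ACTIONS[gesture])
```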

 

  • Recon Labs: South Korea-based Recon Labs develops AR product-creation tools that help shoppers visualize online purchases. Its platform, PlicAR, captures 2D images and video of an object and instantly turns them into a 3D asset. This type of technology is important for brands and retailers as they look to virtualize shopping experiences. In the future, we believe that many products, virtual and real-world, will be represented by NFTs (non-fungible tokens); providing realistic product representations online will be crucial to selling to users across countries and locations.

Recon Labs’ 3D creation
Source: Coresight Research

 

  • SparX: Israel-based startup SparX has created an AR tool that allows real estate agencies and businesses to easily overlay physical spaces with a catalogue of 3D-generated furniture. This type of technology will allow businesses, and perhaps eventually users, to virtualize their physical spaces, test new decoration ideas and achieve high levels of home personalization and customization.

SparX AR Overlay for Physical Spaces
Source: Coresight Research

 

  • Arti AR: Arti AR overlays AR assets on live video. As the enterprise metaverse begins to take shape and companies continue to offer hybrid work, Arti AR’s tool could be important for improving workplace interactions and productivity.
  • Cellid: Cellid is developing waveguide-architecture display modules for AR glasses and high-performance spatial recognition software, with a focus on nanotechnologies and AI. The company’s goal is to enhance the user experience across devices including smartphones, tablets and smart glasses. Cellid’s AR glasses superimpose realistic, unobtrusive projections over the user’s field of vision.

Cellid AR spectacles
Source: Coresight Research

 

  • Oorym: Israel-based startup Oorym has created an AR projection lens based on waveguide technology. A seamless image is projected onto the lens of AR glasses, enabling smooth, immersive experiences for the wearer.

Oorym AR Projection Lens
Source: Coresight Research

 

2. Tech Innovation Continues To Improve Immersiveness

At CES, we observed many developers creating technology that will eventually allow users to realistically experience nearly every sensation in the metaverse, from smell to touch to sound. Greater immersion will likely increase the number of hours per day that users spend within an experience, helping the metaverse to take shape.

Below, we explore notable examples of solutions being developed to improve the Web 3.0 user experience.

  • Diver-X: Diver-X has developed the “ContactGlove,” a haptic glove that the company positions as a full replacement for VR controllers, allowing users to perform virtual actions much as they would in the real world. The glove tracks finger movements, and Diver-X claims it is cheaper than traditional haptic gloves, though it will still cost just under $500. Haptic gloves are important for immersion, and as they continue to improve, they will likely be crucial for Web 3.0 adoption, providing non-gamers, who are unfamiliar with traditional gaming controllers and their layouts, with a sense of familiarity.
  • LG: LG revealed its 45-inch, 240Hz (refresh rate) OLED UltraGear 45GR95QE curved gaming monitor, which significantly improves immersion for gamers, perhaps even reducing the need for VR headsets, which remain expensive and have presented many users with comfort and usability issues.
  • Skyted: Mask-technology company Skyted has developed a voice-silencing mask. Although it is not specific to the metaverse, the product will likely be a crucial piece of technology for improving immersion. With a silencing mask, users will be able to access the metaverse from anywhere, at any time, in a private setting. The mask features a built-in microphone to allow users to communicate with friends and in virtual world settings without being heard in the physical world.
  • Vivoka: Vivoka offers a development studio for building voice assistants and voice interfaces. Studios such as Vivoka’s could eventually be an important technology for easily automating interactions in the metaverse and for ensuring that a chatbot or virtual idol is sufficiently customized for a specific brand or retailer and able to seamlessly manage customer interactions at all hours of the day.
  • iRomaScents: Israel-based startup iRomaScents has created a scent-replicating device for retail and home use, the iRomaScents Digital Scent Generator. Although it is mainly being marketed to retailers for shopping experiences and sampling of scent-based products, iRomaScents believes that the device can be linked (via cellular or Wi-Fi) to virtual experiences, autonomously producing scents when users enter certain experiences or reach certain locations.

iRomaScents Digital Scent Generator
Source: Coresight Research

 

  • OVR Technology: OVR has developed scent reproduction for the metaverse and immersive experiences. A single cartridge contains multiple base scents that can be combined to reproduce thousands of digital versions of well-known aromas, and scent release can be triggered autonomously by a user’s location in the metaverse (a simple sketch of this kind of blend-and-trigger logic follows the image below).

OVR Technology’s scent-creation technology
Source: Coresight Research
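
OVR Technology has not disclosed its implementation; assuming a cartridge of base scents that are mixed into named blends and released when an avatar enters a tagged zone, a minimal sketch of that blend-and-trigger logic might look like the following. The scent names, blends and the on_enter_zone hook are all hypothetical.

```python
# Illustrative sketch only: not OVR Technology's implementation.
# Mixes a small set of base scents into named blends and triggers a blend
# when the user's avatar enters a tagged zone in a virtual scene.
BASE_SCENTS = ["pine", "citrus", "smoke", "vanilla"]

# A blend is a set of per-channel intensities (0..1) over the cartridge's bases;
# even a few bases at a handful of intensity levels yield thousands of blends.
BLENDS = {
    "forest":   {"pine": 0.8, "smoke": 0.1},
    "campfire": {"smoke": 0.7, "vanilla": 0.2},
    "orchard":  {"citrus": 0.9},
}

ZONE_SCENTS = {"forest_trail": "forest", "cabin": "campfire", "market": "orchard"}

def on_enter_zone(zone: str) -> None:
    """Hypothetical hook called when the avatar enters a zone in the virtual world."""
    blend_name = ZONE_SCENTS.get(zone)
    if blend_name is None:
        return
    for scent, level in BLENDS[blend_name].items():
        # A real device would drive a hardware channel here; we just log it.
        print(f"release {scent} at intensity {level}")

on_enter_zone("cabin")
```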

 

  • 3D Game Market: Gaming startup 3D Game Market has developed a monitor that allows gamers to turn almost any video game into a 3D experience on a traditional 2D display, without glasses or AR/VR headsets. Technologies such as this monitor are important bridges to Web 3.0 and the metaverse: it is difficult to obtain a fully immersive experience without headsets or goggles, and this kind of display helps to bridge the gap.

3D Game Market at CES
Source: Coresight Research

 

  • OmniPad: Immersive technologies were a major focus at CES, but one of the most unique products we saw was OmniPad, a multidirectional “treadmill” that lets users explore the metaverse and virtual worlds as they would the physical world, without having to worry about their position or about running into real-world objects. Hardware such as OmniPad will be critical for providing a realistic experience in the metaverse.

OmniPad’s VR treadmill
Source: Coresight Research

 

  • Miros: Switzerland-based VR company Miros has developed a robotic surface for VR experiences. When a user is immersed in a virtual environment, the surface changes as the user moves through the environment and performs different actions. The Miros surface can be used, for instance, to stimulate real-world senses based on what is happening in the virtual environment. This makes it an important product for improving immersion and making the metaverse a more realistic experience.

Miros Interactive VR Surface
Source: Coresight Research

 

  • MEDA.OOO: Personalization will be one of the key factors influencing metaverse and Web 3.0 immersion. MEDA.OOO is creating highly customizable virtual avatars for users to explore multiple virtual platforms, also opening the door for interoperability across virtual worlds and blockchains. MEDA.OOO avatars feature over 30 parameters just for facial expression, allowing users to express and customize their virtual selves in millions of different ways. MEDA.OOO is able to generate 3D avatars directly from 2D images; the 3D avatars are also capable of performing lip syncing, mimicking how humans speak.

MEDA.OOO Virtual Avatars
Source: Coresight Research

 

  • Pantheon Lab: Pantheon Lab is an “idols-as-a-service” startup that provides brands and retailers with a digital studio to create their own digital human influencers in what it claims is a matter of minutes. Pantheon Lab also utilizes deep learning and generative AI algorithms to quickly create hyper-realistic, eye-catching marketing videos featuring idols that are indistinguishable from humans, enabling brands and retailers to maintain greater control over marketing by launching new campaigns, aligned with their values and narratives, at any time. Fresh, innovative marketing campaigns that can be tailored for high levels of personalization may significantly improve interaction and immersion.

Digital humans created by Pantheon Lab
Source: Coresight Research

 

  • G’Audio Lab: G’Audio Lab is a software developer that has created AI-based audio technology for the metaverse, leveraging AI to analyze vast datasets. Using its software, headphones can deliver directional sound, autonomously raising and lowering volume or playing different sounds depending on the user’s virtual location (a minimal sketch of distance- and direction-based audio follows the image below). G’Audio Lab’s software and similar technologies will likely play a key role in metaverse immersion, as they set the stage for users to hear in the metaverse as they would in the real world.

G’Audio Lab’s AI-based Metaverse Audio
Source: Coresight Research
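
G’Audio Lab’s software is proprietary; as a rough illustration of the general principle, the sketch below applies inverse-distance attenuation and equal-power stereo panning based on listener and source positions in a virtual space. The function name, coordinate convention (listener facing the +y axis) and values are assumptions for illustration, not G’Audio Lab’s method.

```python
# Illustrative sketch only: not G'Audio Lab's implementation.
# Computes a distance-based gain and a simple left/right pan for a sound
# source, given listener and source positions in a virtual space.
import math

def spatialize(listener_xy, source_xy, ref_distance=1.0):
    """Return (left_gain, right_gain) for a source relative to a listener."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    distance = max(math.hypot(dx, dy), ref_distance)

    gain = ref_distance / distance          # inverse-distance attenuation
    angle = math.atan2(dx, dy)              # bearing, assuming listener faces +y
    pan = math.sin(angle)                   # -1 = fully left, +1 = fully right

    left = gain * math.sqrt((1 - pan) / 2)  # equal-power panning
    right = gain * math.sqrt((1 + pan) / 2)
    return left, right

# A source a few meters to the listener's right is quieter and panned right.
print(spatialize(listener_xy=(0.0, 0.0), source_xy=(3.0, 1.0)))
```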

 

  • Meloscene: California-based VR platform Meloscene has created a virtual collaborative studio in which musicians can interact with one another in real time, listening to each other’s music and performing together. Leveraging audio, VR and AI technologies, Meloscene’s platform allows artists and creators from all over the world to collaborate in a single location as if they were physically together, a concept that is at the heart of the metaverse.

Meloscene, an interactive VR music platform
Source: Coresight Research

 

3. AI Will Be Key to the Evolution of the Metaverse

AI is just scratching the surface of what it will be able to accomplish. Many developers at CES are beginning to utilize advances in deep learning, natural language processing (NLP) and machine learning. We believe that AI will transform the metaverse in the near future, as it will allow developers to create enhanced virtual simulations and graphics and to completely automate interactions with customers through virtual assistants.

Below, we explore innovative exhibits and examples of how developers are incorporating AI to enhance graphics and improve interactions.

  • MeetKai: MeetKai is an advanced conversational AI solution that, according to the company, can understand customer queries and produce realistic speech while recalling context for a true multiturn conversation and meaningful dialogue with customers (a toy sketch of this kind of context recall follows the Talkr.AI image below).
  • Talkr.AI: Voice-agent developer Talkr.AI provides a no-code platform for brands, retailers and businesses to create voice agents across multiple channels. Utilizing AI technology, its voice agent “Ivy” works with massive datasets and NLP algorithms to provide a seamless customer interaction experience with a consistent idol across multiple channels, whether on a regular website or within an immersive metaverse experience. Talkr.AI claims that Ivy increases personal assistant performance by 30%.

Talkr.AI offers AI-based, omnichannel virtual assistants
Source: Coresight Research
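
Neither MeetKai nor Talkr.AI has published its internals; as a generic, toy illustration of multiturn context recall, the sketch below keeps a running dialogue history and reuses a fact (the user’s name) stated in an earlier turn. The class, regex and canned replies are invented for illustration.

```python
# Illustrative toy only: not MeetKai's or Talkr.AI's technology.
# Demonstrates "recalling context" across turns: the agent stores the dialogue
# history and reuses facts mentioned in earlier turns.
import re

class ToyAssistant:
    def __init__(self):
        self.history = []   # list of (speaker, utterance) tuples
        self.facts = {}     # simple slot memory extracted from turns

    def respond(self, user_text: str) -> str:
        self.history.append(("user", user_text))

        # Extract a fact from this turn (here: the user's name).
        match = re.search(r"my name is (\w+)", user_text, re.IGNORECASE)
        if match:
            self.facts["name"] = match.group(1)

        # Use remembered context when answering later turns.
        if "my name" in user_text.lower() and "name" in self.facts and not match:
            reply = f"You told me earlier that your name is {self.facts['name']}."
        elif match:
            reply = f"Nice to meet you, {self.facts['name']}!"
        else:
            reply = "Got it. How can I help?"

        self.history.append(("assistant", reply))
        return reply

bot = ToyAssistant()
print(bot.respond("Hi, my name is Ada."))        # stores the name
print(bot.respond("I'm looking for headphones."))
print(bot.respond("Do you remember my name?"))   # recalls the earlier fact
```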

 

  • PONS.AI: AI’s ability to create content continues to improve significantly. NFT creator PONS.AI applies AI-based generative algorithms to create NFT-based artwork that can be imported into The Sandbox metaverse. Land parcels in The Sandbox, represented by NFTs, can be customized and designed with AI, making it easy for a user to purchase and decorate their virtual space.

PONS.AI provides AI-generated NFTs
Source: Coresight Research

 

  • StoryFile: Conversational AI company StoryFile created Conversa, an application that provides businesses with the means to create interactive videos that “converse” with a customer or user. The technology underpinning what StoryFile describes as “Video 3.0” is provided by its “Natural Conversation Storytelling System” and “Artificially Intelligent Interactive Memory System,” for which the company has obtained patents. As more storefronts and retailers virtualize in-store experiences and add digital elements, this type of technology could be crucial from a customer-service standpoint, enabling retailers to provide automated interactions between digital agents and customers that are virtually indistinguishable from human conversations.

Conversa AI videos
Source: Coresight Research