Navigating the Shift: Exploring the Transition from Enterprise to Consumer AR Glasses
From enterprise use to personal use
In May 2014, Google opened sales of the Google Glass Explorer Edition to consumers. Since then, many new products have been introduced, with enterprise-oriented glasses becoming the mainstream.
However, in recent years, the demand for consumer-grade AR glasses has increased, particularly after the metaverse concept gained popularity in 2021. As seen in the figure below, Chinese companies launched several consumer-grade smart glasses after 2021.
Compared with AR glasses used in industrial settings, consumer smart glasses require larger fields of view (FOVs), lighter weight, clearer images, and longer battery life.
The smart glasses market is a promising one with few competitors
Different types of AR glasses have varying technical specifications based on their intended purposes. Generally, they incorporate wireless technologies such as WiFi, Bluetooth, and GPS, along with a camera and various sensors (e.g., gyroscope, accelerometer, ambient light sensor, etc.).
Devices can be tethered to a stationary computer or a smaller device, or they can be standalone. This is a trade-off between computing power and the device's intrusiveness, fashionability, and social acceptability. Typically, tethered devices are larger, binocular AR headsets with larger FOVs, while untethered ones are smaller smart glasses with smaller FOVs, limited AR usability, and fewer sensors.
Types of AR Glasses
- Connected glasses: displayless glasses with Bluetooth or WiFi connectivity.
- Smart glasses: these can be divided into two groups based on their displays: rear-mirror and monocular smart glasses. Rear-mirror glasses have a display discreetly positioned at the edge of the wearer's field of view (similar to a car's rearview mirror). Monocular smart glasses have an optical engine positioned in front of one of the wearer's eyes, allowing them to see reality through the display while digital information is projected directly into their field of view.
- AR HMDs: headsets with binocular see-through displays and a relatively large FOV (e.g., 90° for the Meta 2) that can provide true AR and usually have bulkier optics than regular smart glasses.
The Main Hardware Components of AR Products
- Among all the modules in AR glasses, the optical module is the most important: it consists of the optical lenses and the microdisplay and directly determines image quality. Whether a pair of AR glasses suits consumers or industrial workers depends largely on choosing the appropriate optical modules.
- The computing module, typically an SoC, supplies the computing power for processing AR input and fusing virtual content with the real scene, and it supports machine vision and interaction technologies. To gather information about the surrounding real world, the camera has become the primary sensor feeding the processing module; AR glasses usually combine several cameras with SLAM to build a virtual model of the real environment (a minimal sketch of this flow follows the source note below).
Source: Organized based on public information
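To make the division of labor between these modules concrete, the sketch below shows a minimal per-frame loop in which the computing module tracks the wearer's position and a render step prepares content for the microdisplay. All names and values here (Frame, estimate_pose, render_overlay, the sample numbers) are illustrative assumptions for this article, not an actual vendor API.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Frame:
    """One tick of sensor input handed to the computing module."""
    velocity: Vec3   # device velocity (m/s), e.g. estimated from cameras and the IMU
    dt: float        # time since the previous frame (s)

def estimate_pose(position: Vec3, frame: Frame) -> Vec3:
    """Toy stand-in for the tracking step on the SoC: dead-reckon the new position."""
    return tuple(p + v * frame.dt for p, v in zip(position, frame.velocity))

def render_overlay(position: Vec3, anchors: Dict[str, Vec3]) -> List[str]:
    """Toy stand-in for virtual-real fusion: describe each anchored virtual object
    relative to the wearer, ready to be drawn on the microdisplay."""
    overlay = []
    for name, target in anchors.items():
        offset = tuple(round(t - p, 2) for t, p in zip(target, position))
        overlay.append(f"{name} at offset {offset} m")
    return overlay

def run_pipeline(frames: List[Frame], anchors: Dict[str, Vec3]) -> None:
    position: Vec3 = (0.0, 0.0, 0.0)
    for frame in frames:
        position = estimate_pose(position, frame)     # computing module (tracking)
        overlay = render_overlay(position, anchors)   # virtual-real fusion
        print(overlay)                                # the optical module would display this

if __name__ == "__main__":
    frames = [Frame(velocity=(0.5, 0.0, 0.0), dt=0.1) for _ in range(3)]
    run_pipeline(frames, {"navigation_arrow": (2.0, 0.0, 0.0)})
```

In an actual product, estimate_pose would be backed by a full visual-inertial tracking stack on the SoC, and the overlay would be driven into the optical engine rather than printed.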
Displays, optical solutions, sensors, and chips are important components of AR glasses
Composition of AR glasses: a pair of AR glasses consists mainly of cameras (sensors), an optical module (microdisplay and optical solution), a processing unit (chip, perception and interaction, etc.), a frame, and other parts.
Micro LED is the ideal solution for AR glasses' microdisplays
AR glasses operate in bright outdoor environments and therefore require a very bright microdisplay. Micro LED has become the most promising solution for AR glasses' microdisplays thanks to its high brightness, low latency, and low power consumption; however, it is still at an early stage of adoption in near-eye displays.
The optical waveguide is gradually becoming the mainstream optical technology for AR glasses
AR glasses must show the virtual world while the wearer still sees the real world, which places high demands on the optical module; the optical solution is therefore the core technology of AR glasses. Optical technology for AR glasses continues to be iterated and upgraded, and the optical waveguide has gradually become the mainstream technology path going forward thanks to its thin lenses, large viewing angles, and high light transmittance.
SLAM is the mainstream tracking and positioning technology for AR and enhances the user's overall perception of the environment
During interaction with AR glasses, sensors, including cameras, must capture the user's actions in real time (tracking and positioning, gesture recognition, everyday photography, etc.) to create a multi-dimensional perception of the scene, which is then presented to the user's eyes through the display. For tracking and positioning, SLAM (simultaneous localization and mapping) is the mainstream technology: it estimates the device's own pose and builds a global map of the space from the visual and motion information captured by cameras and inertial sensors. As the user's range of movement gradually expands, the global map accumulates more scene information, effectively improving the user's overall perception of the environment.
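To illustrate the mapping half of this process, here is a deliberately simplified 2D sketch: it assumes the device pose (position and heading) is already provided by the tracking half and shows how landmarks observed in the device frame are transformed into a shared world frame, so the global map grows as the wearer moves around. The landmark names and the "keep the first estimate" policy are toy assumptions, not a production SLAM implementation.

```python
import math
from typing import Dict, Tuple

Point = Tuple[float, float]

def to_world(pose_xy: Point, yaw: float, observation: Point) -> Point:
    """Transform a landmark observed in the device frame into world coordinates."""
    ox, oy = observation
    wx = pose_xy[0] + math.cos(yaw) * ox - math.sin(yaw) * oy
    wy = pose_xy[1] + math.sin(yaw) * ox + math.cos(yaw) * oy
    return (round(wx, 2), round(wy, 2))

def update_map(global_map: Dict[str, Point], pose_xy: Point, yaw: float,
               observations: Dict[str, Point]) -> None:
    """Add newly observed landmarks; the global map grows as the wearer moves."""
    for landmark_id, obs in observations.items():
        if landmark_id not in global_map:      # toy policy: keep the first estimate
            global_map[landmark_id] = to_world(pose_xy, yaw, obs)

if __name__ == "__main__":
    global_map: Dict[str, Point] = {}
    # Two steps of a walk; the pose comes from the tracking half of SLAM (assumed here).
    update_map(global_map, pose_xy=(0.0, 0.0), yaw=0.0,
               observations={"door": (2.0, 0.0)})
    update_map(global_map, pose_xy=(1.0, 0.0), yaw=math.pi / 2,
               observations={"window": (1.5, 0.0), "door": (0.0, -3.0)})
    print(global_map)   # more movement -> more landmarks in the global map
```

A real visual-inertial SLAM system would jointly optimize the poses and the landmark positions (for example via filtering or bundle adjustment) rather than trusting the first observation.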
| | Eye sensor | Built-in external sensor | Inertial sensor |
|---|---|---|---|
| Types | Eye-tracking and iris-recognition sensors | Conventional cameras, fisheye cameras, infrared ToF sensors, etc. | Gyroscope, accelerometer, magnetometer |
| Function | Monitor eye movement and identify the wearer | Enable 3D imaging, proximity sensing, ambient-light sensing, gesture recognition, and other functions | Capture head movements for navigation and positioning (see the fusion sketch after this table) |
| Products | HoloLens 2, Nreal Light, etc. | Rokid Glass 2, etc. | RayNeo Air 1s, Nreal Air, Rokid Glass 2, INMO Air2 |
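The inertial-sensor column above lists the gyroscope and accelerometer as the sensors that capture head movement. A common way to fuse their raw readings into a stable orientation is a complementary filter: integrate the gyroscope for fast response and blend in the gravity direction from the accelerometer to correct drift. The sketch below is a minimal single-axis (pitch) version with made-up sample data; it is not code from any of the products listed.

```python
import math
from typing import Iterable, Tuple

def complementary_filter(samples: Iterable[Tuple[float, float, float, float]],
                         alpha: float = 0.98, dt: float = 0.01) -> float:
    """Fuse gyroscope and accelerometer readings into a pitch angle (radians).

    Each sample is (gyro_pitch_rate, accel_x, accel_y, accel_z):
      - gyro_pitch_rate: angular velocity around the pitch axis (rad/s)
      - accel_*: accelerometer reading (m/s^2), dominated by gravity when still
    """
    pitch = 0.0
    for gyro_rate, ax, ay, az in samples:
        gyro_pitch = pitch + gyro_rate * dt                 # responsive, but drifts
        accel_pitch = math.atan2(-ax, math.hypot(ay, az))   # drift-free, but noisy
        pitch = alpha * gyro_pitch + (1 - alpha) * accel_pitch
    return pitch

if __name__ == "__main__":
    # Made-up readings: the head tilts up at ~0.5 rad/s for one second.
    samples = [
        (0.5, -9.81 * math.sin(0.5 * i * 0.01), 0.0, 9.81 * math.cos(0.5 * i * 0.01))
        for i in range(100)
    ]
    print(f"estimated pitch: {complementary_filter(samples):.2f} rad")
```

Production headsets typically rely on more sophisticated fusion (for example, Kalman- or Madgwick-style filters across all three axes), but the drift-correction idea is the same.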
AR chips are gradually moving toward customization, with Qualcomm remaining the main chip supplier
| Chip | Supplier | Features | Products |
|---|---|---|---|
| RK3588 | Rockchip | Meets the computing-power requirements of most AI models and enables a wide range of AI scenarios | SeerLens |
| Snapdragon 820 | Qualcomm | Supports view-dependent dynamic reflections, HDR rendering (HDRR), realistic color and lighting, human-eye lighting simulation, temporal anti-aliasing, and other functions | HiAR Glasses (2nd generation) |
| Zhanrui (UNISOC) W517 | Zhanrui (UNISOC) | Quad-core processor with high performance, low power consumption, and strong AI capability, supporting more diverse AI application scenarios | — |
| Snapdragon XR2 | Qualcomm | Strong GPU processing power; supports eye-tracked foveated rendering and enhanced variable-rate shading for smoother refresh rates | RayNeo X2 |
| HPU | Microsoft | Coprocessor that works alongside the CPU and GPU | HoloLens 2 |
| HiSilicon XR chip | HiSilicon | Supports 8K decoding; integrated GPU and NPU (neural processing unit) | Rokid Vision |
Optical waveguides and Micro LEDs with superior performance will be the mainstream solutions for future AR products
At this stage, the bottlenecks of AR glasses based on different optical principles mainly lie in delivering enough brightness at the eye to cope with strong outdoor light, achieving adequate color reproduction, color saturation, and color uniformity, correcting image distortion, and providing a larger eyebox (range of eye movement) in the horizontal and vertical directions. Currently, the Micro LED microdisplay + optical waveguide combination is theoretically the best-balanced solution in terms of high light transmittance, high brightness, high image quality, light weight, and low power consumption, and it will become the primary choice for future AR products.
| | (unnamed) | TCL RayNeo X2 AR | Xiaomi Wireless AR Glass Discovery Edition | Nreal Air | Rokid Max |
|---|---|---|---|---|---|
| Release | 2023/07 | 2023 | 2023 | 2022 | 2023 |
| Usage scenarios | Calls, translation, navigation, photography, teleprompting, voice transcription, movies, games | Intelligent translation, real-time navigation, information reminders, quick photos, etc. | Movies, games, translation, navigation, photography | Watching movies, games, screen casting from other devices | Watching movies, games, screen casting from other devices |
| Microdisplay | Micro-OLED | Micro-LED | Micro-OLED | Micro-OLED | Micro-LED |
| Optics | Diffractive optical waveguide | Diffractive optical waveguide | Curved prism | Birdbath | Birdbath |
| Chip | Zhanrui AI chip, quad-core 1.8 GHz | Qualcomm Snapdragon XR2 | Qualcomm Snapdragon XR2 Gen 1 | None | None |
| Sensors | Gyroscope, accelerometer, magnetometer | | | | |
| Key features | Binocular full-color Micro-OLED with diffractive waveguide; wireless connection; 8 MP camera; 20,000:1 contrast ratio; lightweight; SLAM; 500 mAh battery | Binocular full-color Micro-LED with diffractive waveguide; 16 MP camera; Qualcomm Snapdragon XR2; 100,000:1 contrast ratio; brightness up to 1,000 nits; SLAM + gesture recognition; 590 mAh battery | Qualcomm Snapdragon XR2 Gen 1; binocular Micro-OLED waveguide; gesture interaction; brightness up to 1,200 nits; 126 g; spatial positioning: SLAM + 6DoF; cameras: SLAM camera + low-power AON camera | Micro-OLED; 130-inch virtual screen; 108% sRGB; eye protection; 79 g; 100,000:1 contrast ratio; brightness up to 400 nits; 60 Hz refresh rate; 46° FOV | 120 Hz refresh rate; Micro-LED; up to 215-inch virtual screen; 50° FOV; 1080p FHD; 75 g; brightness up to 600 nits; eye protection (low blue light, flicker-free); 2D/3D switching |
Source: Organized based on public information
The mass production and launch of the INMO Air2 are expected to foster competition and technological innovation in the consumer (C-end) market. Many AR startups that entered the market last year are expected to release consumer hardware products this year. As AR continues to evolve in 2023, the industry will find its way forward through careful experimentation, exploration, and innovation.