Autonomous driving technology continues to advance, and multi-sensor fusion is one of its key enabling technologies. By integrating information from the different sensors on a vehicle, such as cameras, radars, LiDARs, ultrasonic sensors, and inertial sensors, it improves the perception and decision-making capabilities of the autonomous driving system.
According to recent market research, the global market for autonomous driving on-board sensors is growing significantly. Global autonomous vehicle sensor market revenue was approximately USD 8,798.8 million in 2023 and is expected to reach USD 17,520 million by 2030, a CAGR of 10.3%.
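As a sanity check on the cited growth rate, the compound annual growth rate (CAGR) can be recomputed directly from the two market-size figures in the text. This is a minimal sketch using only those figures; the variable names are illustrative.

```python
# Recompute the CAGR implied by the cited 2023 and 2030
# market sizes (USD millions), taken from the text above.
start, end = 8798.8, 17520.0
years = 2030 - 2023

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # prints "CAGR: 10.3%", matching the text
```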
The main sensor types for autonomous driving are camera, radar, and LiDAR sensors. Among them, radar held the largest share of the autonomous vehicle sensor market, nearly 88% in 2019, while cameras and LiDAR accounted for roughly 11% and 2%, respectively.
According to research by Guojin Securities, as the penetration rate of ADAS functional modules continues to rise, short-term sensor demand is driven mainly by cameras and millimeter-wave radar.
Figure: Global Autonomous Vehicle Sensors Market Size (USD million)
LiDAR: core sensor, trending toward solid-state designs, cost reduction, miniaturization, and integration
As autonomous driving continues to evolve, LiDAR, with its unique 3D environment-modeling capability, has become a core part of multi-sensor fusion. Sensing solutions for L3 and above require at least one LiDAR. With expanding mass production, ongoing technology iteration, and steadily falling costs, LiDAR is also developing toward miniaturization, low power consumption, and ASIC integration. LiDAR detects and measures the distance and speed of objects around the vehicle by emitting laser beams and measuring the reflected light. These sensors provide an accurate three-dimensional map of the environment, which is essential for navigation and obstacle avoidance in autonomous vehicles.
According to Guojin Securities, mass production of L3 autonomous vehicles in 2020-2022 was likely to be dominated by MEMS LiDAR, because its cost is relatively low and its micro-mirror (MEMS galvanometer) technology is relatively mature, allowing low-cost mass production in a short time. After 2022, at the mass-production stage of L4 and above, the sidelobe problem of OPA (optical phased array) LiDAR and the eye-safety problem of 3D flash LiDAR are expected to be largely solved, and these solid-state designs with no moving parts may replace MEMS.
Figure: How LiDAR works
A LiDAR system usually consists of a transmitting system, a receiving system, and an information-processing unit. Its ranging principle is based on the speed of light and the time difference between transmitting a light pulse and receiving its reflection. Because light travels so fast, the time measurement must have very high resolution, often at the nanosecond level (1 ns of round-trip time corresponds to about 15 cm of range), so even very small distance changes can be measured accurately. LiDAR is used in a variety of applications, including autonomous vehicles, topographic mapping, and environmental monitoring. In autonomous driving, it provides an accurate 3D map of the vehicle's surroundings, helping the vehicle identify and locate obstacles for safe navigation. LiDAR technology is still evolving and is expected to see wider use in autonomous driving and other fields as costs fall and performance improves.
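The time-of-flight ranging principle described above can be sketched in a few lines. This is an idealized illustration (a single clean pulse, no atmospheric or detector effects); the function name and the example timing value are assumptions for demonstration.

```python
# Sketch of LiDAR time-of-flight ranging: distance = c * dt / 2,
# where dt is the round-trip time of the laser pulse.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(delta_t_ns: float) -> float:
    """Return range in meters from a round-trip pulse time in nanoseconds.

    The pulse travels out to the target and back, hence the division by 2.
    """
    return C * (delta_t_ns * 1e-9) / 2

# A target at ~100 m returns the pulse after roughly 667 ns:
print(tof_distance(667))   # ~100 m
# 1 ns of timing resolution corresponds to ~0.15 m of range:
print(tof_distance(1))     # ~0.15 m
```

The nanosecond timing requirement mentioned above follows directly: each nanosecond of round-trip time is only about 15 cm of range, so centimeter-level accuracy demands sub-nanosecond timing electronics.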