
LiDAR Demystified: An In-Depth Guide to the ‘Great Wavelength Debate’

By: Bahman Hadji, Director of Automotive Perception, Onsemi

The success of advanced driver assistance systems (ADAS) and autonomous driving (AD) rests on their ability to sense the environment around the vehicle effectively and feed that information into the algorithms that enable autonomous navigation. Given the absolute reliance on sensing technology in life-or-death situations, systems often combine multiple sensor modalities and fuse their data so that the modalities augment one another and provide redundancy. This allows each technology to play to its strengths and delivers a better combined solution.

In ADAS and AD vehicles there are three main sensor modalities: image sensors, radar, and LiDAR. Each has its own advantages, and together they form a complete sensor suite whose fused data provides color, intensity, velocity, and depth information for every point in the scene, enabling the autonomous perception algorithms to make decisions.

Figure 1: Sensor fusion takes advantage of each modality to provide complete information about the environment around the vehicle.

Although the concept of using light to measure distance has been around for decades, LiDAR is, of the three main modalities, the one only now being commercialized for the mass market. The automotive LiDAR market is set for impressive growth, expected to expand from $39 million in 2020 to $1.75 billion in 2025 as autonomous systems requiring complete sensor suites proliferate (Yole Développement, 2020). This is a huge business opportunity. Hundreds of companies are focusing on LiDAR technology, and cumulative investment in them had exceeded $1.5 billion by 2020, even before the wave of public offerings that began when several LiDAR companies launched SPACs at the end of 2020. But when so many companies pursue the same goal with technologies built on completely different wavelengths of light (905 nm and 1550 nm being the most prominent example), one approach eventually tends to win out over the others, as we have seen time and again: Ethernet in networking, VHS in video.

For the users of LiDAR technology, namely automakers and the companies that design and build autonomous robotic vehicles for passengers and freight, their own requirements come first. Ultimately, these companies want suppliers to deliver highly reliable, low-cost LiDAR sensors that meet the range and detection-performance specifications for low-reflectivity objects. While every engineering team has strong opinions, these customers may not care which technology is implemented as long as the vendor meets the performance and reliability requirements at the right cost. That is why this article aims to clarify the fundamental debate: which wavelength will dominate in automotive LiDAR applications?

LiDAR overview

To answer this question, we first need to understand how a LiDAR system is built. LiDAR systems come in different configurations. Coherent LiDAR, a type known as frequency-modulated continuous wave (FMCW), mixes the emitted laser signal with the reflected light to calculate the distance and velocity of objects. Although FMCW has certain advantages, it is still far less common than the dominant approach, direct time-of-flight (dToF) LiDAR. This technique measures the time it takes for an ultra-short light pulse to be emitted from an illumination source, reach an object, and be reflected back to the sensor. Because the pulse travels at the speed of light, the distance to the object follows directly from the simple relationship between time, speed, and distance. Although the choice of wavelength primarily affects the transmit and receive functions, a typical dToF LiDAR system has six major hardware functions.
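To make the time-speed-distance relationship concrete, here is a minimal sketch in Python that converts a measured round-trip time into a range; the timing values are illustrative and not drawn from any particular system.

```python
# Minimal sketch of direct time-of-flight (dToF) ranging.
# distance = (speed of light x round-trip time) / 2
C = 299_792_458.0  # speed of light in vacuum, m/s

def dtof_range_m(round_trip_time_s: float) -> float:
    """Convert a measured round-trip time into a target distance in metres."""
    return C * round_trip_time_s / 2.0

# Illustrative round-trip times: ~667 ns corresponds to a target ~100 m away.
for t_ns in (66.7, 667.0, 1334.0):
    print(f"{t_ns:7.1f} ns round trip -> {dtof_range_m(t_ns * 1e-9):6.1f} m")
```

The factor of two reflects the fact that the pulse travels to the object and back before it reaches the sensor.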

Figure 2: Block diagram of a typical dToF system, with the green sections representing some of the focus areas of ON Semiconductor's products.

Table 1 shows a list of various LiDAR manufacturers, from well-known Tier 1 automotive suppliers to startups around the world. Market reports and public sources indicate that the vast majority of these companies' LiDARs operate at near-infrared (NIR) wavelengths rather than short-wave infrared (SWIR) wavelengths. Additionally, while SWIR LiDAR vendors that focus on FMCW are tied to the corresponding wavelengths, most vendors with direct time-of-flight implementations have the option of building systems at NIR wavelengths while still leveraging their existing IP in areas such as beam steering and signal processing.

Table 1: List of LiDAR manufacturers using NIR and SWIR wavelengths

*The above list is not exhaustive, data sourced from Yole, IHS Markit and public sources

Given that most, but not all, manufacturers have chosen NIR wavelengths, how did they make this decision? What trade-offs did they need to consider? This article focuses on the fundamental physics of light and of the semiconductor materials that make up LiDAR components.

In a LiDAR system, the photons emitted by the laser should reflect off an object and return to the detector. Along the way, these photons must compete with ambient photons from the sun. Looking at the solar radiation spectrum and accounting for atmospheric absorption, irradiance dips at certain wavelengths, reducing the number of ambient photons that appear as system noise. Solar irradiance at 905 nm is roughly three times higher than at 1550 nm, which means an NIR system has to contend with more noise reaching the sensor. But this is just one factor to consider when choosing a wavelength for a LiDAR system.
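To put the ambient-light argument in numbers, the hedged sketch below compares the in-band solar background at the two wavelengths for a 50 nm bandpass filter. The irradiance figures are illustrative ground-level values chosen only to reflect the roughly threefold difference described above; they are not measured data.

```python
# Hedged sketch: in-band solar background at 905 nm vs 1550 nm.
# The irradiance figures are illustrative ground-level values (W/m^2/nm),
# chosen only to reflect the roughly 3x difference quoted in the text.
H = 6.626e-34        # Planck constant, J*s
C = 299_792_458.0    # speed of light, m/s
BANDWIDTH_NM = 50.0  # bandpass filter width assumed in front of the sensor

def in_band_power(irradiance_w_m2_nm: float) -> float:
    """Background power per m^2 that passes the bandpass filter."""
    return irradiance_w_m2_nm * BANDWIDTH_NM

def photon_rate(power_w_m2: float, wavelength_nm: float) -> float:
    """Convert in-band power into photons per second per m^2."""
    photon_energy_j = H * C / (wavelength_nm * 1e-9)
    return power_w_m2 / photon_energy_j

p905, p1550 = in_band_power(0.70), in_band_power(0.26)   # assumed irradiances
print(f"in-band power ratio (905/1550): {p905 / p1550:.1f}x")
print(f"photon-rate ratio  (905/1550): {photon_rate(p905, 905) / photon_rate(p1550, 1550):.1f}x")
```

Note that the advantage in photon counts is smaller than the advantage in power, because each 1550 nm photon carries less energy than a 905 nm photon.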

Figure 4: Atmospheric absorption of light produces distinct peaks

Sensor

The components in a LiDAR system responsible for sensing photons are photodetectors, and the semiconductor material they are made of depends on the wavelength to be detected. In a semiconductor, the bandgap separates the valence and conduction bands; an absorbed photon provides the energy that lifts an electron across the bandgap, allowing the semiconductor to conduct and produce a photocurrent. The energy of a photon is inversely proportional to its wavelength, and only photons with energy greater than the bandgap can be detected, which is why the required semiconductor material depends on the wavelength of the light. Silicon is the most common and least expensive semiconductor to manufacture, and it responds to visible and NIR wavelengths up to around 1000 nm. To detect wavelengths in the SWIR range, rarer III/V semiconductor alloys are needed; materials such as InGaAs can detect wavelengths from roughly 1000 nm to 2500 nm.
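The relationship described above can be written as lambda_cutoff ≈ hc/Eg: a photon is only absorbed if its energy exceeds the bandgap. The short sketch below evaluates this for textbook room-temperature bandgap values of silicon and lattice-matched InGaAs; the exact cutoff shifts with alloy composition and temperature, so treat the numbers as indicative.

```python
# Hedged sketch: longest detectable wavelength for a given semiconductor bandgap.
# lambda_cutoff = h*c / E_g  (photons with less energy than E_g are not absorbed)
H = 6.626e-34          # Planck constant, J*s
C = 299_792_458.0      # speed of light, m/s
EV_TO_J = 1.602e-19    # electron-volt to joule

def cutoff_wavelength_nm(bandgap_ev: float) -> float:
    return H * C / (bandgap_ev * EV_TO_J) * 1e9

# Textbook room-temperature bandgaps (indicative values only).
for material, eg_ev in (("Si", 1.12), ("InGaAs (lattice-matched)", 0.75)):
    print(f"{material:25s} Eg = {eg_ev:.2f} eV -> cutoff ~{cutoff_wavelength_nm(eg_ev):4.0f} nm")
```

This is consistent with silicon responding up to around 1000 nm in practice (its efficiency rolls off well before the theoretical cutoff) and with InGaAs alloys extending detection into the SWIR band.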

Early LiDARs used PIN photodiodes as sensors. A PIN photodiode has no internal gain, so weak signals cannot be easily detected. Avalanche photodiodes (APDs), the most commonly used sensor type in LiDAR today, provide moderate gain. However, like PIN photodiodes, APDs operate in linear mode and integrate the arriving photon signal; they also suffer from part-to-part nonuniformity and require very high bias voltages. The newest sensors in LiDAR, now seeing wider adoption, are based on single-photon avalanche diodes (SPADs), which have very large gain and can produce a measurable current output from each detected photon. Silicon photomultipliers (SiPMs) are silicon-based SPAD arrays with the added advantage of being able to distinguish single photons from multiple photons by the amplitude of the generated signal.
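The photon-number discrimination of a SiPM follows from its output amplitude being roughly proportional to the number of microcells that fired. The toy model below illustrates the idea with an assumed single-photon amplitude and noise level; it is not modeled on any particular device.

```python
# Toy model: discriminating photon number from SiPM pulse amplitude.
# Assumes an idealized single-photon amplitude and Gaussian electronic noise.
import random

SINGLE_PHOTON_MV = 10.0   # assumed single-photon pulse height, millivolts
NOISE_SIGMA_MV = 1.5      # assumed electronic noise (1-sigma), millivolts

def sipm_pulse_mv(n_photons: int) -> float:
    """Amplitude of a SiPM pulse for n detected photons (toy model)."""
    return n_photons * SINGLE_PHOTON_MV + random.gauss(0.0, NOISE_SIGMA_MV)

def estimate_photon_count(amplitude_mv: float) -> int:
    """Round the amplitude to the nearest whole photon-equivalent level."""
    return max(0, round(amplitude_mv / SINGLE_PHOTON_MV))

random.seed(0)
for true_n in (1, 2, 5):
    amp = sipm_pulse_mv(true_n)
    print(f"true photons = {true_n}, amplitude = {amp:5.1f} mV, "
          f"estimated photons = {estimate_photon_count(amp)}")
```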

Figure 5: Different photodetector types used to detect signals in LiDAR

Returning to the topic of wavelength, all of these photodetector types can be built in either silicon (for NIR detection) or III/V semiconductors (for SWIR detection). Manufacturability and cost, however, are key to a technology's viability, and CMOS silicon foundries enable low-cost, high-volume production of such sensors. This, combined with their higher performance, is why LiDAR has gradually adopted SiPMs. Although APDs and SPADs exist for SWIR, they are difficult to integrate with readout logic because they are not built on silicon-based processes. Finally, since III/V-based SPAD arrays and photomultipliers (analogous to SiPMs) for SWIR have not yet been commercialized, the ecosystem is more mature for NIR wavelengths.

Laser

Photon generation is a completely different process. A semiconductor PN junction can serve as the gain medium of a laser: pumping an electrical current through the junction causes stimulated emission of photons as electrons drop to lower energy bands, producing a coherent laser beam. Semiconductor lasers are based on direct-bandgap materials such as GaAs and InP, which generate photons far more efficiently when electrons fall to lower energy states than indirect-bandgap materials such as silicon.

The two main types of lasers used in LiDAR are edge-emitting lasers (EELs) and vertical-cavity surface-emitting lasers (VCSELs). Compared with VCSELs, EELs cost less and have higher output efficiency, so they are more widely used today. But EELs are harder to package and assemble into arrays, and their wavelength drifts with temperature, forcing the detector to accept a wider band of wavelengths and therefore more ambient photons as noise.

Although newer VCSEL technologies are more expensive and less power efficient, they offer simple and efficient packaging because the beam is emitted from the top of the die. As VCSEL costs continue to fall significantly and their efficiency improves, market adoption will rise. Both EELs and VCSELs can be built for NIR and SWIR wavelengths; the key difference is that NIR wavelengths can be generated with GaAs, while SWIR wavelengths require InGaAsP. Large-scale fabs can help drive down the cost of GaAs lasers, which once again highlights the advantage of the NIR LiDAR ecosystem from a cost and supply-chain-security perspective.

Figure 6: Different laser types used in LiDAR

Laser power and eye safety

When discussing the great wavelength debate, the impact of LiDAR systems on human eye safety must be considered. The dToF LiDAR concept involves firing short, high-peak-power laser pulses into the scene along specific viewing angles. A pedestrian standing in the LiDAR's emission path must not suffer eye damage from a laser beam directed toward them. The IEC-60825 standard specifies the maximum permissible exposure to light at different wavelengths. NIR light, like visible light, passes through the cornea and reaches the retina of the human eye, whereas most SWIR light is absorbed before reaching the retina, so higher exposures are permitted.

Figure 7: IEC-60825 Eye-safe laser exposure specification

From a performance standpoint, being able to output orders of magnitude more laser power is an advantage for 1550 nm-based systems, since more photons are emitted and thus more returning photons are detected. But higher laser power also brings thermal trade-offs. It is important to note that a proper eye-safe design must be done regardless of wavelength, and it must explicitly consider the energy of each pulse and the size of the laser aperture. For 905 nm-based LiDARs, the peak power can be increased by adjusting either factor, as shown in Figure 8 below.
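The interaction between pulse energy, pulse width, and aperture can be sketched with two simple relations: peak power is pulse energy divided by pulse width, and the exposure available to an eye scales inversely with the area of the emitting aperture. The example below uses placeholder numbers only; a real eye-safety analysis must be carried out against the IEC-60825 limits themselves.

```python
# Illustrative-only sketch of the quantities that enter an eye-safety budget.
# None of the numbers below are IEC-60825 limits; they are placeholders.
import math

def peak_power_w(pulse_energy_j: float, pulse_width_s: float) -> float:
    """Peak optical power of a rectangular pulse."""
    return pulse_energy_j / pulse_width_s

def exit_exposure_j_m2(pulse_energy_j: float, aperture_diameter_m: float) -> float:
    """Pulse energy per unit area at the emitting aperture (larger aperture -> lower exposure)."""
    area = math.pi * (aperture_diameter_m / 2.0) ** 2
    return pulse_energy_j / area

# Hypothetical design: 100 nJ pulse, 1 ns wide, with two different apertures.
E_PULSE_J, PULSE_WIDTH_S = 100e-9, 1e-9
print(f"peak power: {peak_power_w(E_PULSE_J, PULSE_WIDTH_S):.0f} W")
for d_mm in (2.0, 10.0):  # assumed aperture diameters
    print(f"aperture {d_mm:4.1f} mm -> exposure {exit_exposure_j_m2(E_PULSE_J, d_mm * 1e-3):.4f} J/m^2 per pulse")
```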

Figure 8: NIR LiDAR eye-safe laser design based on different optics and laser parameters

Comparison of NIR and SWIR LiDAR systems

The discussion above focused on how much laser power can be emitted; now consider the sensor. Clearly, a higher-performance sensor that can detect weaker signals benefits the system in several ways: it enables longer range, or allows the same range to be reached with less laser power. Onsemi has developed a series of NIR LiDAR SiPMs that improve photon detection efficiency (PDE), a key measure of sensitivity. Its new RDM-series sensors achieve a market-leading PDE of 18%.

Figure 9: Process Development Roadmap for ON Semiconductor’s SiPM

To compare the performance of NIR and SWIR dToF LiDAR, we modeled the system using the same LiDAR architecture and environmental conditions but different laser and sensor parameters. The architecture is a coaxial system with a 16-channel detector array and a scanning mechanism that sweeps the full field of view, as shown in Figure 10 below. This system model has been validated in hardware, allowing us to estimate LiDAR system performance accurately.

Figure 10: System model of the dToF LiDAR sensor

Table 2: LiDAR sensor and laser parameters used in the NIR and SWIR system models

Because it can use higher laser power and a higher-PDE InGaAs sensor, the 1550 nm system achieves better ranging performance in our system simulations. The system-level parameters are 100 klux ambient light filtered by a 50 nm bandpass filter on the sensor lens (centered at approximately 905 nm and 1550 nm, respectively), an 80° horizontal scan at 30 fps with a 0.1° x 5° channel field of view, a 500 kHz laser firing rate, a 1 ns pulse width, and a 22 mm lens diameter. The results are shown below.
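To give a feel for how these parameters interact, the sketch below evaluates a highly simplified single-pulse link budget for a diffuse (Lambertian) target. It is not the validated system model of Figure 10, and the pulse energies, target reflectivity, optical efficiency, and SWIR PDE used here are assumptions; only the 22 mm lens diameter and the 18% NIR PDE come from the text.

```python
# Highly simplified single-pulse dToF link budget (Lambertian target).
# This is NOT the validated system model from the article; values are assumed.
import math

H, C = 6.626e-34, 299_792_458.0  # Planck constant (J*s), speed of light (m/s)

def signal_photons(pulse_energy_j: float, wavelength_nm: float, reflectivity: float,
                   range_m: float, lens_diameter_m: float, optics_eff: float, pde: float) -> float:
    """Expected detected signal photons per pulse from a diffuse (Lambertian) target."""
    photon_energy = H * C / (wavelength_nm * 1e-9)
    emitted_photons = pulse_energy_j / photon_energy
    lens_area = math.pi * (lens_diameter_m / 2.0) ** 2
    collection_fraction = reflectivity * lens_area / (math.pi * range_m ** 2)
    return emitted_photons * collection_fraction * optics_eff * pde

# Assumed parameters: 10% target at 200 m, 22 mm lens, 70% optical efficiency.
common = dict(reflectivity=0.10, range_m=200.0, lens_diameter_m=0.022, optics_eff=0.7)
nir  = signal_photons(pulse_energy_j=0.1e-6, wavelength_nm=905.0,  pde=0.18, **common)  # assumed 100 nJ pulse
swir = signal_photons(pulse_energy_j=1.0e-6, wavelength_nm=1550.0, pde=0.25, **common)  # assumed 1 uJ pulse, 25% PDE
print(f"905 nm : ~{nir:.1f} detected signal photons per pulse at 200 m")
print(f"1550 nm: ~{swir:.1f} detected signal photons per pulse at 200 m")
```

Even this crude model shows the pattern the simulation reports: the SWIR system's higher permissible pulse energy buys it more returned photons, while the NIR system still collects enough signal to range at automotive distances.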

Figure 11: Simulation results of similar LiDAR systems based on 905 nm and 1550 nm

As expected, the 1550 nm system can range low-reflectivity objects at greater distances, out to 500 meters with a 99% probability of detection. However, the 905 nm-based system still achieves more than 200 meters, which shows that both types of system can meet the requirements of automotive long-range LiDAR under typical environmental conditions. In harsh conditions such as rain or fog, water absorption of SWIR light degrades its performance faster than that of NIR-based systems, which is another consideration.

Cost Considerations

Having examined the technology used in LiDAR systems and the impact of the different wavelengths at length, we now return to cost. As explained earlier, NIR LiDAR sensors use a native CMOS silicon foundry process, which minimizes semiconductor cost. In addition, using stacked-die technology that foundries have now mastered, the CMOS readout logic and the sensor can be integrated into a single chip, which further compresses the signal chain and reduces cost.

In contrast, SWIR sensors require higher-cost III/V semiconductor processes (such as InGaAs). Emerging Ge-Si hybrid technology reduces SWIR sensor cost and eases the integration of readout logic, but even once that technology matures it is estimated to remain more than five times as expensive as conventional CMOS silicon. On the laser side, the difference in wafer size between the GaAs wafers used for NIR laser chips and the InGaAsP wafers used for SWIR laser chips also creates a cost gap, and the fact that NIR systems can use VCSELs, for which there are more off-the-shelf suppliers, further reduces integration cost.

Combining these factors, an analysis by IHS Markit (Amsrud, 2019) showed that SWIR components cost 10 to 100 times more than their NIR counterparts of the same type (sensor or laser). In 2019 the average total component cost of the sensors and lasers in an NIR system was estimated at $4-20 per channel, expected to drop to $2-10 per channel by 2025. In comparison, the average total component cost of the sensors and lasers in a SWIR system was estimated at $275 per channel in 2019, falling to $155 per channel by 2025. Since LiDAR systems contain many channels, this is a huge cost difference; even with a 1D scanning approach, a vertical array of single-point channels is still required.
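Multiplying the per-channel figures above by a channel count shows the system-level gap. The arithmetic sketch below assumes a hypothetical 16-channel design (the same channel count as the model in Figure 10) and uses the midpoints of the quoted NIR ranges; both choices are assumptions for illustration.

```python
# Rough system-level cost comparison built from the per-channel figures in the text.
# The 16-channel count and the use of NIR range midpoints are assumptions.
CHANNELS = 16

# Per-channel sensor + laser component cost in USD (IHS Markit figures cited above).
nir_2019, nir_2025 = (4 + 20) / 2, (2 + 10) / 2   # midpoints of $4-20 and $2-10
swir_2019, swir_2025 = 275, 155

for year, nir, swir in (("2019", nir_2019, swir_2019), ("2025", nir_2025, swir_2025)):
    print(f"{year}: NIR ~${nir * CHANNELS:>6.0f}  vs  SWIR ~${swir * CHANNELS:>6.0f}  "
          f"({swir / nir:.0f}x more expensive)")
```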

Table 3: Summary of Cost Considerations (Source: IHS Markit)

LiDAR market dynamics are also unfavorable for the SWIR camp. The autonomous driving market is not developing as rapidly as was expected five years ago, and the Level 4 and Level 5 autonomous systems that must use LiDAR are still years away from large-scale deployment. Meanwhile, the industrial and robotics markets that use LiDAR are more cost-sensitive and do not require the ultra-high performance of SWIR systems, so SWIR suppliers cannot count on rising production volumes to drive down component costs. Costs fall when volumes rise, but volumes only rise once costs fall: a classic chicken-and-egg problem.

Summary

After delving into the technology and the differences between NIR and SWIR systems, it becomes apparent why the vast majority of LiDAR systems today use NIR wavelengths. While the outlook is not 100% certain, it is clear that cost and availability are key considerations across the supplier ecosystem, and NIR systems are undoubtedly more economical thanks to the technical advantages and economies of scale of CMOS silicon. While SWIR enables very long-range LiDAR systems, NIR-based LiDAR can also meet automotive long-range requirements, and it performs well in the short- and medium-range configurations that ADAS and AD applications demand.

NIR-based LiDAR has already reached high-volume production in the automotive market, showing that the technology is commercialized and has passed the market test, but industry consolidation will still take time, and there will be winners and losers along the way. There were about 30 different manufacturers in the auto industry at the turn of the 20th century, a number that grew to nearly 500 over the following decade, yet only a few years later most of them had disappeared. LiDAR manufacturers are expected to go through a similar evolution by 2030.

References

Yole Développement (2020). LiDAR for Automotive and Industrial Applications – Market and Technology Report 2020.

Amsrud, P. (2019, September 25). The race to a low cost LIDAR system [Conference presentation]. Automotive LIDAR 2019, Detroit, MI, United States. IHS Markit.

Nick84 (2013). CC BY-SA 3.0, via Wikimedia Commons.
