1. Introduction
Advanced driver-assistance systems (ADAS) are currently an area of focus in the automotive industry. Modern vehicles are equipped with various ADAS that increase the driver’s comfort and safety, as depicted in Figure 1. According to the German Federal Statistical Office, fatalities in road accidents in Germany dropped from 21,330 in 1970 to 2562 in 2021 [1], despite the tremendous increase in the number of motor vehicles over these years. Figure 2 summarizes the role of ADAS in reducing the number of fatalities in traffic accidents.
The complexity of ADAS is also increasing rapidly, and the validation of such systems is becoming challenging. Validating such complex systems in the real world is expensive and time-consuming. According to statistical studies by the RAND Corporation, 5 billion km of test driving would be required to demonstrate that the failure rate of autonomous vehicles is lower than that of human drivers [4]. A similar statistical analysis in [5] shows that 240 million km of test driving would be required for the verification of ADAS, which is not feasible to attain. Several validation processes have been adopted to overcome this challenge: model-in-the-loop (MiL), hardware-in-the-loop (HiL), and software-in-the-loop (SiL) [5]. Moreover, we see increasing efforts from academia and industry with regard to virtual validation processes. Research projects such as VIVID [6], DIVP [7], VVM [8], and SET Level [9] endorse the same concept. In addition, the automotive industry has started considering type approval based on virtual tests [10]. The complexity of ADAS also requires joint efforts and collaboration of industrial players from different domains, which is only possible with an effective exchange of models without intellectual property (IP) violation [11]. Therefore, open standards and interfaces have been introduced, including the open simulation interface (OSI) and the functional mock-up interface (FMI) [11,12].
The virtual environment and the environmental perception sensor models must reproduce the complexity and behavior of real-world scenarios, and sensor models are essential elements and enablers in all these activities [13]. However, the state-of-the-art virtual scenarios and environmental perception sensor models provided by simulation tool vendors typically offer a generic, parameterizable model of the optical or electromagnetic wave propagation but often do not consider sensor-specific effects in detail. Moreover, no commonly accepted metrics or standards exist to prove the fidelity of virtually developed environmental perception sensor models and scenarios [14].
This paper contributes to ADAS virtual testing and validation with a tool-independent, high-fidelity LiDAR sensor model. The proposed model is developed using the standardized OSI 3.0.2 and FMI 2.0. It was integrated successfully into the virtual environments of CarMaker from IPG Automotive and AURELION from dSPACE to verify its exchangeability [15,16]. The operational performance of the LiDAR FMU model is the same irrespective of the tool used. The presented virtual sensor includes the complete signal processing toolchain of the Blickfeld Cube 1 LiDAR sensor. Moreover, it considers optical, electrical, and environmental effects to generate a realistic output. Furthermore, a real-world static test scenario is accurately reconstructed in the virtual environment of CarMaker. Finally, real and virtual test results are compared to verify the fidelity of the LiDAR sensor model at the time-domain and point cloud levels, and key performance indicators (KPIs) are defined to assess the accuracy of the sensor model at the point cloud level.
The paper is structured as follows: Section 2 describes the LiDAR sensor background. An overview of state-of-the-art LiDAR sensor models is given in Section 3. Section 4 describes the modeling approach of the proposed LiDAR sensor model. The specifications of the LiDAR FMU modules are explained in Section 5. The results are discussed in Section 6. Finally, Section 7 and Section 8 provide the conclusion and outlook.
2. Background
LiDAR is a range measurement technique that has been used in the military and aviation fields for many decades. Since the first realization of an autonomous vehicle on the road, original equipment manufacturers (OEMs) have enhanced vehicles’ autonomous capabilities by installing different ADAS [17]. As a result, LiDAR technology has become indispensable for autonomous driving (AD) due to its better angular resolution and field of view (FoV) compared with RADAR [18].
LiDAR Working Principle
The LiDAR sensor measures the round-trip delay time (RTDT) that laser light takes to reach an object and return, and uses it to calculate the range [19], as depicted in Figure 3.
We have
R = \frac{c \cdot \tau}{2}, (1)
where R denotes the target range, c is the speed of light, and τ is the RTDT, also known as the time of flight (ToF). The LiDAR sensor measures the range and, together with the spatial laser beam deflection, the position by using pulsed or modulated waveforms [19].
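As a quick numerical illustration (the delay value is chosen here for illustration only, not taken from the sensor data sheet), a round-trip delay of τ = 100 ns corresponds to
R = \frac{c \cdot \tau}{2} = \frac{2.998 \times 10^{8}\,\mathrm{m/s} \cdot 100\,\mathrm{ns}}{2} \approx 15\,\mathrm{m}.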
3. State of the Art
Automotive perception sensor models can be divided into three categories depending on their modeling approach and the effects they cover: ideal, phenomenological, and physical models [20].
Ideal sensor models, also known as “ground truth” sensor models (ground truth provides the simulated objects’ actual values, dimensions, positions, velocities, orientations, and bounding boxes), use the object list provided by the simulation framework in the world coordinate system as an input. The term ground truth is borrowed from remote sensing, where it refers to information collected on location for data calibration [21]. The output of these models is an object list filtered according to the sensor-specific FoV [22]. Ideal LiDAR sensor models are presented in [23,24]; they do not consider any sensor-related imperfections except FoV and object occlusion. Therefore, these models have low complexity, require little computation time, and can test highly automated driving (HAD) functions in the early stage of development. It should be noted that the ideal models described in the literature are mostly generic, and they can fulfill the requirements of different environmental perception sensor types, including LiDAR, RADAR, and camera [23,24]. OSI provides the osi3::GroundTruth interface for such models.
Phenomenological sensor models use the object list as an input and apply weather conditions, false alarms (positive/negative), detection probability, and sensor-related effects, including FoV and limited detection range. This type of sensor model outputs either raw data (point clouds) for LiDAR sensors or object lists [25]. Muckenhuber et al. [26] proposed a generic sensor model that requires a ground truth object list as an input. The model’s output is a sensor-specific object list, considering FoV, object class definition, occlusion, and the probability of false positive and false negative detections. Linnhoff et al. [27] introduced an object-based LiDAR sensor model that outputs object lists and considers partial occlusion of objects, limitation of the angular view, and the decrease in effective range due to atmospheric attenuation. Hirsenkorn et al. [25] presented a generic non-parametric statistical methodology to replicate the behavior of a real-world sensor. The developed model includes various sensor errors, including ranging errors, latency, false positives, and occlusion. The model’s output can be either an object list or raw data. A LiDAR model consisting of geometrical and physical submodules is presented in [28,29]. The input of the geometrical model is the object list, and it covers occlusion, FoV, and beam divergence. The model’s output is an object list that can be extended to point clouds.
Physical sensor models are based on physical aspects and are numerically complex. Hence, they require substantial computational power and thus might not be real-time capable. These models use the rendering techniques provided by the simulation framework as input and generate raw data (point clouds) as output, containing distance, intensity, and timestamp. Several rendering techniques can generate synthetic LiDAR sensor raw data: ray tracing, ray casting, rasterization (z-buffers), and ray path [14,30]. Philipp et al. [31] developed a ray casting-based LiDAR sensor model to generate point clouds. Their model includes beam divergence, signal-to-noise ratio (SNR), detection threshold, and material surface reflection properties. Gschwandtner et al. [32] introduced the Blender sensor simulation (Blensor), an open-source LiDAR plugin. The Blensor toolkit uses the ray casting mechanism of Blender, and it considers sensor noise, the materials’ physical properties, and free space path losses (FSPL). Goodin et al. [33] established a ray casting LiDAR sensor model and incorporated it into the virtual autonomous navigation environment (VANE). The model can simulate the effects of beam divergence and a Gaussian beam profile. In [34], the open-source multi-purpose LiDAR simulator HELIOS is proposed. HELIOS exploits a ray casting approach and provides the scan patterns of four beam deflection units: fiber array, rotating mirror, oscillating mirror, and conic mirror. The effects of beam divergence, atmospheric attenuation, scanner efficiency, and material surface properties are also modeled. Hanke et al. [35] apply a ray tracing rendering technique to generate synthetic point clouds. Their model includes beam divergence, material reflection properties, detection threshold, noise effects, and atmospheric attenuation. Li et al. [28] developed a physical sensor model that requires ray tracing data as an input. In addition, the model contains beam divergence and power loss due to rain, fog, snow, and haze. Zhao et al. [29] extend the work of [28]; their model takes into account the probability of false alarms due to backscattering from water droplets. The physical effects of beam divergence and object surface reflectivity were also studied by Goodin et al. [36], who analyzed the LiDAR signal attenuation and range error due to rainfall. Commercial and open-source simulation platforms also provide LiDAR sensor models with different fidelity levels.
CARLA is an open-source simulation environment that offers a LiDAR model simulating laser rays by ray casting. The CARLA LiDAR model takes into consideration physical effects, noise, the drop-off in intensity, and the reduction in the number of point cloud returns due to external perturbations [37]. CarMaker provides a real-time capable ray tracing-based LiDAR model known as the LiDAR raw signal interface (LiDAR RSI). The LiDAR RSI regards propagation losses, object geometry, material surface properties, and the incidence angle of the ray for the intensity calculation of point clouds [20]. DYNA4 from Vector offers a LiDAR model that uses a ray casting technique and outputs raw data intensity based on physical effects, material surface reflectivity, and the ray’s angle of incidence [38]. The LiDAR model of VTD from Vires operates on a ray tracing engine from NVIDIA [14,35]. The AURELION LiDAR model is also based on ray tracing; it considers material surface reflectivity, noise, atmospheric attenuation, and the fast-motion scan effect [39].
This work categorizes the ideal and phenomenological sensor models as low fidelity because of their simplified modeling approach and covered effects. The abovementioned physical sensor models simulate point clouds according to ray tracing or ray casting detections but do not consider sensor-specific effects in detail. Such effects include optical losses, inherent detector effects, effects generated by the electrical amplification, and noise produced by sunlight. That is why we classify them as medium-fidelity LiDAR sensor models. In contrast, the sensor model proposed in this work takes ray tracing detections as input and applies the sensor-specific imperfections of the LiDAR sensor to output realistic time domain and point cloud data. The sensor model thus covers the complete signal processing steps of an actual LiDAR sensor, which is why it is classified as a high-fidelity sensor model.
The imperfections implemented in the proposed high-fidelity model significantly impact LiDAR sensor performance. To demonstrate this, we compared the presented LiDAR model and a state-of-the-art LiDAR model from commercial software with real measurements at the point cloud level. In addition, the presented LiDAR sensor model is validated at the time domain level. An overview of the working principles, covered effects, and validation approaches of the LiDAR sensor models presented in this section is given in Table 1.
4. LiDAR Modeling Building Blocks
The modeling and simulation of ADAS sensors is a complex, multi-domain, concurrent, and distributed task, requiring expert teams from different disciplines to develop and validate the models. However, the rapid development of such a system is possible if every participant prepares their partial solution and integrates it easily with the other partners’ solutions [41].
4.1. Open Standards
As stated in Section 1, open standards have gained significant interest from the automotive industry in the last few years because they allow the accessible exchange of simulation models between different tools. The LiDAR model developed in this work is intended for industrial use. Therefore, we used the standardized interfaces FMI and OSI to make it tool-independent. We verified the interchangeability of the proposed sensor model by successfully integrating it into the co-simulation environments of CarMaker and AURELION.
The co-simulation framework provides the flexibility to couple more than one simulation model by using the standardized interfaces FMI and OSI [41].
4.2. Functional Mock-Up Interface
As mentioned earlier, generic, standardized interfaces are required in the automotive industry to exchange models between different simulation tools without IP infringement [41]. FMI is a solution to this industrial need. It was an initiative of Daimler AG with the primary aim of improving the exchange of simulation models between OEMs and suppliers [11].
The FMI standard 2.0 contains two types of protocols [42]:
FMI for Model Exchange is an interface to a dynamic system model described by differential, algebraic, and discrete-time equations. Such models do not contain a solver; the simulation environment generates their C code, which other simulation tools can use in online and offline simulations.
FMI for Co-Simulation is an interface to connect two or more simulation tools and subsystems in a co-simulation environment. Each subsystem in the co-simulation has its own solver and solves independently between two communication steps. A master algorithm steers the data exchange between the subsystems (slaves).
This paper focuses on the stand-alone FMI for Co-Simulation use case. Stand-alone co-simulation can be applied in different configurations, as given in [42]; here we consider the single-process use case shown in Figure 4, in which master and slave run in the same process. The master controls the coupled simulation, and the slave consists of the model and its solver. An FMU is the component that implements the FMI interface [11].
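To make the master–slave interplay concrete, the following minimal sketch drives a co-simulation FMU from Python with the open-source FMPy library. FMPy is not part of the toolchain of this work; the file name lidar_model.fmu, the step size, and the stop time are illustrative assumptions.

```python
# Minimal FMI 2.0 co-simulation master loop (sketch using FMPy, not the toolchain of this work).
from fmpy import read_model_description, extract
from fmpy.fmi2 import FMU2Slave

fmu_path = 'lidar_model.fmu'                      # hypothetical FMU file name
md = read_model_description(fmu_path)
unzipdir = extract(fmu_path)

# The slave: model plus its own solver, packaged as an FMU.
slave = FMU2Slave(guid=md.guid,
                  unzipDirectory=unzipdir,
                  modelIdentifier=md.coSimulation.modelIdentifier,
                  instanceName='lidar_fmu')
slave.instantiate()
slave.setupExperiment(startTime=0.0)
slave.enterInitializationMode()
slave.exitInitializationMode()

# The master steers the communication points (assumed 10 ms steps for 1 s).
t, dt, t_stop = 0.0, 0.01, 1.0
while t < t_stop:
    slave.doStep(currentCommunicationPoint=t, communicationStepSize=dt)
    t += dt

slave.terminate()
slave.freeInstance()
```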
4.3. Open Simulation Interface
As mentioned earlier, virtual validation of ADAS and AD is indispensable. Therefore, logical interfaces from the virtual sensors to the ADAS are also required. OSI is the first reference implementation of ISO 23150 for the virtual development and validation of ADAS systems and sensors and was endorsed in the Pegasus project [43]. OSI is a generic interface that uses the protocol buffer message format developed by Google to exchange information between environment simulation tools, ADAS sensor models, and the ADAS [12]. It also provides the flexibility to integrate sensor models into a co-simulation environment by using the so-called OSI sensor model packaging (OSMP) [44]. The OSMP FMUs transfer the positions and sizes of the Google protobufs to exchange data between the simulation environment and the sensor model [45]. For a detailed description of the OSI interfaces, the reader is referred to [12].
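For illustration, the snippet below assembles a minimal osi3::SensorView message with a single LiDAR reflection using the Python protobuf bindings generated from the OSI .proto files. The field names follow the OSI 3.x definitions to the best of our knowledge, and all numeric values are placeholders rather than data from this work.

```python
# Sketch of an OSI message exchange (requires the open-simulation-interface Python bindings).
from osi3.osi_sensorview_pb2 import SensorView

sensor_view = SensorView()
sensor_view.version.version_major = 3          # OSI major version used by the model
sensor_view.timestamp.seconds = 0
sensor_view.timestamp.nanos = 0

# One LiDAR reflection: time delay and relative power, as exchanged between the
# environment simulation and the sensor model (placeholder values).
lidar_view = sensor_view.lidar_sensor_view.add()
reflection = lidar_view.reflection.add()
reflection.time_of_flight = 66.7e-9            # [s]
reflection.signal_strength = -60.0             # relative power [dB]

# OSMP-style exchange: the serialized buffer is what the FMU passes by pointer and size.
payload = sensor_view.SerializeToString()
print(len(payload), "bytes")
```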
5. LiDAR Sensor Model
Figure 5 depicts the toolchain and the signal processing steps of the proposed LiDAR model. The sensor model considers the scan pattern and the complete signal processing steps of the Blickfeld Cube 1. As mentioned in Section 1, the model itself is built as an OSMP FMU and uses the virtual environment of CarMaker, which provides a ray tracing framework with a bidirectional reflectance distribution function (BRDF) that considers the direction of the incident ray, the material surface, and the color properties [46]. The LiDAR FMU model uses the ray tracing module of CarMaker. The material properties of the simulated objects, the angle-dependent spectral reflectance, and the reflection types, including diffuse, specular, retroreflective, and transmissive, are specified in the material library of CarMaker.
The FMU controller passes the required input configuration to the simulation framework via osi3::LidarSensorViewConfiguration. The simulation tool verifies the input configuration and provides the ray tracing detections, i.e., the time delay and relative power of each reflection, via the osi3::LidarSensorView::reflection interface [45].
Afterward, the FMU controller calls the LiDAR simulation library and passes the ray tracing data for further processing. The central component of the simulation library is the simulation controller. It is the primary interface for interacting with the library, for instance, configuring the simulation pipeline, inserting ray tracing data, executing the simulation steps, and retrieving the results.
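The interaction cycle of the simulation controller can be sketched as follows. The class and method names are hypothetical and only illustrate the configure/insert/step/retrieve pattern described above; they are not the actual API of the simulation library.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple


@dataclass
class SimulationController:
    """Hypothetical stand-in for the central simulation-library component."""
    pipeline: List[Callable] = field(default_factory=list)                 # link budget -> detector -> circuit -> ranging
    _reflections: List[Tuple[float, float]] = field(default_factory=list)  # (time delay [s], relative power)

    def configure(self, *stages: Callable) -> None:
        # Assemble the processing pipeline in the order the stages should run.
        self.pipeline = list(stages)

    def insert_reflections(self, reflections) -> None:
        # Ray tracing detections handed over by the FMU controller.
        self._reflections = list(reflections)

    def step(self):
        # Push the current reflections through every pipeline stage and return the result.
        data = self._reflections
        for stage in self.pipeline:
            data = stage(data)
        return data


# Usage sketch with trivial placeholder stages; the last stage converts ToF to range.
controller = SimulationController()
controller.configure(lambda refl: refl,   # link budget (placeholder)
                     lambda refl: refl,   # detector (placeholder)
                     lambda refl: refl,   # circuit (placeholder)
                     lambda refl: [(0.5 * 2.998e8 * t, p) for t, p in refl])
controller.insert_reflections([(66.7e-9, 0.8)])
print(controller.step())                  # approximately [(10.0, 0.8)]
```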
The next block in the pipeline is the link budget module, which calculates the received photons over time. The task of the detector module is to capture these photon arrivals and convert them into an electrical current signal. In the proposed LiDAR model, we have implemented a silicon photomultiplier (SiPM) as the detector [47], but the model can also support avalanche photodiode (APD) and single-photon avalanche diode (SPAD) detector models.
The third block in the pipeline is the circuit module. Its task is to amplify the detector’s photocurrent signal and convert it into a voltage signal that is processed by the ranging module.
The last part of the toolchain is the ranging module, which determines the range and intensity of the target for every reflected scan point based on the voltage signal received from the analog circuit. Finally, the effect engine (FX engine) is a set of interfaces through which environmental or sensor-related effects interact with the blocks of the simulation pipeline. These interactions can involve, for example, thermal noise in electrical components, signal attenuation due to weather phenomena, and backscattering. It should be noted that sunlight is the only environmental effect considered in this paper.
The following subsections give a detailed description of the scan pattern and the components of the LiDAR simulation library.
5.1. Scan Pattern
The described LiDAR FMU model uses the elliptical scan pattern of the Blickfeld Cube 1, as given in Figure 6. The Cube 1 comprises a single laser source that emits laser pulses and a beam deflection unit, the so-called scanner, which deflects the beam to obtain an image of the environment. The scanner deflects the laser beam using two 1D microelectromechanical system (MEMS) mirrors, oriented horizontally and vertically and operated with a fixed phase difference [48]. The block diagram of the MEMS LiDAR sensor is shown in Figure 7. The scan pattern is imported into the LiDAR FMU via the FMU controller.
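The sketch below illustrates, in a strongly simplified way, how two sinusoidally driven 1D mirrors with a fixed phase offset trace an elliptical scan figure. The frequencies, amplitudes, and phase value are illustrative assumptions and do not reproduce the actual Cube 1 scan pattern.

```python
import numpy as np

# Illustrative mirror parameters (not the real Cube 1 values).
f_h, f_v = 250.0, 10.0        # horizontal / vertical mirror frequencies [Hz]
amp_h, amp_v = 36.0, 15.0     # half field of view per axis [deg]
phase = np.pi / 2             # assumed phase offset between the two mirrors

t = np.linspace(0.0, 0.1, 20000)                         # 100 ms of scanning
azimuth = amp_h * np.sin(2 * np.pi * f_h * t)            # horizontal deflection [deg]
elevation = amp_v * np.sin(2 * np.pi * f_v * t + phase)  # vertical deflection [deg]

# Each (azimuth, elevation) pair is one laser firing direction of the scan pattern.
print(azimuth[:3], elevation[:3])
```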
5.2. Link Budget Module
The relative power obtained from the ray tracing module does not consider the environmental condition of sunlight or the optical losses. The link budget module covers these effects, and the received power can be given as
P_R = \frac{P_T \cdot \rho_t \cdot \cos\theta \cdot D^2}{4 R^2} \cdot \eta_{opt} \cdot \eta_{atm}, (2)
where ρ_t is the target reflectivity, D denotes the diameter of the optical aperture, R is the target range, the direction of the incident ray is given by θ, the receiver optics loss factor is given by η_opt, η_atm is the atmospheric loss factor, and the transmit power is denoted by P_T [50]. The total power P_total(t) received by the detector over time can originate from different sources, including the internal reflection P_int(t), the target receive power P_R(t), and the sunlight power P_sun. Therefore, P_total(t) can be given as
P_{total}(t) = P_{int}(t) + P_R(t) + P_{sun}. (3)
The sunlight power P_sun can be calculated as
P_{sun} = \frac{E_{sun} \cdot \Delta\lambda \cdot A_{spot} \cdot \rho_t \cdot \eta_{opt} \cdot D^2}{4 R^2}, (4)
where the illuminated area of the laser spot on the target is A_spot, Δλ denotes the optical bandwidth of the bandpass daylight filter, the optical loss factor is given by η_opt, R is the target distance, and E_sun is the solar spectral irradiance at air mass 1.5 [50]. In this work, the E_sun values are taken from the ASTM G173-03 standard [51]. It is possible to model the optics at the photon level based on the power equation to make the simulation more accurate. For this approach, the power signal must be sampled with a time interval of Δt [47]. Then, the sampled power equation takes the form
P_{total}[k] = P_{total}(k \cdot \Delta t), (5)
with k = 0, 1, 2, …. The mean number of photons incident on the SiPM detector within one time bin can be written as
\mu[k] = \frac{P_{total}[k] \cdot \Delta t}{h \nu}, (6)
where hν is the energy of a single laser photon at the laser’s wavelength, h is the Planck constant, and ν is the frequency of the photon [52]. The SiPM detector exhibits Poisson-distributed shot noise due to the statistical arrival of photons. That is why the number of photons N[k] arriving within a time bin can be modeled as a Poisson process [53]:
P(N[k] = n) = \frac{\mu[k]^{n}}{n!} \cdot e^{-\mu[k]}. (7)
The output of the link budget module is given in Figure 8.
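A compact numerical sketch of Equations (5)-(7): a sampled optical power trace is converted into an expected photon count per time bin and then drawn from a Poisson distribution. The wavelength, bin width, and power levels are illustrative assumptions, not the modeled Cube 1 parameters.

```python
import numpy as np

h = 6.626e-34          # Planck constant [J s]
c = 2.998e8            # speed of light [m/s]
wavelength = 905e-9    # assumed laser wavelength [m]
dt = 1e-10             # assumed sampling interval Delta t [s]

rng = np.random.default_rng(seed=0)

# Toy total received power P_total[k]: constant sunlight background plus a Gaussian target echo.
t = np.arange(0.0, 200e-9, dt)
p_total = 5e-9 + 2e-6 * np.exp(-0.5 * ((t - 66.7e-9) / 2e-9) ** 2)   # [W]

# Eq. (6): mean number of photons per bin, with photon energy h*nu = h*c/lambda.
photon_energy = h * c / wavelength
mean_photons = p_total * dt / photon_energy

# Eq. (7): shot noise -- the actual photon arrivals per bin are Poisson distributed.
photon_counts = rng.poisson(mean_photons)
print(mean_photons.max(), photon_counts.max())
```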
5.3. SiPM Detector Module
We have implemented an SiPM detector module that provides an output current proportional to the number of detected photons [47]. In contrast to the SPAD, the SiPM detector yields better multi-photon detection sensitivity, photon number resolution, and an extended dynamic range [54,55]. The SiPM detector response for a given photon signal N[k] can be calculated as
I_{det}[k] = S \cdot \sum_{m} N[m] \cdot h_{det}\big((k - m) \cdot \Delta t\big), (8)
where S is the SiPM detector sensitivity and the impulse response of the detector is given by h_det(t), with
h_{det}(t) = e^{-t / \tau_{rec}}, \quad t \geq 0, (9)
where τ_rec is the SiPM recovery time [47,54,55].
The output of the SiPM detector module is given in Figure 9.
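A sketch of Equations (8) and (9): the photon counts are convolved with an exponential single-photon response and scaled by a sensitivity factor. The recovery time and the sensitivity are illustrative assumptions, and the purely linear convolution ignores SiPM saturation effects.

```python
import numpy as np

dt = 1e-10            # assumed sampling interval [s]
tau_rec = 15e-9       # assumed SiPM recovery time [s]
sensitivity = 1e-6    # assumed detector sensitivity [A per detected photon]

# Photon counts per bin, e.g., from the link budget / Poisson step (toy input here).
rng = np.random.default_rng(seed=0)
photon_counts = rng.poisson(0.2, size=2000)

# Eq. (9): exponential impulse response of the detector, truncated after five recovery times.
t_imp = np.arange(0.0, 5 * tau_rec, dt)
h_det = np.exp(-t_imp / tau_rec)

# Eq. (8): detector current as the sensitivity-scaled convolution of arrivals and impulse response.
i_det = sensitivity * np.convolve(photon_counts, h_det)[: photon_counts.size]
print(i_det[:5])
```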
5.4. Circuit Module
We use the small-signal transfer function H(f) of the analog circuit model to obtain the voltage signal V[k],
V[k] = V_{op} + \mathrm{IDFT}\big\{\mathrm{DFT}\{I_{det}[k]\} \cdot H[f]\big\}, (10)
where V_op is the operating voltage of the circuit model, IDFT denotes the inverse discrete Fourier transform, DFT denotes the discrete Fourier transform, and H[f] is the small-signal transfer function of the circuit model [47]. The output voltages of the circuit module are given in Figure 10.
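A sketch of Equation (10): the detector current is filtered in the frequency domain with a small-signal transfer function and shifted by the operating voltage. The first-order low-pass transfer function, its gain and bandwidth, and the operating voltage are illustrative assumptions, not the transfer function of the modeled circuit.

```python
import numpy as np

dt = 1e-10          # assumed sampling interval [s]
v_op = 1.0          # assumed operating-point voltage [V]
r_trans = 1e4       # assumed transimpedance gain [V/A]
f_cut = 200e6       # assumed -3 dB bandwidth [Hz]

# Toy detector current from the previous stage.
rng = np.random.default_rng(seed=0)
i_det = 1e-6 * rng.poisson(0.2, size=2000)

# Frequency axis matching the real FFT of the current signal.
freqs = np.fft.rfftfreq(i_det.size, d=dt)

# Assumed small-signal transfer function: first-order low pass with transimpedance gain.
h_f = r_trans / (1.0 + 1j * freqs / f_cut)

# Eq. (10): V = V_op + IDFT{ DFT{I_det} * H(f) }.
v_out = v_op + np.fft.irfft(np.fft.rfft(i_det) * h_f, n=i_det.size)
print(v_out[:5])
```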
5.5. Ranging Module
The ranging algorithm takes the voltage signal from the circuit module as its input. It then calculates the target range and the signal intensity for each scan point. The range is given in meters, while the intensity is mapped linearly to an arbitrary integer scale from 0 to 4096, as used in the Cube 1 products.
The algorithm applies several threshold levels to distinguish between internal reflection, noise, and target peaks. The target range is determined from the position of the target peak relative to the internal reflection, while the signal intensity is calculated from the peak voltage levels. The output of the ranging module is given in Figure 11.
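The ranging step can be sketched as threshold-based peak detection on the voltage trace followed by a ToF-to-range conversion. The synthetic trace, the detection threshold, the choice of the first peak as the internal-reflection reference, and the intensity scaling are illustrative assumptions and do not reproduce the actual Cube 1 ranging algorithm.

```python
import numpy as np
from scipy.signal import find_peaks

dt = 1e-10                      # assumed sampling interval [s]
c = 2.998e8                     # speed of light [m/s]

# Toy voltage trace: internal reflection at 5 ns and a target echo 66.7 ns later.
t = np.arange(0.0, 200e-9, dt)
v = (0.001 * np.random.default_rng(0).standard_normal(t.size)   # noise floor
     + 0.5 * np.exp(-0.5 * ((t - 5e-9) / 1e-9) ** 2)             # internal reflection
     + 0.08 * np.exp(-0.5 * ((t - 71.7e-9) / 2e-9) ** 2))        # target echo

# Assumed detection threshold and prominence to separate noise from real peaks.
peaks, props = find_peaks(v, height=0.02, prominence=0.02)
internal, target = peaks[0], peaks[1]            # first peak taken as the internal reference

tof = (target - internal) * dt                   # round-trip delay between the two peaks
range_m = 0.5 * c * tof                          # Eq. (1)
intensity = int(props["peak_heights"][1] / v.max() * 4096)   # map to the 0..4096 integer scale
print(f"range = {range_m:.2f} m, intensity = {intensity}")
```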
6. Results
The accuracy of the model presented in this paper is assessed at two interfaces: the time domain and the point cloud level. We used a single point scatterer to validate the model in the time domain.
6.1. Validation of the Model in the Time Domain
The primary reason to verify the LiDAR model in the time domain is to confirm that the link budget, detector, and circuit modules work as intended. Furthermore, comparing the time domain signals (TDS) establishes the association between measured and modeled noise and amplitude levels, because it is difficult to compare the simulated and measured noise at the point cloud level.
A 10% diffuse reflective Lambertian plate is placed at distances of 10 m, 15 m, 20 m, 25 m, 30 m, 35 m, and 40 m in front of the sensor, as shown in Figure 12. To verify the model at the time domain level, only a single point scatterer on the surface of the target is considered. The comparison of the simulated and real measured TDS and their amplitude differences are shown in Figure 13 and Figure 14, respectively. The amplitude difference can be written as
\Delta V = V_{FMU} - V_{meas}, (11)
where V_FMU is the LiDAR FMU TDS and V_meas denotes the measured TDS of the LiDAR sensor. It can be seen that the target peaks, their shape, and the noise level match quite well. However, a difference in amplitude can be observed at the different distances, as given in Figure 14. This is because we use the small-signal transfer function rather than a full analog circuit model, to limit the computational burden. To quantify the amplitude difference ΔV, we use the mean absolute percentage error (MAPE) metric
\mathrm{MAPE} = \frac{100\%}{n} \sum_{i=1}^{n} \left| \frac{y_i - \hat{y}_i}{y_i} \right|, (12)
where ŷ_i is the simulated value, the measured value is denoted by y_i, and n is the total number of data points [56]. The MAPE of the voltages is [value omitted. See PDF.]. Afterward, we validated the ranging module by comparing the intensity values shown in Figure 15. Again, the result shows good agreement between the simulated and measured data. It should be noted that the voltage mismatch translates directly into the intensity discrepancy because the voltages are mapped linearly to the arbitrary integer intensity scale.
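Equation (12) in code form, as a small helper that compares a measured reference array with simulated values; the numbers in the usage line are purely illustrative.

```python
import numpy as np

def mape(measured, simulated):
    """Mean absolute percentage error of simulated values w.r.t. the measured reference."""
    measured = np.asarray(measured, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 100.0 * np.mean(np.abs((measured - simulated) / measured))

# Illustrative example (not data from this work):
print(mape([100.0, 200.0, 300.0], [95.0, 210.0, 290.0]))   # about 4.4 %
```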
6.2. Validation of the Model at the Point Cloud Level
To validate the sensor model at the point cloud level, we performed lab tests under ideal conditions and proving ground measurements under real environmental conditions.
6.2.1. Lab Test
In the next step, the model is analyzed at the point cloud level, using the same test setup as shown in Figure 12. Now, all the reflections from the Lambertian plates during the scan are considered. The performance of the LiDAR sensor model depends significantly on its fidelity level and on the virtual environment modeling, including the target’s surface reflective properties. However, as mentioned earlier, no metrics or KPIs are available to verify the sensor model accuracy at the point cloud level. Therefore, we define three KPIs based on expert knowledge to confirm whether the model is ready for ADAS testing, as follows.
(1) The number of received points from the surface of the simulated and real objects of interest.
(2) The comparison between the mean intensity values of the received reflections from the surface of the simulated and real targets.
(3) The distance error of the point clouds obtained from the actual and virtual objects should not exceed the range accuracy of the real sensor, which is 2 cm in this case.
The number of received points is an important KPI because neural networks and deep learning algorithms are applied to 3D LiDAR point clouds for object recognition, segmentation, and classification. If the number of LiDAR points received from the simulated and measured objects differs, it will influence the performance of the object-recognition algorithms and the ADAS. This KPI depends on the similarity of the simulated and real scan patterns. For this paper, the LiDAR sensor model and the LiDAR sensor use the same scan pattern shown in Figure 6. In the authors’ opinion, this is an important KPI to consider for the accuracy verification of the model.
The intensity values of the received reflections in simulation and actual measurement are also considered for the environment modeling and sensor model verification. If the reflectivity of the modeled object is not the same as in the real world, the mean intensity values and the number of received reflections will not match. Therefore, this KPI is also essential to obtain a realistic output.
Furthermore, the distance error of the point clouds received from the simulated and measured object of interest should not be more than the range accuracy of the real sensor. We have
\Delta d = \left| d_{gt} - \bar{d} \right|, \quad \bar{d} \in \{\bar{d}_{sim}, \bar{d}_{real}\}, (13)
where the ground truth distance is denoted by d_gt, and d̄_sim and d̄_real are the mean distances of the reflections received from the surface of the simulated and the real object of interest, respectively. The ground truth distance is calculated from the sensor’s origin to the target’s center, and it can be written as
d_{gt} = \sqrt{(x_t - x_s)^2 + (y_t - y_s)^2 + (z_t - z_s)^2}, (14)
where the target’s x, y, and z coordinates are denoted by the subscript t and the sensor’s by the subscript s [57]. The OSI ground truth interface osi3::GroundTruth is used to retrieve the sensor origin and the target center position in 3D coordinates. Exemplary 3D point clouds of the LiDAR FMU and the real measurement are given in Figure 16a,b, respectively. Figure 17 shows the simulated and real measured spherical point clouds obtained from the Lambertian plates. The horizontal and vertical spacing of the simulated and real measured point clouds are identical, which shows that the simulated and real scan patterns match.
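The three KPIs can be evaluated with a few lines of array code, as sketched below for two point clouds given as (x, y, z, intensity) rows. The function signature, the toy data, the sensor origin, and the target center are illustrative assumptions, not the evaluation scripts used in this work.

```python
import numpy as np

def kpis(points_sim, points_real, sensor_origin, target_center):
    """points_*: (N, 4) arrays of x, y, z, intensity reflections from the object of interest."""
    # KPI 1: number of received points from the object of interest.
    n_sim, n_real = len(points_sim), len(points_real)

    # KPI 2: mean intensity of the received reflections.
    i_sim, i_real = points_sim[:, 3].mean(), points_real[:, 3].mean()

    # Eq. (14): ground truth distance from the sensor origin to the target center.
    d_gt = np.linalg.norm(np.asarray(target_center) - np.asarray(sensor_origin))

    # Eq. (13): distance error of the mean measured distances w.r.t. the ground truth.
    d_sim = np.linalg.norm(points_sim[:, :3] - sensor_origin, axis=1).mean()
    d_real = np.linalg.norm(points_real[:, :3] - sensor_origin, axis=1).mean()
    return (n_sim, n_real), (i_sim, i_real), (abs(d_gt - d_sim), abs(d_gt - d_real))

# Minimal usage with two fabricated reflections per cloud (illustrative only):
sim = np.array([[10.0, 0.1, 0.0, 900.0], [10.0, -0.1, 0.0, 910.0]])
real = np.array([[10.01, 0.1, 0.0, 880.0], [9.99, -0.1, 0.0, 905.0]])
origin = np.array([0.0, 0.0, 0.0])
center = np.array([10.0, 0.0, 0.0])
print(kpis(sim, real, sensor_origin=origin, target_center=center))
```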
Figure 18 compares the number of received points and the mean intensity values. The number of received points and the mean intensity from the object of interest in the simulation and the real measurements show good agreement. However, a slight mismatch between these quantities can be observed. A possible reason for the deviation is the ambient light, which is not constant over the entire measurement field. We used the MAPE metric given in Equation (12) to quantify the difference between the numbers of received points and the mean intensities. The results are given in Table 2.
The distance error is shown in Figure 19, and it is below the range accuracy of the real sensor, which is 2 cm in this case. These results confirm that the scan pattern and the reflective properties of the modeled object match those of the real sensor and object, and that the sensor model provides realistic output.
6.2.2. Proving Ground Tests
We conducted static tests at the FAKT-motion GmbH proving ground in Benningen to record real data in daylight. The intensity of the daylight was 8 klux. A 10% reflective Lambertian plate was placed at distances of 5 m, 10 m, 15 m, 20 m, 25 m, 30 m, 35 m, and 40 m in front of the Cube 1 mounted on the test vehicle, as shown in Figure 20. The scan pattern used for this measurement campaign is shown in Figure 21.
Figure 22 compares the number of received points for the real and simulated object of interest. It should be noted that the state-of-the-art sensor model provided by CarMaker also uses the same scan pattern, as shown in Figure 21. The real sensor and the LiDAR FMU model can detect the object of interest up to 30 m. However, the exact dimensions of the target cannot be estimated from the real and LiDAR FMU point clouds, as shown in Figure 23. This is because the background noise due to sunlight, the detector shot noise, and the thermal noise of the electronic circuitry mask the weak reflections received from the Lambertian target. Consequently, the peak detection algorithms of the real sensor and the LiDAR FMU model cannot distinguish between noise and target peaks. It should be noted that the LiDAR FMU model uses the same peak detection algorithm and detection thresholds as the Cube 1. In contrast, the state-of-the-art sensor model provided by the commercial software can detect the target up to 40 m, and the dimensions of the target can easily be estimated from the yielded point clouds. This is the case because the LiDAR models provided by commercial software are generic and parameterizable. These sensor models do not consider the complete signal processing toolchain of a specific LiDAR hardware and the related sensor imperfections. Instead, they allow the user to integrate the sensor-specific scan pattern, obtain ideal point clouds, and apply the signal processing steps and imperfections of the LiDAR sensor in post-processing to get simulation results close to the specific LiDAR sensor.
The authors believe the sensor models provided by commercial and open-source tool vendors can be used for ADAS testing that requires ideal or medium-fidelity point clouds. However, in use cases where a high-fidelity LiDAR model output is required, the scan pattern, the complete signal processing toolchain, and the sensor-specific imperfections of the real LiDAR sensor, as mentioned in Section 1, need to be considered.
Figure 24 compares the mean intensity of the Cube 1 and the LiDAR FMU model. The mean intensity values received from the object of interest in simulation and real measurement are similar. We used Equation (12) to quantify the difference between the simulated and real measured values. The MAPE for the mean intensity is 11.1%, which is greater than the MAPE of the lab test mean intensity values given in Section 6.2.1. Although we have modeled the daylight intensity in the LiDAR FMU model, it remains challenging to model the environmental conditions completely in the simulation; the increase in the MAPE for the mean intensity values is attributed to environmental losses. Furthermore, it is not possible to compare the state-of-the-art LiDAR sensor model intensity with the real measurement because the signal processing steps used to calculate the intensity are not the same, and hence their units differ: the real measured point cloud intensity is in arbitrary units (a.u.), whereas the state-of-the-art sensor model intensity is in Watts. Figure 25 shows the comparison of the distance error of the real and virtual sensors. The distance error of the state-of-the-art LiDAR model is less than 1 cm because this sensor model provides ideal point clouds, and its mean distance is closer to the ground truth distance d_gt. In contrast, the real LiDAR sensor and LiDAR FMU point clouds are noisy and dispersed, which is why their distance error is more than 1 cm.
7. Conclusions
In this work, we have introduced a process to develop a tool-independent, high-fidelity LiDAR sensor model by using the FMI and OSI standardized interfaces. The model was integrated successfully into the virtual environments of CarMaker and AURELION from dSPACE to show its exchangeability. Moreover, the LiDAR FMU model provides the same results regardless of the tool used. The developed LiDAR sensor model includes the complete signal processing steps of the real LiDAR sensor and considers the sensor-specific imperfections, including optical losses, inherent detector effects, effects generated by the electrical amplification, and noise produced by sunlight, to output realistic data. The virtual LiDAR sensor outputs time domain and point cloud data. The comparison of real and simulated time domain results shows that the peak shape, noise, and amplitude levels of the simulated and measured signals match well, and the remaining amplitude deviation is quantified with the MAPE metric defined in Equation (12). Furthermore, KPIs are defined to assess the simulated and measured point clouds. The presented lab test results demonstrate a MAPE of 8.5% for the number of points and 9.3% for the mean intensity values obtained from the simulated and real Lambertian plates. Moreover, the distance error is below 2 cm. In addition, static tests were performed at the proving ground on a sunny day to record real data. The real measurement results are compared with the state-of-the-art LiDAR model provided by commercial software and with the proposed LiDAR model to show the fidelity of the presented model. The results show that although the state-of-the-art LiDAR model uses a similar scan pattern to the real sensor, it cannot reproduce the same results as the real sensor, because it does not include the complete signal processing steps of the real sensor and the related imperfections. Such models are useful for testing ADAS in use cases where low- and medium-fidelity LiDAR point clouds are sufficient. The scan pattern, the complete signal processing steps, and the sensor-specific imperfections must be considered when high-fidelity output is required. It is also concluded that the material properties and reflectivity of the modeled and actual object of interest should be the same; otherwise, simulation and actual results will not match.
8. Outlook
In the next steps, the model will be further validated as per the ASTM E3125-17 standard. Moreover, the model fidelity will be assessed using different state-of-the-art metrics. Furthermore, the effects of rain and fog on the performance of automotive LiDAR sensors will be modeled and validated.
Author Contributions: Conceptualization, A.H.; methodology, A.H.; software, A.H., M.P., M.H.K., M.F. and M.S.; validation, A.H.; formal analysis, A.H., M.P., M.H.K. and T.Z.; data curation, A.H. and M.H.K.; writing—original draft preparation, A.H.; writing—review and editing, A.H., M.P., M.H.K., M.F., M.S., Y.C., L.H., T.Z., T.P., M.J. and A.W.K.; visualization, A.H.; supervision, A.W.K. and T.Z.; project administration, T.Z. All authors have read and agreed to the published version of the manuscript.
Acknowledgments: We would like to thank Abdulkadir Eryildirim for reviewing this manuscript.
Conflicts of Interest: The authors declare no conflict of interest.
The following abbreviations are used in this manuscript:
ADAS | Advanced Driver-Assistance System |
LiDAR | Light Detection And Ranging |
RADAR | Radio Detection And Ranging |
FMU | Functional Mock-Up Unit |
OSI | Open Simulation Interface |
FMI | Functional Mock-Up Interface |
MAPE | Mean Absolute Percentage Error |
ABS | Anti-Lock Braking System |
ACC | Adaptive Cruise Control |
ESC | Electronic Stability Control |
LDW | Lane Departure Warning |
PA | Parking Assistant |
TSR | Traffic-Sign Recognition |
MiL | Model-in-the-Loop |
HiL | Hardware-in-the-Loop |
SiL | Software-in-the-Loop |
IP | Intellectual Property |
KPIs | Key Performance Indicators |
OEMs | Original Equipment Manufacturers |
FoV | Field of View |
RTDT | Round-Trip Delay Time |
ToF | Time of Flight |
HAD | Highly Automated Driving |
RSI | Raw Signal Interface |
OSMP | OSI Sensor Model Packaging |
SNR | Signal-to-Noise Ratio |
FSPL | Free Space Path Losses |
BRDF | Bidirectional Reflectance Distribution Function |
SiPM | Silicon Photomultipliers |
APD | Avalanche Photodiode |
SPAD | Single-Photon Avalanche diode |
FX Engine | Effect Engine |
MEMS | Microelectromechanical System |
IDFT | Inverse discrete Fourier Transform |
DFT | Discrete Fourier Transform |
TDS | Time Domain Signals |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Figure 2. Decrease in road fatalities in Germany due to the advances of ADAS, despite the increase in the number of motor vehicles. Source: adapted from [1,3]. ABS, anti-lock braking system; ACC, adaptive cruise control; ESC, electronic stability control; LDW, lane departure warning; PA, parking assistant; TSR, traffic sign recognition.
Figure 3. LiDAR working principle. LiDAR sensor mounted on ego vehicle simultaneously sends and receives the laser light partly reflected from the surface of the target and measures the distance.
Figure 6. Specification of scan pattern used by LiDAR FMU model and LiDAR sensor: [Forumla omitted. See PDF.] horizontal and [Forumla omitted. See PDF.] vertical FoV, 80 scan lines, frame mode only up, [Forumla omitted. See PDF.] horizontal angle spacing, frame rate [Forumla omitted. See PDF.] Hz, maximum detection range is 250 m, and minimum detection range is [Forumla omitted. See PDF.] m.
Figure 8. The output of the implemented link budget module for 5% reflective point scatter targets.
Figure 9. The output of the implemented SiPM detector module for 5% reflective point scatter targets.
Figure 10. The output of the circuit module for 5% reflective point scatter targets.
Figure 11. The output of the ranging module for 5% reflective point scatter targets.
Figure 12. (a) Static simulation scene to validate the time domain and point cloud data. (b) Real setup to validate the time domain and point cloud data. The 10% reflective target was placed in front of the sensor at different distances. The coordinates of the actual and simulated sensor and target are the same. The ground truth distance [Forumla omitted. See PDF.] is calculated from the sensor origin to the target center.
Figure 13. LiDAR FMU and real measured TDS comparison. The target peaks and noise levels match well. Furthermore, the LiDAR FMU model provides the same results in AURELION from dSPACE for the time domain signals. We used the osi3::GroundTruth interface to get the target’s position in the virtual environment.
Figure 14. The voltage difference [Forumla omitted. See PDF.] between the simulated and measured target peaks.
Figure 15. The validation of the ranging module. The simulated and measured intensity values show good agreement.
Figure 16. Exemplary visualization of the Cartesian point clouds received from all the objects in the FoV of LiDAR FMU and real sensor. (a) The LiDAR FMU 3D Cartesian point clouds. (b) The 3D point clouds of real sensors. It should be noted that the modeled and actual objects’ material properties are different except for the Lambertian target. That’s why the number of points [Forumla omitted. See PDF.] received from the ground and walls are different in simulation and actual measurement.
Figure 17. Visualization in spherical coordinates of points obtained from the actual and simulated Lambertian plate placed at 15 m. The horizontal spacing between the simulated and measured points is [Forumla omitted. See PDF.] and vertical spacing is [Forumla omitted. See PDF.].
Figure 18. (a) The number of received points from the object of interest in simulation and real measurement is approximately the same at all distance values. However, a slight mismatch in the number of reflections can be observed because it is impossible to replicate the 100% real-world conditions in the simulation, for instance, ambient light. (b) The mean intensity [Forumla omitted. See PDF.] values show good agreement. It can also be observed that the standard deviation of real measured intensity values is higher than the simulated intensity values because the ambient light condition influences the real measured intensity values.
Figure 19. The distance error is below the range accuracy of the real sensor, that is, ±2 cm.
Figure 20. (a) The test vehicle was equipped with a LiDAR sensor and global positioning system (GPS). The GPS ADMA-G-PRO+ from Genesys Inc. is used as the reference sensor with a range accuracy of [Forumla omitted. See PDF.] m. The size of the [Forumla omitted. See PDF.] reflective plate is [Forumla omitted. See PDF.] × [Forumla omitted. See PDF.]. (b) The static simulation scene. The ground truth distance [Forumla omitted. See PDF.] is calculated from the sensor reference point to the center of the plate target in simulation and real measurement by using Equation (14).
Figure 21. Specification of scan pattern used by the real and virtual LiDAR sensors for proving ground tests: [Forumla omitted. See PDF.] horizontal and [Forumla omitted. See PDF.] vertical FoV, 40 scan lines, frame mode only up, [Forumla omitted. See PDF.] horizontal angle spacing, frame rate [Forumla omitted. See PDF.] Hz, maximum detection range is 250 m, and minimum detection range is [Forumla omitted. See PDF.] m.
Figure 22. The comparison between the number of received points [Forumla omitted. See PDF.] obtained from the simulated and real 10% Lambertian plate. The actual measured and LiDAR FMU received point cloud [Forumla omitted. See PDF.] are similar. However, the number of points [Forumla omitted. See PDF.] yielded by the state-of-the-art LiDAR sensor model are higher. The [Forumla omitted. See PDF.] for the number of received [Forumla omitted. See PDF.] of LiDAR FMU is [Forumla omitted. See PDF.] and [Forumla omitted. See PDF.] for the state-of-the-art LiDAR sensor model up to 30 m.
Figure 23. The exemplary point clouds provided by real and virtual LiDAR sensor models for [Forumla omitted. See PDF.] m × [Forumla omitted. See PDF.] m Lambertian plate placed at 30 m. (a) Real measured point clouds (b) LiDAR FMU point clouds (c) State-of-the-art LiDAR sensor model. The real and LiDAR FMU points are noisy and dispersed. However, the state-of-the-art LiDAR model points are ideal and aligned.
Figure 24. The comparison of measured and simulated mean intensity [Forumla omitted. See PDF.] values. The mean intensity [Forumla omitted. See PDF.] values show good agreement. The [Forumla omitted. See PDF.] for the mean intensity [Forumla omitted. See PDF.] is [Forumla omitted. See PDF.]. It should be noted that comparing intensity values of the real measurement and state-of-the-art sensor model is impossible because their units are different.
Figure 25. The distance error [Forumla omitted. See PDF.] of real and virtual sensors is below the range accuracy of the real sensor ±2 cm.
Table 1. Overview of the state-of-the-art LiDAR sensor models: working principles, covered effects, and validation approaches.

Authors | Model Type | Input of the Model | Output of the Model | Covered Effects | Validation Approach
---|---|---|---|---|---
Hanke et al. [23] | Ideal/low fidelity | Object list | Object list | FoV and object occlusion | N/A
Stolz & Nestlinger [24] | Ideal/low fidelity | Object list | Object list | FoV and object occlusion | N/A
Muckenhuber et al. [26] | Phenomenological/low fidelity | Object list | Object list | FoV, object class definition, occlusion, probability of false positive and false negative detections | Simulation results
Linnhoff et al. [27] | Phenomenological/low fidelity | Object list | Object list | Partial occlusion of objects, limited angular view, reduced effective range due to atmospheric attenuation | Simulation results
Hirsenkorn et al. [25] | Phenomenological/low fidelity | Object list | Object list | Ranging errors, latency, false positives, occlusion | Simulation results
Zhao et al. [29] | Phenomenological/low fidelity | Object list | Object list or point clouds | Occlusion, FoV, and beam divergence | Simulation results
Li et al. [28] | Physical/low fidelity | Object list | Object list or point clouds | Occlusion, FoV, and beam divergence | Simulation results
Philipp et al. [31] | Physical/medium fidelity | Ray casting | Point clouds | Beam divergence, SNR, detection threshold, material surface reflection properties | Qualitative comparison with real measurements
Gschwandtner et al. [32] | Physical/medium fidelity | Ray casting | Point clouds | Sensor noise, materials’ physical properties, FSPL | Simulation results
Goodin et al. [33] | Physical/medium fidelity | Ray casting | Point clouds | Beam divergence and a Gaussian beam profile | Simulation results
Bechtold & Höfle [34] | Physical/medium fidelity | Ray casting | Point clouds | Beam divergence, atmospheric attenuation, scanner efficiency, material surface properties | Simulation results
Hanke et al. [35] | Physical/medium fidelity | Ray tracing | Point clouds | Beam divergence, material reflection properties, detection threshold, noise effects, atmospheric attenuation | Qualitative comparison with real measurements
Li et al. [28] | Physical/medium fidelity | Ray tracing | Point clouds | Beam divergence, power loss due to rain, fog, snow, and haze | Simulation results
Zhao et al. [29] | Physical/medium fidelity | Ray tracing | Point clouds | False alarms due to backscattering from water droplets | Qualitative comparison with real measurements
CARLA [37] | Physical/medium fidelity | Ray casting | Point clouds | Signal attenuation, noise | N/A
CarMaker [20] | Physical/medium fidelity | Ray tracing | Point clouds | Noise, drop-off in intensity, material surface properties | N/A
DYNA4 [38] | Physical/medium fidelity | Ray casting | Point clouds | Physical effects, material surface reflectivity, ray angle of incidence | N/A
VTD [14,35] | Physical/medium fidelity | Ray tracing | Point clouds | Material properties | N/A
AURELION [39] | Physical/medium fidelity | Ray tracing | Point clouds | Material surface reflectivity, noise, atmospheric attenuation, fast-motion scan effect | N/A
Haider et al. (this work) | Physical/high fidelity | Ray tracing | Time domain signals and point clouds | Material surface reflectivity, optical losses, inherent detector effects, electrical amplification effects, sunlight noise | Qualitative comparison and KPI-based validation against real measurements at the time domain and point cloud levels
Table 2. The MAPE of the number of received points and the mean intensity values for the lab test.

Parameter | MAPE
---|---
Number of received points | 8.5%
Mean intensity | 9.3%
References
1. KBA. Bestand Nach Fahrzeugklassen und Aufbauarten. Available online: https://www.kba.de/DE/Statistik/Fahrzeuge/Bestand/FahrzeugklassenAufbauarten/2021/b_fzkl_zeitreihen.html?nn=3524712&fromStatistic=3524712&yearFilter=2021&fromStatistic=3524712&yearFilter=2021 (accessed on 15 April 2022).
2. Synopsys. What is ADAS?. Available online: https://www.synopsys.com/automotive/what-is-adas.html (accessed on 26 August 2021).
3. Thomas, W. Safety benefits of automated vehicles: Extended findings from accident research for development, validation and testing. Autonomous Driving; Springer: Berlin/Heidelberg, Germany, 2016; pp. 335-364.
4. Kalra, N.; Paddock, S.M. Driving to safety: How many miles of driving would it take to demonstrate autonomous vehicle reliability?. Transp. Res. Part A Policy Pract.; 2016; 94, pp. 182-193. [DOI: https://dx.doi.org/10.1016/j.tra.2016.09.010]
5. Winner, H.; Hakuli, S.; Lotz, F.; Singer, C. Handbook of Driver Assistance Systems; Springer International Publishing: Amsterdam, The Netherlands, 2016; pp. 405-430.
6. VIVID Virtual Validation Methodology for Intelligent Driving Systems. Available online: https://www.safecad-vivid.net/ (accessed on 1 June 2022).
7. DIVP Driving Intelligence Validation Platform. Available online: https://divp.net/ (accessed on 24 May 2022).
8. VVM Verification Validation Methods. Available online: https://www.vvm-projekt.de/en/project (accessed on 24 May 2022).
9. SET Level. Available online: https://setlevel.de/en (accessed on 24 May 2022).
10. Kochhar, N. A Digital Twin for Holistic Autonomous Vehicle Development. ATZelectron. Worldw.; 2021; 16, pp. 8-13. [DOI: https://dx.doi.org/10.1007/s38314-020-0579-2]
11. Blochwitz, T. Functional Mock-Up Interface for Model Exchange and Co-Simulation. 2016; Available online: https://fmi-standard.org/downloads/ (accessed on 20 March 2021).
12. ASAM e.V. ASAM OSI. Available online: https://www.asam.net/standards/detail/osi/ (accessed on 13 September 2022).
13. Schneider, S.-A.; Saad, K. Camera behavioral model and testbed setups for image-based ADAS functions. Elektrotech. Inf.; 2018; 135, pp. 328-334. [DOI: https://dx.doi.org/10.1007/s00502-018-0622-7]
14. Rosenberger, P.; Holder, M.; Huch, S.; Winner, H.; Fleck, T.; Zofka, M.R.; Zöllner, J.M.; D’hondt, T.; Wassermann, B. Benchmarking and Functional Decomposition of Automotive Lidar Sensor Models. Proceedings of the 2019 IEEE Intelligent Vehicles Symposium (IV); Paris, France, 9–12 June 2017; pp. 632-639.
15. IPG Automotive GmbH. CarMaker 10.0.2. Available online: https://ipg-automotive.com/en/products-solutions/software/carmaker/ (accessed on 12 March 2022).
16. dSPACE GmbH. AURELION 22.1. Available online: https://www.dspace.com/en/inc/home/news/aurelion_new-version_22-1.cfm (accessed on 12 March 2022).
17. Roriz, R.; Cabral, J.; Gomes, T. Automotive LiDAR Technology: A Survey. IEEE Trans. Intell. Transp. Syst.; 2021; 23, pp. 6282-6297. [DOI: https://dx.doi.org/10.1109/TITS.2021.3086804]
18. Fersch, T.; Buhmann, A.; Weigel, R. The influence of rain on small aperture LiDAR sensors. Proceedings of the 2016 German Microwave Conference (GeMiC); Bochum, Germany, 14–16 March 2016; pp. 84-87.
19. McManamon, P.F. Field Guide to Lidar; SPIE Press: Bellingham, WA, USA, 2015.
20. Ahn, N.; Höfer, A.; Herrmann, M.; Donn, C. Real-time Simulation of Physical Multi-sensor Setups. ATZelectron. Worldw.; 2020; 15, pp. 8-11. [DOI: https://dx.doi.org/10.1007/s38314-020-0207-1]
21. Neuwirthová, E.; Kuusk, A.; Lhotáková, Z.; Kuusk, J.; Albrechtová, J.; Hallik, L. Leaf Age Matters in Remote Sensing: Taking Ground Truth for Spectroscopic Studies in Hemiboreal Deciduous Trees with Continuous Leaf Formation. Remote Sens.; 2021; 13, 1353. [DOI: https://dx.doi.org/10.3390/rs13071353]
22. Feilhauer, M.; Häring, J. A real-time capable multi-sensor model to validate ADAS in a virtual environment. Fahrerassistenzsysteme; Springer Vieweg: Wiesbaden, Germany, 2017; pp. 227-256.
23. Hanke, T.; Hirsenkorn, N.; Dehlink, B.; Rauch, A.; Rasshofer, R.; Biebl, E. Generic architecture for simulation of ADAS sensors. Proceedings of the 16th International Radar Symposium (IRS); Dresden, Germany, 24–26 June 2015; pp. 125-130.
24. Stolz, M.; Nestlinger, G. Fast generic sensor models for testing highly automated vehicles in simulation. Elektrotech. Inf.; 2018; 135, pp. 365-369. [DOI: https://dx.doi.org/10.1007/s00502-018-0629-0]
25. Hirsenkorn, N.; Hanke, T.; Rauch, A.; Dehlink, B.; Rasshofer, R.; Biebl, E. A non-parametric approach for modeling sensor behavior. Proceedings of the 16th International Radar Symposium (IRS); Dresden, Germany, 24–26 June 2015; pp. 131-136.
26. Muckenhuber, S.; Holzer, H.; Rubsam, J.; Stettinger, G. Object-based sensor model for virtual testing of ADAS/AD functions. Proceedings of the IEEE International Conference on Connected Vehicles and Expo (ICCVE); Graz, Austria, 4–8 November 2019; pp. 1-6.
27. Linnhoff, C.; Rosenberger, P.; Winner, H. Refining Object-Based Lidar Sensor Modeling—Challenging Ray Tracing as the Magic Bullet. IEEE Sens. J.; 2021; 21, pp. 24238-24245. [DOI: https://dx.doi.org/10.1109/JSEN.2021.3115589]
28. Zhao, J.; Li, Y.; Zhu, B.; Deng, W.; Sun, B. Method and Applications of Lidar Modeling for Virtual Testing of Intelligent Vehicles. IEEE Trans. Intell. Transp. Syst.; 2021; 22, pp. 2990-3000. [DOI: https://dx.doi.org/10.1109/TITS.2020.2978438]
29. Li, Y.; Wang, Y.; Deng, W.; Li, X.; Jiang, L. LiDAR Sensor Modeling for ADAS Applications under a Virtual Driving Environment; SAE Technical Paper SAE International: Warrendale, PA, USA, 2016.
30. Schaefer, A.; Luft, L.; Burgard, W. An Analytical Lidar Sensor Model Based on Ray Path Information. IEEE Robot. Autom. Lett.; 2017; 2, pp. 1405-1412. [DOI: https://dx.doi.org/10.1109/LRA.2017.2669376]
31. Rosenberger, P.; Holder, M.F.; Cianciaruso, N.; Aust, P.; Tamm-Morschel, J.F.; Linnhoff, C.; Winner, H. Sequential lidar sensor system simulation: A modular approach for simulation-based safety validation of automated driving. Automot. Engine Technol.; 2020; 5, pp. 187-197. [DOI: https://dx.doi.org/10.1007/s41104-020-00066-x]
32. Gschwandtner, M.; Kwitt, R.; Uhl, A.; Pree, W. Blensor: Blender Sensor Simulation Toolbox. Proceedings of the 7th International Symposium on Visual Computing; Las Vegas, NV, USA, 26–28 September 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 199-208.
33. Goodin, C.; Kala, R.; Carrrillo, A.; Liu, L.Y. Sensor modeling for the Virtual Autonomous Navigation Environment. Proceedings of the Sensors IEEE; Christchurch, New Zealand, 25–28 October 2009; pp. 1588-1592.
34. Bechtold, S.; Höfle, B. HELIOS: A multi-purpose LIDAR simulation framework for research planning and training of laser scanning operations with airborne ground-based mobile and stationary platforms. ISPRS Ann. Photogramm. Remote. Sens. Spat. Inf. Sci.; 2016; 3, pp. 161-168. [DOI: https://dx.doi.org/10.5194/isprs-annals-III-3-161-2016]
35. Hanke, T.; Schaermann, A.; Geiger, M.; Weiler, K.; Hirsenkorn, N.; Rauch, A.; Schneider, S.A.; Biebl, E. Generation and validation of virtual point cloud data for automated driving systems. Proceedings of the 2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC); Yokohama, Japan, 16–19 October 2017; pp. 1-6.
36. Goodin, C.; Carruth, D.; Doude, M.; Hudson, C. Predicting the influence of rain on LIDAR in ADAS. Electronics; 2019; 8, 89. [DOI: https://dx.doi.org/10.3390/electronics8010089]
37. Dosovitskiy, A.; Ros, G.; Codevilla, F.; Lopez, A.; Koltun, V. CARLA: An open urban driving simulator. Proceedings of the Conference on Robot Learning; Mountain View, CA, USA, 13–15 November 2017; pp. 1-16.
38. Vector DYNA4 Sensor Simulation: Environment Perception for ADAS and AD. 2022; Available online: https://www.vector.com/int/en/products/products-a-z/software/dyna4/sensor-simulation/ (accessed on 16 January 2022).
39. dSPACE AURELION Lidar Model: Realistic Simulation of Lidar Sensors. 2022; Available online: https://www.dspace.com/en/pub/home/products/sw/experimentandvisualization/aurelion_sensor-realistic_sim/aurelion_lidar.cfm#175_60627 (accessed on 16 January 2022).
40. Roth, E.; Dirndorfer, T.; Neumann-Cosel, K.V.; Fischer, M.O.; Ganslmeier, T.; Kern, A.; Knoll, A. Analysis and Validation of Perception Sensor Models in an Integrated Vehicle and Environment Simulation. Proceedings of the 22nd International Technical Conference on the Enhanced Safety of Vehicles (ESV); Washington, DC, USA, 13–16 June 2011.
41. Gomes, C.; Thule, C.; Broman, D.; Larsen, P.G.; Vangheluwe, H. Co-simulation: State of the art. arXiv; 2017; arXiv: 1702.00686
42. Blochwitz, T.; Otter, M.; Arnold, M.; Bausch, C.; Clauß, C.; Elmqvist, H.; Junghanns, A.; Mauss, J.; Monteiro, M.; Neidhold, T. et al. The Functional Mockup Interface for Tool independent Exchange of Simulation Models. Proceedings of the 8th International Modelica Conference 2011; Dresden, Germany, 20–22 March 2011; pp. 173-184.
43. Van Driesten, C.; Schaller, T. Overall approach to standardize AD sensor interfaces: Simulation and real vehicle. Fahrerassistenzsysteme 2018; Springer Vieweg: Wiesbaden, Germany, 2019; pp. 47-55.
44. ASAM e.V. ASAM OSI Sensor Model Packaging Specification 2022. Available online: https://opensimulationinterface.github.io/osi-documentation/#_osi_sensor_model_packaging (accessed on 7 June 2021).
45. ASAM e.V. ASAM Open Simulation Interface (OSI) 2022. Available online: https://opensimulationinterface.github.io/open-simulation-interface/index.html (accessed on 30 June 2022).
46. IPG CarMaker. Reference Manual Version 9.0.1; IPG Automotive GmbH: Karlsruhe, Germany, 2021.
47. Fink, M.; Schardt, M.; Baier, V.; Wang, K.; Jakobi, M.; Koch, A.W. Full-Waveform Modeling for Time-of-Flight Measurements based on Arrival Time of Photons. arXiv; 2022; arXiv: 2208.03426
48. Blickfeld Scan Pattern. Available online: https://docs.blickfeld.com/cube/latest/scan_pattern.html (accessed on 7 July 2022).
49. Petit, F. Myths about LiDAR Sensor Debunked. Available online: https://www.blickfeld.com/de/blog/mit-den-lidar-mythen-aufgeraeumt-teil-1/ (accessed on 5 July 2022).
50. Fersch, T.; Weigel, R.; Koelpin, A. Challenges in miniaturized automotive long-range lidar system design. Proceedings of the Three-Dimensional Imaging, Visualization, and Display; Orlando, FL, USA, 10 May 2017; SPIE: Bellingham, WA, USA, 2017; pp. 160-171.
51. National Renewable Energy Laboratory. Reference Air Mass 1.5 Spectra: ASTM G-173. Available online: https://www.nrel.gov/grid/solar-resource/spectra-am1.5.html (accessed on 26 February 2022).
52. French, A.; Taylor, E. An Introduction to Quantum Physics; Norton: New York, NY, USA, 1978.
53. Fox, A.M. Quantum Optics: An Introduction; Oxford Master Series in Physics Atomic, Optical, and Laser Physics Oxford University Press: New York, NY, USA, 2007; ISBN 978-0-19-856673-1
54. Pasquinelli, K.; Lussana, R.; Tisa, S.; Villa, F.; Zappa, F. Single-Photon Detectors Modeling and Selection Criteria for High-Background LiDAR. IEEE Sens. J.; 2020; 20, pp. 7021-7032. [DOI: https://dx.doi.org/10.1109/JSEN.2020.2977775]
55. Bretz, T.; Hebbeker, T.; Kemp, J. Extending the dynamic range of SiPMs by understanding their non-linear behavior. arXiv; 2010; arXiv: 2010.14886
56. Swamidass, P.M. Mean Absolute Percentage Error (MAPE). Encyclopedia of Production and Manufacturing Management; Springer: Boston, MA, USA, 2000; 462.
57. Lang, S.; Murrow, G. The Distance Formula. Geometry; Springer: New York, NY, USA, 1988; pp. 110-122.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
This work introduces a process to develop a tool-independent, high-fidelity, ray tracing-based light detection and ranging (LiDAR) model. This virtual LiDAR sensor includes accurate modeling of the scan pattern and a complete signal processing toolchain of a LiDAR sensor. It is developed as a functional mock-up unit (FMU) by using the standardized open simulation interface (OSI) 3.0.2, and functional mock-up interface (FMI) 2.0. Subsequently, it was integrated into two commercial software virtual environment frameworks to demonstrate its exchangeability. Furthermore, the accuracy of the LiDAR sensor model is validated by comparing the simulation and real measurement data on the time domain and on the point cloud level. The validation results show that the mean absolute percentage error
1 IFM—Institute for Advanced Driver Assistance Systems and Connected Mobility, Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany; Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstr. 90, 80333 Munich, Germany
2 Institute for Measurement Systems and Sensor Technology, Technical University of Munich, Theresienstr. 90, 80333 Munich, Germany
3 Blickfeld GmbH, Barthstr. 12, 80339 Munich, Germany
4 IPG Automotive GmbH, Bannwaldallee 60, 76185 Karlsruhe, Germany
5 IFM—Institute for Advanced Driver Assistance Systems and Connected Mobility, Kempten University of Applied Sciences, Junkersstrasse 1A, 87734 Benningen, Germany