Intelligent Vehicles (IV) Sensors - WMG - Projects
ROADVIEW
The WMG IV Sensors group is part of the ROADVIEW (Robust Automated Driving in Extreme Weather) project, a consortium of 15 members spread across Europe, funded by the EU Horizon Europe Innovation Action. ROADVIEW aims to develop robust and cost-efficient in-vehicle perception and decision-making systems for connected and automated vehicles, with enhanced performance under harsh weather conditions and in different traffic scenarios. Harsh weather remains a severe technological barrier for automated vehicles. More information can be found on the ROADVIEW project website.
The IV Sensors group is leading a work package on validated sensor noise models for synthetic environments and X-in-the-Loop (X-i-L) testing. Furthermore, the group is developing and validating physics-based weather models for perception sensors and continuing its work on identifying noise and compounding noise factors. Finally, the group is also co-leading the X-i-L work package, supporting the integration of the developed models for X-in-the-Loop testing.
2nd General Assembly in Helsinki, Finland.
Automotive Sensor Data Compression
Smart Compression of Automotive Environmental Perception Sensors Data and Implications on Perception
Automated vehicles are on the verge of changing our lives; first and foremost, they can dramatically reduce the number of car accidents and the associated human and economic losses.
To master driving, automated vehicles need to:
- Know where they are
- Sense the surrounding environment (road users, pavements, signs, etc.)
- Make swift decisions and plan a trajectory
- Move safely on the roads.
Sensing the environment requires the vehicles to deploy several different sensors, each collecting a large quantity of data. Furthermore, these data must be moved promptly to the decision units, and the larger the data quantity, the more difficult and costly it is to transmit.
Estimated number and data rate of perception sensors for an automated vehicle
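As a rough, back-of-the-envelope illustration (the figures below are our own assumptions, not values from the estimate above), a sensor's raw data rate follows directly from its resolution, bit depth and frame rate:

```python
# Back-of-the-envelope data rates for typical perception sensors.
# All figures are illustrative assumptions, not measured values.

def camera_rate_gbps(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) camera output in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

def lidar_rate_gbps(points_per_second, bits_per_point):
    """Raw LiDAR output in gigabits per second."""
    return points_per_second * bits_per_point / 1e9

# e.g. a 1080p automotive camera with 12-bit RAW output at 30 fps
print(f"camera: {camera_rate_gbps(1920, 1080, 12, 30):.2f} Gb/s")  # ~0.75 Gb/s
# e.g. a spinning LiDAR producing ~1.3M points/s at ~48 bits per point
print(f"lidar:  {lidar_rate_gbps(1.3e6, 48):.3f} Gb/s")            # ~0.06 Gb/s
```

Multiplying such rates across the full sensor suite shows why compression before transmission to the decision units is attractive.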
This project aims to strengthen the collaboration between WMG and a key automotive sensor company, onsemi, enhancing WMG’s ethos of producing research outputs readily applicable in the industry.
Flow diagrams schematically representing: a) example of the camera data flow in a vehicle, b) the methodology proposed in our experiments
The IV Sensors group has investigated the effect of compression on neural-network-based object detection. The results show that, up to a certain level of compression, the performance of these networks is not detrimentally affected; compression can even improve performance when compressed images are used for training. These results have been presented at conferences [LIM][RAAI] and published in IEEE Transactions on Intelligent Transportation Systems [T-ITS].
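A minimal sketch of this type of experiment, assuming OpenCV for the JPEG round-trip; `run_detector` stands in for whichever object detection network is under test and is hypothetical:

```python
import cv2

def compress_decompress(frame_bgr, jpeg_quality):
    """Round-trip a frame through JPEG at the given quality (0-100)."""
    ok, buf = cv2.imencode(".jpg", frame_bgr, [cv2.IMWRITE_JPEG_QUALITY, jpeg_quality])
    assert ok, "encoding failed"
    return cv2.imdecode(buf, cv2.IMREAD_COLOR), buf.nbytes

frame = cv2.imread("frame.png")  # any test image (path is a placeholder)
for q in (95, 75, 50, 25, 10):
    decoded, nbytes = compress_decompress(frame, q)
    # run_detector(decoded) would go here: compare detections on `decoded`
    # against those on the uncompressed `frame` (run_detector is hypothetical).
    print(f"quality={q}: {nbytes} bytes")
```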
Currently, the group has two related ongoing projects: one on optimising automotive camera data compression and another on data reduction for LiDAR point clouds.
Sim4CAMSens
The Centre for Connected and Autonomous Vehicles (CCAV) has awarded a £2 million grant to the Sim4CAMSens project, in which the IV Sensors group will play a key role through its expertise in perception sensor noise factors. The project, part of the Commercialising CAM Supply Chain Competition, is led by Claytex together with a group of industry-leading partners.
The consortium, consisting of Claytex, rFpro, Syselek, Oxford RF, WMG, National Physical Laboratory, Compound Semiconductor Applications Catapult, and AESIN, aims to enable accurate representation of Automated Driving Systems (ADS) sensors in simulation. It will develop a sensor evaluation framework that spans modelling, simulation, and physical testing. This project will involve the creation of new sensor models, improved noise models, new material models, and new test methods to allow ADS and sensor developers to accelerate their development.
An example of simulated camera data with ground truth - provided by rFpro
Prof Donzella, head of the IV Sensors group, says: "The Sim4CAMSens project is an extremely exciting opportunity for WMG to work with UK-based, world-leading industrial and research partners to speed up the development of sensor models and testing methodologies, which are key for the future safe deployment of robust and reliable ADS. The success of this project can bring the UK to lead the way globally in this field."
Visual Representation of the project outcomes
Sim4CAMSens is part of CCAV’s Commercialising CAM Supply Chain Competition (CCAMSC).
The Commercialising CAM programme is funded by the Centre for Connected and Autonomous Vehicles (CCAV), a joint unit between the Department for Business and Trade (DBT) and the Department for Transport (DfT), and delivered in partnership with Innovate UK and Zenzic.
The CCAM Supply Chain competition was launched in October 2022 to support the delivery of early commercialisable Connected and Automated Mobility technologies, products and services, and is part of the Government's vision for self-driving vehicles, set out in Connected and Automated Mobility 2025: Realising the Benefits of Self-Driving Vehicles.
NPL Collaboration - LiDAR
The IV Sensors group is collaborating with the National Physical Laboratory (NPL) to investigate the presence and effect of interference on Light Detection and Ranging (LiDAR) sensors. We are looking to quantify the likely effect of any interference by simply placing LiDAR sensors in close proximity to each other and measuring the change in their output. Our methods enable us to understand whether interference poses a significant risk to the use and operation of LiDAR sensors fitted on vehicles operating in a real-world environment.
One of the commercial LiDAR sensors used in the experiment
LiDAR interference could have a profound effect in several driving situations, such as a highly automated vehicle attempting to cross an oncoming lane of traffic to enter a side road. The vehicle in this situation has to identify a safe gap in the oncoming lane and cross it in a timely manner. If the vehicles in the oncoming lane are also fitted with LiDAR sensors, interference could cause problems: extra detections could appear in the output of the LiDAR sensors which, when used by perception and planning algorithms, could severely hinder the vehicle's ability to complete its intended action. Hence, understanding the issue will help the development of solutions so that higher automation using LiDAR sensors can be realised.
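As an illustration of how such extra detections might be counted, here is a sketch assuming two point-cloud captures of the same static scene, one with the second LiDAR switched off (baseline) and one with it on; the 5 cm tolerance is an arbitrary choice, not a value from our experiments:

```python
import numpy as np
from scipy.spatial import cKDTree

def extra_detections(baseline_xyz, test_xyz, tol=0.05):
    """Count points in test_xyz (N, 3) with no baseline neighbour within
    tol metres; such points are candidate interference returns."""
    tree = cKDTree(baseline_xyz)
    dist, _ = tree.query(test_xyz, k=1)   # nearest baseline point per test point
    return int(np.sum(dist > tol))

# baseline_xyz: capture with the second LiDAR off; test_xyz: capture with it on
```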
Schematic view of overlapping FoV and potential area of interference
Noise Framework
Sensors operating in the real world are unlikely to be operating under ideal, noise-free conditions. For assisted and automated driving the situation is even worse, as the operating conditions on the road are uncontrollable and highly variable. Perception sensors are exposed to a wide variety of internal and external noise factors that have a varying impact on the sensor data. It is imperative to understand this effect: the sensor data is consumed by the vehicle's processing unit to understand the scene and plan the vehicle's actions, and if the data is compromised, it can lead to unsafe actions by the vehicle. Hence, creating noise models to apply to the sensor data is critical for virtual testing of assisted and automated driving systems.
We have created and published a framework to assist the identification of noise factors and an initial understanding of their impact on the sensor data [1][2]. Firstly, by considering the five noise factor types of the P-Diagram (Piece to Piece, Change over Time, Usage, Environment and System Interactions) with experts and from the literature, a list of noise factors is identified. Each noise factor is then further investigated to categorise how the sensor data will be affected. This breakdown provides an understanding of where a sensor noise factor model needs to be applied.
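As an illustration only (our own sketch, not software from the papers), the framework's categorisation can be encoded as a simple data structure:

```python
from dataclasses import dataclass, field
from enum import Enum

class FactorType(Enum):
    """The five P-Diagram noise factor types used by the framework."""
    PIECE_TO_PIECE = "Piece to Piece"
    CHANGE_OVER_TIME = "Change over Time"
    USAGE = "Usage"
    ENVIRONMENT = "Environment"
    SYSTEM_INTERACTIONS = "System Interactions"

@dataclass
class NoiseFactor:
    name: str
    factor_type: FactorType
    # Which sensor outputs the factor affects, e.g. {"I", "ToF"} for LiDAR
    affected_outputs: set[str] = field(default_factory=set)

weather = NoiseFactor("Weather", FactorType.ENVIRONMENT, {"I", "ToF"})
```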
[1] P. H. Chan, G. Dhadyalla and V. Donzella, "A Framework to Analyze Noise Factors of Automotive Perception Sensors," in IEEE Sensors Letters, vol. 4, no. 6, pp. 1-4, June 2020.
[2] B. Li, P. H. Chan, G. Baris, M. D. Higgins and V. Donzella, "Analysis of Automotive Camera Sensor Noise Factors and Impact on Object Detection," in IEEE Sensors Journal, vol. 22, no. 22, pp. 22210-22219, Nov. 2022.
LiDAR Noise
Our initial published work using this framework investigated LiDAR; it can be found on IEEE Xplore, or as a pre-print at WRAP (the University of Warwick open access research repository). A list of noise factors, with the associated breakdown of affected sensor data, is given in the table below.
In this table, each noise factor is broken down into the four ways in which it can affect LiDAR. LiDAR outputs two main pieces of information per measurement point: distance (generally based on time of flight, ToF) and intensity (I). Additionally, we have identified two further parameters that are important from a noise modelling perspective: angle of emission (ψ, ϕ) and position of measurement (x, y, z). As LiDAR provides intensity and range per point, the angle of emission is critical because it allows the returned point to be located in 3D space. As for the position of measurement, each beam from a LiDAR measures a small area; however, the beam can be affected by refraction, which changes the point of measurement.
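For reference, locating a returned point from its range and emission angles is the standard spherical-to-Cartesian conversion (a sketch; axis and angle conventions vary between LiDAR manufacturers):

```python
import numpy as np

def emission_to_xyz(r, psi, phi):
    """Map a per-point range r (m), azimuth psi and elevation phi (radians)
    to Cartesian x, y, z. An error in either angle therefore displaces the
    reconstructed point in 3D space, even if the range itself is correct."""
    x = r * np.cos(phi) * np.cos(psi)
    y = r * np.cos(phi) * np.sin(psi)
    z = r * np.sin(phi)
    return x, y, z
```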
| Factor Type | ID/Noise Factor | I | ToF | ψ, ϕ | x, y, z | Description |
|---|---|---|---|---|---|---|
| Piece to Piece | 01. Laser Diode | ✓ | ✓ | | | Light emission is affected by the variability of fabrication parameters. |
| Piece to Piece | 02. Mounting | | | ✓ | | Can affect the emission direction. |
| Change over Time | 03. Emitter | ✓ | ✓ | | | Fluctuation/degradation of emitter power, bias, wavelength shift. |
| Change over Time | 04. Mechanics | | | ✓ | | Wear in mechanical parts resulting in offsets and misplacement. |
| Change over Time | 05. Receiver | ✓ | ✓ | | | Degradation could result in a responsivity wavelength shift, leading to a lower or higher intensity recorded for a specific wavelength. |
| Change over Time | 06. Circuits | ✓ | ✓ | | | Electronic circuit components degradation/ageing over time. |
| Usage | 07. Multiple Returns | ✓ | | | ✓ | From multiple objects in the beam path, ground, beam divergence. |
| Usage | 08. Motion | | | ✓ | | Vehicle vibration, speed, acceleration, ground holes, etc. |
| Usage | 09. Clock Speed | | ✓ | | | The clock is used as reference for the ToF (instability, errors). |
| Usage | 10. Lens Damage | ✓ | | | ✓ | Dispersion effects reducing intensity; refraction may result in a return from a location that is not expected from the beam path. |
| Environment | 11. Weather | ✓ | ✓ | | | LiDAR is affected by weather conditions, such as rain, snow, fog, etc. |
| Environment | 12. Obstruction | ✓ | | | ✓ | Lens can be obstructed by objects, rain, mud, etc. Water drops can result in lensing effects, reduced intensity, etc. Mud can occlude the laser beam. |
| Environment | 13. Ambient Conditions | ✓ | ✓ | | | These conditions can affect light propagation. Temperature affects optical, electronic and mechanical components. Luminosity affects detector performance. |
| System Interactions | 14. Malicious Attacks | ✓ | | | ✓ | External systems can disrupt the emission and/or reception, e.g. by absorbing and re-emitting at altered times, or other methods. |
| System Interactions | 15. LiDARs | | | | ✓ | Other LiDAR units can cause interference, false detections, etc. |
| System Interactions | 16. EMI | ✓ | ✓ | | ✓ | Internal and external electrical components interactions. |
Camera Noise
We have published in the IEEE Sensors Journal a breakdown of automotive camera noise factors; it can be found on IEEE Xplore, or as a pre-print at WRAP (the University of Warwick open access research repository). A list of noise factors, with the associated breakdown of affected sensor data, is given in the table below.
Because noise can affect the camera data at many points, we have narrowed the list of affected parameters to the data coming out of the sensor (post capture and image signal processing). This scope allows noise to be applied to existing automotive datasets. Four parameters are identified: frame rate (FR), intensity per pixel (IRGB), position of pixel (P(x,y)) and dropped frames (DF). Intensity per pixel covers noise that affects one or more colour channels of a pixel, whereas position of pixel covers noise that affects all three colour channels at the same time. Dropped frames differ from frame rate in that they are frames which are not transmitted, or are corrupted and cannot be read, while the timing (frequency) between the frames that are delivered does not change.
Original frame from the KITTI Dataset
Image with post-processed application of lens occlusion and windscreen distortion
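A minimal sketch of applying one such noise factor, a lens occlusion, to a dataset frame, in the spirit of the example above; OpenCV is assumed, the file name is hypothetical and the ellipse parameters are arbitrary:

```python
import cv2
import numpy as np

def apply_occlusion(frame_bgr, centre, axes, blur_ksize=51):
    """Simulate a dirt/water patch on the lens: heavily blur an elliptical
    region and darken it, leaving the rest of the frame untouched."""
    mask = np.zeros(frame_bgr.shape[:2], dtype=np.uint8)
    cv2.ellipse(mask, centre, axes, 0, 0, 360, 255, -1)   # filled ellipse
    blurred = cv2.GaussianBlur(frame_bgr, (blur_ksize, blur_ksize), 0)
    out = frame_bgr.copy()
    out[mask == 255] = (blurred[mask == 255] * 0.6).astype(np.uint8)
    return out

frame = cv2.imread("kitti_000000.png")   # hypothetical KITTI frame
noisy = apply_occlusion(frame, centre=(300, 150), axes=(80, 50))
```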
| Factor Type | ID/Noise Factor | FR | IRGB | P(x,y) | DF | Description |
|---|---|---|---|---|---|---|
| Piece to Piece | 01. Alignment | | ✓ | ✓ | | Misalignment between sensor and lens during assembly. |
| Piece to Piece | 02. Fabrication Variability | ✓ | ✓ | | | CMOS fabrication variability (photodiode, circuitry, Analogue to Digital Converters). |
| Piece to Piece | 03. Lens Shape, Purity | | ✓ | ✓ | | Lens fabrication variability, resulting in non-ideal absorption and refractions. |
| Piece to Piece | 04. Dark Current Variability | | ✓ | | | There are mechanisms to compensate the current generated by the photodiodes in the absence of light, but usually this current varies from pixel to pixel. |
| Piece to Piece | 05. Image Signal Processing (ISP) | ✓ | ✓ | ✓ | | ISP alters the data gathered by the image sensor; implemented functions can include denoising, demosaicing, colour correction, white balancing, edge sharpening, etc. |
| Change over Time | 06. Ageing of Electronics | ✓ | ✓ | | | Degradation of the performance of the electronic components, resulting in effects such as increased/decreased resistance, leakage currents, etc. |
| Change over Time | 07. Degradation of Lens | | ✓ | ✓ | | Lens wear and ageing, resulting in attenuation and refractions. |
| Change over Time | 08. Vibration of Mounting | | ✓ | ✓ | | Long-term effect of vehicle vibrations resulting in loosening of the mounting. |
| Change over Time | 09. Pollutant Ingress | | ✓ | ✓ | ✓ | Ingress of particulates such as dust, water, condensation. |
| Change over Time | 10. Pixel Degradation | | ✓ | | | Exposure to electromagnetic waves resulting in degradation of silicon doping and reduction of pixel performance. |
| Change over Time | 11. Board Ageing | ✓ | ✓ | | ✓ | Printed circuit board degradation over time, such as whisker/dendrite growth and connector pin contact degradation. |
| Usage | 12. Misplacement of the Sensor | | ✓ | ✓ | | Change in the positioning of the sensor due to terrain and vehicle, causing a variation of the sensor coordinate system (axes and/or angle) with respect to the original calibration. |
| Usage | 13. Vehicle Impact | | ✓ | ✓ | | Impacts on the camera unit or vehicle which result in misalignment of the sensor/lens. |
| Usage | 14. Chemicals/Contaminants | | ✓ | ✓ | | Cleaning materials and chemicals may react with the lens surface and cause irreversible damage. |
| Usage | 15. Obstructions | | ✓ | ✓ | | During driving, materials/particles (e.g. water, stains, etc.) can obscure/refract the incoming light. |
| Usage | 16. Lens Scratch | | ✓ | ✓ | | Scratches can reflect and attenuate the incoming light differently to intended. |
| Usage | 17. Vehicle Dynamic Settings | | ✓ | ✓ | | Adjusting the height of the vehicle through weight, tyres, pitch, loaded weight, etc., thereby changing the sensor coordinate system with respect to the original calibration. |
| Environment | 18. Sensor Saturation or Depletion | | ✓ | | | Scenes with extreme brightness or very low luminosity (e.g. sunrise, sunset, exiting a tunnel) can cause saturation or depletion of areas of pixels and therefore an inaccurate rendering of the scene. |
| Environment | 19. Extreme Temperature | ✓ | ✓ | | ✓ | Sensor operating in conditions outside the manufacturer-recommended temperature. |
| Environment | 20. Adverse Weather | | ✓ | | | Conditions such as rain, snow, fog, sleet, frost, mist, etc. |
| Environment | 21. Optic Obstructions | | ✓ | ✓ | | Obstructions, such as mud, stains, frost, water spray, flies, etc., partially or fully on the lens or windshield, which can block or refract the light. |
| Environment | 22. Low Illumination | | ✓ | | | Low light requiring high pixel gain, creating a larger difference in output intensity between adjacent pixels due to increased noise. |
| Environment | 23. Sun | | ✓ | | | The sun can cause local saturation, lens flare, and IR detection in the colour channels. |
| System Interactions | 24. Malicious Attacks | ✓ | ✓ | ✓ | ✓ | Artificial alteration of the image (externally or internally), e.g. cyberattacks, external light source attacks, etc. |
| System Interactions | 25. Windshield | | | ✓ | | Curvature of the windshield, which changes the angle at which light enters the sensor. |
| System Interactions | 26. Power Supply | ✓ | ✓ | | ✓ | Unstable or varying supplied power causing variations in the generated signals. |
| System Interactions | 27. EMI | ✓ | ✓ | | ✓ | Electromagnetic Interference (EMI) from start-up/shut-down of electronics, motors, etc., inducing currents in the sensor wires and connections. |
| System Interactions | 28. Saturation of Buffer | | ✓ | ✓ | ✓ | Sensor internal buffer saturation, causing problems in the transmitted data flow, e.g. inability to process incoming data, incorrectly stored data, etc. |
| System Interactions | 29. LED Flicker | | ✓ | | | Pulsing LEDs in the environment resulting in fluctuations in the generated images. |
| System Interactions | 30. Localised Light Source | | ✓ | | | Headlights, flashlights, high beams, laser beams, etc. |
Simulation of LiDAR and Rain
Heavy rain affects object detection by autonomous vehicle LiDAR sensors
- Future fully autonomous vehicles will rely on sensors to operate; one such sensor type is LiDAR
- LiDAR sensors' effectiveness in detecting objects at a distance decreases in heavy rain, researchers from WMG, University of Warwick have found
- Researchers used the WMG 3xD simulator to test sensor detection of objects in rain, simulating real-world roads and weather
High-level autonomous vehicles (AVs) are promised by Original Equipment Manufacturers (OEMs) and technology companies to improve road safety as well as bring economic and societal benefits to us all. All high-level AVs rely heavily on sensors, and in the paper 'Realistic LiDAR with Noise Model for Real-Time Testing of Automated Vehicles in a Virtual Environment', published in the IEEE Sensors Journal, researchers from the Intelligent Vehicles group at WMG, University of Warwick have specifically simulated and evaluated the performance of LiDAR sensors in rain.
Using the WMG 3xD simulator, researchers tested an autonomous vehicle's LiDAR sensors in different intensities of rain while driving around a simulation of real roads in and around Coventry. The simulator is a key part of testing autonomous vehicles, which need to be validated over several million miles of driving; it allows them to be tested safely in an environment that faithfully reproduces real roads.
LiDAR sensors work by emitting numerous narrow beams of near-infrared light with circular/elliptical cross sections, and these can reflect off objects in their trajectories and return to the detector of the LiDAR sensor.
One of the issues with LiDAR sensors is the degradation of their performance in rain. If a LiDAR beam intersects a raindrop at a short distance from the transmitter, the raindrop can reflect enough of the beam back to the receiver for the raindrop to be detected as an object. Droplets can also absorb some of the emitted light, degrading the sensors' range performance.
Using different probabilistic rain models (from none to different intensities), researchers made it 'rain' in the WMG 3xD simulator and measured the LiDAR sensors' responses, recording false positive and false negative detections.
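A toy version of such a probabilistic rain model on per-beam ranges is sketched below; the probabilities are illustrative placeholders, not the calibrated values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_rain(ranges, rain_mm_per_hr, p_lost_per_mm=0.002,
               p_echo_per_mm=0.001, max_echo_range=50.0):
    """Toy rain model on per-beam ranges (m): with some probability a return
    is lost (absorption/scattering); with some probability it is replaced by
    a short-range false echo from a raindrop. Both scale with rain rate."""
    ranges = ranges.astype(float).copy()
    p_lost = min(1.0, p_lost_per_mm * rain_mm_per_hr)
    p_echo = min(1.0, p_echo_per_mm * rain_mm_per_hr)
    u = rng.random(ranges.shape)
    ranges[u < p_lost] = np.nan                   # missed detection
    echo = (u >= p_lost) & (u < p_lost + p_echo)  # disjoint event: false echo
    ranges[echo] = rng.uniform(0.5, max_echo_range, echo.sum())
    return ranges
```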
They found that it became more difficult for the sensors to detect objects as the rain intensity increased. Several raindrops were erroneously detected at short range from the vehicle (up to 50 m); at medium range (50-100 m) this effect decreased. However, as rainfall increased up to 50 mm per hour, the sensors' detection of objects at longer ranges degraded further.
Dr Valentina Donzella, from WMG, University of Warwick, comments: "Ultimately, we have confirmed that the detection of objects by LiDAR sensors is hindered the heavier the rain and the further away the objects are. This means that future research will have to investigate how to ensure LiDAR sensors can still detect objects sufficiently in a noisy environment.
“The developed real-time sensor and noise models will help to investigate these aspects further and may also inform autonomous vehicles manufacturers’ design choices, as more than one type of sensor will be needed to ensure the vehicle can detect objects in heavy rain.”
Ready to work with WMG?
Register your interest in our intelligent vehicles sensors research and start the conversation with us.