
Resilient and Robust Sensing for Automated Systems for Transportation


Date: Wednesday 19th June 2024

Venue: Scarman, University of Warwick, CV4 7S


For the third year running, WMG, in association with AESIN, is pleased to announce the return of the Resilient and Robust Sensors Conference on Wednesday 19th June 2024 at the University of Warwick.

About the event

The conference brings together stakeholders from academia and industry, including national and international experts working on Automated Systems for Transportation.

You will have the opportunity to understand and discuss the latest technical advancements in resilient and robust sensing, and to consider the key challenges that must be overcome to ensure the safety of automated systems based on environmental perception sensors. The day is organised around four topics and includes panel sessions and contributions from world-leading experts in the field of sensors for automated transport systems.

Conference Topics:

- Sensor Data Quality, Degradation, and Metrics

- Sensor Testing, Modelling, and Validation

- Emerging Sensor Technologies (Quantum, Machine Learning, etc.)

- Perception, Fusion, Mitigations

Make a Poster Submission!

Submission deadline: 10pm on Wednesday 15th May 2024

Conference Agenda

| Time | Title | Speaker |
| --- | --- | --- |
| 09.00 – 09.25 | Registration | |
| 09.30 – 09.50 | Welcome (WMG) and Conference Overview (AESIN) | Prof. Valentina Donzella (WMG), Gunwant Dhadyalla (AESIN) |
| 09.50 – 10.15 | Keynote 1: Virtual Test Platform Qualification for Automated Driving Systems | Ghazal Farhani (Automotive and Surface Transportation, Canada) |
| 10.15 – 11.00 | Session 1: Sensing in adverse conditions and data degradation | Chair: Prof. Valentina Donzella |
| | Talk 1: Overcoming Disbelief – Detection of Unlikely Objects in Automotive Images | Dr Anthony Huggett (onsemi, UK) |
| | Talk 2: Enhancing sensing in the dark: Modelling and validating flare in automotive cameras | Boda Li (WMG) |
| 11.00 – 11.30 | Break | |
| 11.30 – 12.30 | Talk + Panel Session: "Sensing in the era of the Software Defined Vehicle" | Luca Cenciotti (JLR). Panellists: Dr Taufiq Rahman (CAV Lead, National Research Council Canada (NRC)), Dr Ciarán Eising (Associate Professor, University of Limerick), Kashif Siddiq (CEO, Oxford RF) |
| 12.30 – 12.45 | The Government R&D programme for Connected and Automated Mobility – A Sensing Perspective | Robert Vermeer (IUK) |
| 12.45 – 13.30 | Lunch | |
| 13.30 – 14.00 | Keynote 2: The ROADVIEW project and the development of a RADAR model under extreme weather | Yuri Poledna (THI and CARISSMA, Germany) |
| 14.00 – 14.45 | Session 2: Robust sensing, modelling, and safety | Chair: Dr Sepeedeh Shahbeigi |
| | Talk 1: Safety Assurance of the understanding function in autonomous systems | John Molloy (University of York, UK) |
| | Talk 2: Accurate LiDAR System Simulation for ADAS L3+ Development | Ahmed Yousif (Valeo) |
| 14.45 – 15.45 | Poster Session and Coffee | |
| 15.45 – 16.30 | Workshop – Sim4CAMSens CAV Catalogue | |
| 16.30 – 16.45 | How to engage with WMG, final remarks and close | Daniela Gonzalez (WMG), Valentina Donzella (WMG) |


Poster Presentations

| Title | Presenter | Description |
| --- | --- | --- |
| Synthetic Bayer Dataset with Headlight Flare | Boda Li | Flare (stray light) from automotive headlights, particularly at night, impacts camera data and complicates assisted and automated driving (AAD) functions. This research develops and validates a parameterised method to model flare effects, integrating the results into the CARLA simulator to enhance the accuracy of automotive camera data for AAD testing. |
| Correlating Image Quality Metrics and DNN-Based Detection for Automotive Camera Data | Daniel Gummadi | Assisted and automated driving (AAD) systems rely on camera sensors. This study finds that traditional image quality assessment metrics, such as SSIM and a retrained BRISQUE, correlate strongly with deep neural network (DNN) object detection performance under compression-induced image degradation, aiding prediction of perception degradation during AAD system development. |
| A New Approach for Bayer Adaption Techniques in Compression | Hetian Wang | Existing wired vehicle networks lack the bandwidth for next-generation assisted and automated driving sensors. Compressing Bayer images directly, rather than traditional RGB images, preserves data fidelity and reduces memory storage. This research introduces two novel colour space transform techniques for the H.264 codec, which enhance object detection performance, especially at bit rates of 700 to 1250 kb/frame. |
| Investigating the Mutual Interference of State-of-the-Art Automotive LiDAR Sensors | Milan Lovric | The growing use of LiDAR-equipped vehicles raises concerns about mutual LiDAR interference, but research is limited. This study replicates lab experiments outdoors and analyses mitigation by adjusting LiDAR angles. Results show detectable interference effects, but angle adjustment was not shown to be effective; further research is needed to separate interference from other noise factors. |
| Darwick: A Paired Dataset in Low-Light Driving Scenarios for Advanced Perceptual Enhancement and Benchmarking Assessment | Zixiang Wei | To address perception challenges in low-light vehicular environments, the Darwick dataset is introduced as a high-quality resource designed to enhance visual perception for assisted and automated driving functions. It is the first dataset focused on pixel-level paired low-light imaging tailored to diverse driving scenarios. |
| Benchmarking the Robustness of Panoptic Segmentation in Automated Driving | Yiting Wang | For safe assisted and automated driving, precise situational awareness is vital. This work proposes a pipeline to assess the robustness of panoptic segmentation models for AAD by correlating it with traditional image quality. It generates degraded camera data reflecting real-world noise factors and analyses how three segmentation networks' performance varies with selected image quality metrics. |
| Robust Downsampling for LiDAR Point Clouds in Assisted and Automated Driving | Jiangyin Sun | This study proposes a novel LiDAR point cloud processing algorithm that reduces data transmission size without losing essential information. Experimental results show stable detection performance, with average precision around 90%, remaining robust against rain noise up to 1 mm/h. |
| Influence of AVC and HEVC Compression on Detection of Vehicles Through Faster R-CNN | Harry Chan | This study explores data reduction techniques for vehicle perception sensors, focusing on lossy compression for cameras. Compression-tuned deep neural networks (DNNs) outperform traditional ones, maintaining steady performance at increasing compression rates (up to ~130:1). This suggests the potential of compression-tuned DNNs for automotive sensors. |
| ARDÁN: Automated Reference-free Defocus Characterization for Automotive Near-field Cameras | Daniel Jakab | Measuring optical quality in automotive camera lenses is critical for safety. ARDÁN evaluates horizontal slanted edges per the ISO 12233:2023 standard across four public datasets, using Region of Interest (ROI) selection, the mean MTF50 (the spatial frequency at which the Modulation Transfer Function falls to 50%) as the optical quality measure, and Regional Mask to Lens Alignment (RMLA) to remove occlusion and vignetting. |
| BEVSeg2GTA: Joint Vehicle Segmentation and GNNs for Ego Vehicle Trajectory Prediction in BEV | Sushil Sharma | Predicting the ego vehicle's trajectory is crucial for autonomous driving but is complicated by real-world conditions. BEVSeg2GTA integrates perception and trajectory prediction, using EfficientNet and a Graph Neural Network (GNN) to enhance accuracy. |
| Deformable Convolution Based Road Scene Semantic Segmentation of Fisheye Images in Autonomous Driving | Anam Manzoor | This work enhances semantic segmentation in autonomous driving by addressing the challenges of fisheye images, which have wide fields of view and geometric distortions. It evaluates the effectiveness of deformable convolutions, which adapt to these distortions, using U-Net, and explores multi-view scene processing to assess model versatility and improve segmentation accuracy. |
| Camera-Radar Fusion in Autonomous Vehicles for Perception Tasks | Sanjay Kumar | This research enhances autonomous vehicle perception by fusing radar and RGB camera data into a Bird's Eye View format, improving accuracy in tracking, map segmentation, and 3D object detection. Intermediate and late-stage fusion techniques optimise data integration for reliable performance in diverse driving conditions. |
| Automated Measurement of Visible Warnings on Vehicle Instrument Cluster | William Dunnion | Test engineers manually process videos to identify ADAS activation, a time-consuming task. This research automates the process with a GigE PTP camera and YOLOv5 image processing, ensuring synchronised data acquisition and improved efficiency. |
| Velocity Driven Vision: Asynchronous Sensor Fusion Bird's Eye View Models for Autonomous Vehicles | Seamie Hayes | This study addresses the challenges of fusing asynchronous sensor modalities, improving spatial and temporal alignment. By using velocity information, the research narrowed the gap between camera + radar (C+R) and camera + LiDAR (C+L) fusion from 5.1 IoU to 2.7 IoU. |
| Advanced 6D Radar for Software-Defined Vehicles | Dr Kashif Siddiq | The Oxford RF 6D radar for software-defined vehicles offers higher-precision localisation and enhanced target kinematics with fewer sensors. It provides superior cross-range 3D Synthetic Aperture Radar imaging, better noise and weather immunity, and improved reliability. |
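Several of the posters above share the same experimental loop: degrade camera frames in a controlled way, score them with an image quality metric, and check how a detector's output tracks that score. As a purely illustrative sketch (this is not any presenter's code; the file name and the placeholder detector are stand-ins), the loop might look like this in Python with scikit-image and SciPy:

```python
import io

import numpy as np
from PIL import Image
from scipy.stats import pearsonr
from skimage.metrics import structural_similarity as ssim


def jpeg_degrade(img: Image.Image, quality: int) -> Image.Image:
    """Round-trip an image through JPEG at a given quality setting."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB")


def detector_score(img: Image.Image) -> float:
    """Stand-in for a real DNN detector score (e.g. mean box confidence).

    A real study would run YOLO/Faster R-CNN inference here; this
    placeholder only keeps the sketch runnable end to end.
    """
    return float(np.asarray(img, dtype=float).std())


reference = Image.open("frame.png").convert("RGB")  # hypothetical test frame
ref_arr = np.asarray(reference)

ssim_values, det_scores = [], []
for quality in (90, 70, 50, 30, 10):
    degraded = jpeg_degrade(reference, quality)
    ssim_values.append(
        ssim(ref_arr, np.asarray(degraded), channel_axis=2, data_range=255)
    )
    det_scores.append(detector_score(degraded))

# A strong positive correlation would support using the image quality
# metric as a proxy for detection robustness under compression.
r, p = pearsonr(ssim_values, det_scores)
print(f"Pearson r = {r:.3f} (p = {p:.3g})")
```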
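Likewise, the LiDAR point cloud downsampling work above aims to shrink clouds without losing essential structure; the usual baseline for that task is voxel-grid downsampling. Below is a minimal NumPy sketch of that baseline (not the poster's algorithm, which the abstract does not specify): each point is binned into a voxel, and every occupied voxel is replaced by the centroid of its points.

```python
import numpy as np


def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Replace every occupied voxel with the centroid of its points.

    points: (N, 3) array of x, y, z coordinates in metres.
    """
    # Map each point to an integer voxel index.
    voxel_idx = np.floor(points / voxel_size).astype(np.int64)
    # Group points sharing a voxel; `inverse` maps each point to its group.
    _, inverse, counts = np.unique(
        voxel_idx, axis=0, return_inverse=True, return_counts=True
    )
    # Accumulate per-voxel coordinate sums, then divide by point counts.
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)
    return centroids / counts[:, None]


# Example: reduce a synthetic 100k-point cloud with a 0.2 m voxel grid.
cloud = np.random.uniform(-50.0, 50.0, size=(100_000, 3))
reduced = voxel_downsample(cloud, voxel_size=0.2)
print(f"{cloud.shape[0]} -> {reduced.shape[0]} points")
```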

WMG's Intelligent Vehicles Sensors Group has a focus on "Robust Sensing", as the quality of sensor data is key to any decision-making process in these systems, whether based on traditional algorithms or AI. The group is led by Prof. Valentina Donzella, who will be facilitating the conference.

Prof. Donzella will be joined by Gunwant Dhadyalla, AESIN Director and an engineering leader with over 30 years' experience in the automotive industry.

AESIN, launched in 2012, is an outstanding member-based community committed to the next generation of UK-centric automotive electronics and software systems and supply chains.

See below for highlights from the 2023 event.