
Automated vehicles rely on onboard sensors to perceive their surroundings and navigate autonomously. However, sensor performance may degrade under adverse weather conditions or when line-of-sight is obstructed. Cooperative perception (or collective perception) is expected to mitigate these limitations by enabling Connected and Automated Vehicles (CAVs) to share sensor data and collaboratively enhance situational awareness. Several studies have analyzed the potential of cooperative perception, yet the fusion of V2X data with information from onboard sensors has received limited focus. V2X data may contain errors that affect the quality of the fused data, and hence the effectiveness of cooperative perception. This study analyzes the impact of sensing measurement errors, V2X packet losses, and GNSS inaccuracies on the effectiveness of cooperative perception. The results highlight the potential of cooperative perception to enhance perception levels and range compared to using onboard sensors alone. However, they also identify key challenges related to the generation of ghost vehicles during the fusion process, which must be addressed to prevent V2X data from introducing additional errors when fused with onboard sensor data.
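The ghost-vehicle issue raised in the abstract can be illustrated with a minimal sketch (not the study's actual fusion algorithm; the nearest-neighbor gating, the 2 m association gate, and all coordinates are assumptions for illustration): when the GNSS error in a V2X-reported position exceeds the association gate, the fusion step fails to match the shared object to the ego vehicle's own detection of the same vehicle, and a duplicate "ghost" object is created.

```python
import math

# Assumed association gate in meters (illustrative value, not from the paper).
GATE_M = 2.0

def fuse(onboard, v2x, gate=GATE_M):
    """Fuse ego-sensor detections with V2X-reported object positions.

    A V2X object is associated with an onboard detection if it lies within
    the gate; unmatched V2X objects are appended as new objects -- which
    produces a ghost vehicle when the mismatch is only due to GNSS error.
    """
    fused = list(onboard)
    for vx, vy in v2x:
        matched = any(math.hypot(vx - ox, vy - oy) <= gate
                      for ox, oy in onboard)
        if not matched:
            fused.append((vx, vy))
    return fused

onboard = [(10.0, 0.0)]    # ego sensor detects one vehicle ahead
v2x_ok  = [(10.5, 0.3)]    # small GNSS error: associates correctly
v2x_bad = [(13.0, 0.0)]    # 3 m GNSS error: fails the gate

print(len(fuse(onboard, v2x_ok)))   # 1 object (correctly fused)
print(len(fuse(onboard, v2x_bad)))  # 2 objects (real vehicle + ghost)
```

The sketch shows why V2X data can degrade rather than improve perception: the same physical vehicle is counted twice whenever positioning error exceeds the association tolerance.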
How the Fusion of Onboard Sensors and V2X Data can improve (or not) the Cooperative Perception of Connected Automated Vehicles
Amir Mohammadisarab, Miguel Sepulcre, Luca Lusvarghi, and Javier Gozálvez, Universidad Miguel Hernandez de Elche (UMH)