📢 We are excited to announce Adver-City, 𝘁𝗵𝗲 𝗳𝗶𝗿𝘀𝘁 𝗼𝗽𝗲𝗻-𝘀𝗼𝘂𝗿𝗰𝗲 𝗰𝗼𝗹𝗹𝗮𝗯𝗼𝗿𝗮𝘁𝗶𝘃𝗲 𝗽𝗲𝗿𝗰𝗲𝗽𝘁𝗶𝗼𝗻 𝗱𝗮𝘁𝗮𝘀𝗲𝘁 𝗳𝗼𝗰𝘂𝘀𝗲𝗱 𝗼𝗻 𝗮𝗱𝘃𝗲𝗿𝘀𝗲 𝘄𝗲𝗮𝘁𝗵𝗲𝗿 𝗰𝗼𝗻𝗱𝗶𝘁𝗶𝗼𝗻𝘀!
𝗔𝘂𝘁𝗼𝗻𝗼𝗺𝗼𝘂𝘀 𝗩𝗲𝗵𝗶𝗰𝗹𝗲𝘀 (AVs) have come a long way, but adverse weather conditions still pose a significant challenge to their widespread adoption by impairing sensors like LiDARs and cameras. Even though 𝗖𝗼𝗹𝗹𝗮𝗯𝗼𝗿𝗮𝘁𝗶𝘃𝗲 𝗣𝗲𝗿𝗰𝗲𝗽𝘁𝗶𝗼𝗻 (CP) improves an AV's perception in difficult conditions, existing CP datasets lack adverse weather conditions.
Simulated in CARLA with OpenCDA, Adver-City features over 24k frames and 890k annotations from 110 unique scenarios across six weather conditions: clear weather, soft rain, heavy rain, fog, foggy heavy rain, and, for the first time in a synthetic CP dataset, glare.
Its scenarios are 𝗯𝗮𝘀𝗲𝗱 𝗼𝗻 𝗿𝗲𝗮𝗹 𝗰𝗿𝗮𝘀𝗵 𝗿𝗲𝗽𝗼𝗿𝘁𝘀 𝗮𝗻𝗱 𝗱𝗲𝗽𝗶𝗰𝘁 𝘁𝗵𝗲 𝗺𝗼𝘀𝘁 𝗿𝗲𝗹𝗲𝘃𝗮𝗻𝘁 𝗿𝗼𝗮𝗱 𝗰𝗼𝗻𝗳𝗶𝗴𝘂𝗿𝗮𝘁𝗶𝗼𝗻𝘀 𝗳𝗼𝗿 𝗮𝗱𝘃𝗲𝗿𝘀𝗲 𝘄𝗲𝗮𝘁𝗵𝗲𝗿 and poor visibility conditions. The scenarios vary in object density, with both dense and sparse scenes, enabling novel testing conditions for AI models.
It includes six object categories, among them pedestrians and cyclists, and uses data from vehicles and Roadside Units (RSUs) equipped with LiDARs, RGB and semantic segmentation cameras, GNSS, and IMUs.
💻 Website: https://lnkd.in/gh2kaeWi
📄 Paper: https://lnkd.in/gZhQT2xf
🤖 GitHub: https://lnkd.in/g52kfwZT
#autonomousdriving #computervision #deeplearning #machinelearning #v2x #v2v #v2i #carla #ai #automotivetechnology #smartmobility #automatedvehicles #cooperativedriving #collaborativeperception #github #opensource #lidar #gnss