Title: IMPROVED CAMERA-RADAR FUSION FOR ACCURATE OBJECT DETECTION AND TRACKING IN AUTONOMOUS DRIVING
Author: SHEN LYUYU
Department: MECHANICAL ENGINEERING
Issue Date: 2023-08-19
Date: 2024-03-31
Citation: SHEN LYUYU (2023-08-19). IMPROVED CAMERA-RADAR FUSION FOR ACCURATE OBJECT DETECTION AND TRACKING IN AUTONOMOUS DRIVING. ScholarBank@NUS Repository.
URI: https://scholarbank.nus.edu.sg/handle/10635/247641

Abstract: In the realm of autonomous driving, the perception system stands at the forefront of vehicle intelligence, enabling vehicles to interpret and react to their environment. To achieve this, a typical perception system integrates input signals from various onboard sensors such as cameras, LiDARs, and radars to estimate surrounding dynamic objects and the environment (object detection and tracking). Despite the resourceful techniques embedded within current systems, they show shortcomings in sensing dynamic objects and in fully leveraging automotive sensors' capabilities. Specifically, current tracking methods can be improved in terms of their interaction with the environment, and there is a notable gap in research on high-level perception tasks using automotive radar. Hence, this thesis emphasizes two pivotal components: camera-based object tracking and camera-radar object detection.

Language: en
Keywords: autonomous driving, perception, radar, sensor fusion, detection, tracking
Type: Thesis
ORCID: 0000-0001-7435-5072