
A Camera-LiDAR Fusion Framework for Traffic Monitoring

Abstract

Intelligent Transportation Systems (ITS) equipped with LiDAR and sensor-fusion technology can provide critical information to increase road safety. Traffic monitoring systems can supply this information through accurate, real-time detection and tracking, but they require consistent performance in all environmental conditions. Recent developments in LiDAR technology have opened new opportunities in autonomous vehicles, smart infrastructure, and surveillance. With high 3D accuracy and robust performance in adverse lighting and weather conditions compared to cameras alone, LiDAR stands to become commonplace in ITS applications. To further traffic monitoring systems, sensor fusion leverages the complementary characteristics of multiple sensors to enhance detection performance in challenging conditions. However, many existing perception frameworks require computer hardware that is prohibitively expensive for widespread deployment and real-time operation. This paper presents the following contributions: (1) a real-time multi-object detection and tracking pipeline using camera-LiDAR fusion, and (2) the Center for Mechatronics and Hybrid Technologies (CMHT) Traffic Dataset, containing synchronized camera and LiDAR data. The proposed fusion framework is evaluated on the real-world CMHT Traffic Dataset, achieving a Higher Order Tracking Accuracy (HOTA) score 3 points higher than LiDAR-only or camera-only approaches.

Authors

Sochaniwsky A; Huangfu Y; Habibi S; Von Mohrenschildt M; Ahmed R; Bhuiyan M; Wyndham-West K; Vidal C

Volume

00

Pagination

pp. 1-6

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

June 21, 2024

DOI

10.1109/itec60657.2024.10598852

Name of conference

2024 IEEE Transportation Electrification Conference and Expo (ITEC)
