Object Detection and Tracking with YOLO and the Sliding Innovation Filter (Journal Article)

abstract

  • Object detection and tracking are pivotal tasks in machine learning, particularly within computer vision. Despite significant advances in object detection frameworks, real-world tracking remains challenging due to object interactions, occlusions, and background interference, and most existing algorithms struggle in the presence of disturbances and uncertain environments. This research proposes a novel approach that integrates the You Only Look Once (YOLO) architecture for object detection with a robust filter for target tracking, addressing these disturbances and uncertainties. YOLO, known for its real-time detection capabilities, performs the initial object detection and centroid localization. In combination with this detection framework, the sliding innovation filter, a recently proposed robust filter, estimates the optimal centroid location in each frame and updates the object's trajectory, improving tracking reliability under disturbances. Target tracking has traditionally relied on estimation-theory techniques such as the Kalman filter; the sliding innovation filter is introduced as a robust alternative that is particularly suitable when a priori information about the system dynamics and noise is limited. Experimental simulations in a surveillance scenario demonstrate that the sliding innovation filter-based tracking approach outperforms existing Kalman-based methods, especially in the presence of disturbances. Overall, this research contributes a practical and effective approach to object detection and tracking in dynamic, real-world environments.
    The comparative analysis with traditional filters provides practical insights, laying the groundwork for future work aimed at advancing multi-object detection and tracking capabilities in diverse applications.
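To make the tracking step concrete, below is a minimal sketch of how a sliding innovation filter could update a tracked centroid from per-frame YOLO detections. The constant-velocity motion model, the function name, and the boundary-layer width `delta` are illustrative assumptions for this sketch, not details taken from the paper; the SIF gain follows the commonly published form (pseudoinverse of the measurement matrix times a saturated innovation ratio).

```python
import numpy as np

def sif_track(measurements, dt=1.0, delta=5.0):
    """Track a 2D centroid with a constant-velocity sliding innovation filter (SIF).

    State: [x, y, vx, vy]; measurement: detected centroid [x, y].
    `delta` is the sliding boundary-layer width, a tuning parameter
    (assumed here; the paper does not specify a value).
    """
    # Constant-velocity state transition and position-only measurement model.
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    H_pinv = np.linalg.pinv(H)

    # Initialize the state from the first detection, zero velocity.
    x = np.array([measurements[0][0], measurements[0][1], 0.0, 0.0])
    track = [x[:2].copy()]
    for z in measurements[1:]:
        x = F @ x                              # predict
        innov = np.asarray(z, float) - H @ x   # innovation (measurement residual)
        # SIF gain: saturate each innovation component against the
        # boundary layer, then map back through the pseudoinverse of H.
        sat = np.clip(np.abs(innov) / delta, 0.0, 1.0)
        K = H_pinv @ np.diag(sat)
        x = x + K @ innov                      # update
        track.append(x[:2].copy())
    return np.array(track)
```

When the innovation exceeds the boundary layer, the saturation term clamps to 1 and the estimate snaps to the measurement; inside the boundary layer, the correction is scaled down, which is what gives the SIF its robustness to disturbances relative to a fixed Kalman gain.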

publication date

  • March 26, 2024