Journal article

3D object detection algorithm based on multi-sensor segmental fusion of frustum association for autonomous driving

Abstract

Current multimodal fusion methods for 3D object detection struggle to capture the rotation characteristics of point clouds, and a single fusion strategy cannot adequately balance detection accuracy and speed. Therefore, a multi-sensor segmental fusion method based on frustum association is proposed for 3D object detection in autonomous driving. A monocular camera, lidar, and radar are combined through piecewise, distributed feature-level fusion via frustum association. First, a fully convolutional network is used to obtain a 2D detection box and the center point of an object from an image, and a frustum is generated in 3D space according to the depth and scale information. Second, the region of interest in the lidar and radar point clouds is determined using the frustum association method. Then, spherical voxelization and spherical voxel convolution are performed on the lidar point cloud to extract rotation-invariant features. Finally, feature-level fusion is performed with the object attributes extracted from the image and the radar point cloud to improve the detection results. In addition, a dynamic adaptive neural network for the feature-fusion parameters is proposed; it obtains fusion features quickly while preserving the accuracy of the fusion results. The proposed method is compared with other algorithms on the nuScenes dataset and is further tested on the severe-weather dataset RADIATE and in a real-world scenario. Compared with other advanced methods, it achieves the highest NDS score and the highest average accuracy in severe weather. The experimental results indicate that the proposed method offers higher accuracy and greater adaptability across a variety of complex and severe weather driving environments.
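
The frustum-association step summarized above can be illustrated with a minimal Python sketch. The function name, its signature, and the assumption that the points have already been transformed into the camera frame are illustrative, not taken from the paper: the idea is simply that a point belongs to the region of interest when its image projection falls inside the 2D detection box and its depth lies in the estimated interval.

import numpy as np

def frustum_roi(points_xyz, K, box2d, depth_range):
    """Keep the points inside the frustum cast by a 2D detection box.

    points_xyz  -- (N, 3) lidar or radar points, already in the camera frame
    K           -- (3, 3) camera intrinsic matrix
    box2d       -- (u_min, v_min, u_max, v_max) 2D detection box in pixels
    depth_range -- (z_min, z_max) estimated depth interval for the object
    """
    # Discard points behind the camera before projecting.
    pts = points_xyz[points_xyz[:, 2] > 1e-6]
    z = pts[:, 2]
    # Pinhole projection into pixel coordinates.
    uv = (K @ pts.T).T
    u, v = uv[:, 0] / z, uv[:, 1] / z
    u_min, v_min, u_max, v_max = box2d
    z_min, z_max = depth_range
    # A point belongs to the region of interest if its projection lies
    # inside the 2D box and its depth lies in the estimated interval.
    mask = ((u >= u_min) & (u <= u_max) &
            (v >= v_min) & (v <= v_max) &
            (z >= z_min) & (z <= z_max))
    return pts[mask]

In the method described in the abstract, the lidar and radar points selected this way feed the subsequent spherical voxelization and feature-level fusion stages.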

Authors

Tao C; Bian W; Wang C; Li H; Gao Z; Zhang Z; Zheng S; Zhu Y

Journal

Applied Intelligence, Vol. 53, No. 19, pp. 22753–22774

Publisher

Springer Nature

Publication Date

October 1, 2023

DOI

10.1007/s10489-023-04630-4

ISSN

0924-669X
