Journal article

2DTriPnP: A Robust Two-Dimensional Method for Fine Visual Localization Using Google Streetview Database

Abstract

Google provides the complete camera pose (location and orientation) of Google Street View (GSV) images, so this information can be used to localize a query camera via projective geometry. Existing works either perform image retrieval-based rough location recognition or require high computational power and specific features for three-dimensional fine localization. In this paper, we propose a robust 2-D method for outdoor image-based localization using the GSV database. Once the nearest neighboring images (best matches) in the GSV database have been found, using image retrieval techniques or the GPS circle information, the proposed method can be applied for robust fine localization of pedestrians or vehicles. The method first finds the features common to three views: the query view and two of the best matches. Next, for each common feature, a 2-D triangulation is performed using the retrieved database images to find the feature's world coordinates; we call this procedure 2DTri. Afterward, a novel set of nonlinear equations is solved to estimate the fine location of the query. This set of equations can be interpreted as a 2-D version of the well-known perspective-n-point (PnP) problem, which we call 2DPnP; hence, the proposed method is named 2DTriPnP. The 2DPnP step is performed in a robust manner that is more accurate and considerably less complex than conventional RANSAC-based robust methods. Experiments demonstrate that 2DTriPnP achieves better localization performance than other state-of-the-art methods.
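
To make the two geometric steps named in the abstract concrete, the following is a minimal sketch, not the authors' implementation: it assumes a bearing-angle formulation on the ground plane, and the function names (`triangulate_2d`, `pnp_2d`) and the use of ordinary nonlinear least squares in place of the paper's robust 2DPnP solver are illustrative choices, not taken from the source.

```python
# Hypothetical sketch of the 2DTri and 2DPnP ideas described in the abstract.
# Assumes features and cameras live on the 2-D ground plane and that each
# observation is a bearing angle; these modeling choices are assumptions.
import numpy as np
from scipy.optimize import least_squares

def triangulate_2d(p1, theta1, p2, theta2):
    """2DTri-style step: intersect two ground-plane bearing rays cast from
    known GSV camera positions p1, p2 to estimate a feature's world coordinates."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters (t1, t2).
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1

def pnp_2d(features, bearings, x0=(0.0, 0.0, 0.0)):
    """2DPnP-style step: estimate the query camera's 2-D position (x, y) and
    heading psi from bearing angles to features with known world coordinates."""
    features = np.asarray(features, float)
    bearings = np.asarray(bearings, float)

    def residuals(pose):
        x, y, psi = pose
        predicted = np.arctan2(features[:, 1] - y, features[:, 0] - x) - psi
        err = predicted - bearings
        # Wrap angular residuals to (-pi, pi] before least-squares fitting.
        return np.arctan2(np.sin(err), np.cos(err))

    return least_squares(residuals, x0).x  # (x, y, psi) estimate
```

In this sketch, features triangulated from two retrieved GSV views feed directly into `pnp_2d`; the paper's actual robust 2DPnP formulation, which it contrasts with RANSAC-based methods, is not reproduced here.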

Authors

Sadeghi H; Valaee S; Shirani S

Journal

IEEE Transactions on Vehicular Technology, Vol. 66, No. 6, pp. 4678–4690

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

June 1, 2017

DOI

10.1109/tvt.2016.2615630

ISSN

0018-9545
