Journal article

Robust Spatial Consistency Graph Model for Partial Duplicate Image Retrieval

Abstract

Partial duplicate images often have large non-duplicate regions and small duplicate regions with random rotation, which leads to the following problems: 1) a large number of noisy features from the non-duplicate regions; 2) a small number of representative features from the duplicate regions; 3) randomly rotated or deformed duplicate regions. These problems challenge many content-based image retrieval (CBIR) approaches, since most of them cannot distinguish the representative features from a large proportion of noisy features in a rotation-invariant way. In this paper, we propose a rotation-invariant partial duplicate image retrieval (PDIR) approach, which effectively and efficiently retrieves partial duplicate images by accurately matching the representative SIFT features. Our method is based on the Combined-Orientation-Position (COP) consistency graph model, which consists of two parts: 1) the COP consistency, a rotation-invariant measure of the relative spatial consistency among candidate matches of SIFT features; it uses a coarse-to-fine family of evenly sectored polar coordinate systems to softly quantize and combine the orientations and positions of the SIFT features; and 2) the consistency graph model, which robustly rejects spatially inconsistent noisy features by detecting the group of candidate feature matches with the largest average COP consistency. Extensive experiments on five large-scale image data sets show promising retrieval performance.
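The two ingredients sketched in the abstract can be illustrated in a few lines. The sketch below is not the paper's implementation: the actual COP model uses a coarse-to-fine family of polar sector quantizations with soft assignment, whereas this illustration uses a single hard sector quantization and a greedy graph-filtering step; the feature/match data structures (`x`, `y`, `ori` fields) are assumptions for the example.

```python
import math

def cop_bin(anchor, other, n_sectors=8):
    """Quantize the direction from 'anchor' to 'other' in a polar
    coordinate system rotated by the anchor's SIFT orientation,
    which makes the sector index rotation invariant."""
    dx = other["x"] - anchor["x"]
    dy = other["y"] - anchor["y"]
    angle = math.atan2(dy, dx) - anchor["ori"]  # align to dominant orientation
    return int((angle % (2 * math.pi)) / (2 * math.pi) * n_sectors)

def pair_consistent(m_i, m_j, n_sectors=8):
    """Two candidate matches are spatially consistent if the relative
    polar sector of j w.r.t. i agrees in the query and database images."""
    return (cop_bin(m_i["q"], m_j["q"], n_sectors) ==
            cop_bin(m_i["d"], m_j["d"], n_sectors))

def filter_matches(matches, n_sectors=8, min_avg=0.5):
    """Greedy stand-in for the consistency graph model: repeatedly drop
    the match with the lowest average pairwise consistency until every
    remaining match clears the min_avg threshold."""
    kept = list(matches)
    while len(kept) > 1:
        scores = []
        for i, mi in enumerate(kept):
            s = sum(pair_consistent(mi, mj, n_sectors)
                    for j, mj in enumerate(kept) if j != i)
            scores.append(s / (len(kept) - 1))
        if min(scores) >= min_avg:
            break
        kept.pop(scores.index(min(scores)))
    return kept
```

Because each sector index is measured relative to the anchor feature's own orientation, rotating the duplicate region shifts both terms equally and leaves the index unchanged; noisy matches from non-duplicate regions tend to disagree with the consistent group and are pruned.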

Authors

Chu L; Jiang S; Wang S; Zhang Y; Huang Q

Journal

IEEE Transactions on Multimedia, Vol. 15, No. 8, pp. 1982–1996

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

December 1, 2013

DOI

10.1109/TMM.2013.2270455

ISSN

1520-9210
