Expressive Local Feature Match for Image Search

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Content-based image retrieval (CBIR) aims to retrieve, from a large pool of images, those most similar to a given query. Existing state-of-the-art methods typically extract a compact global feature vector for each image and then compare these vectors to evaluate similarity. Although some CBIR works use local features to obtain better retrieval results, they either require extra codebook training or rely on re-ranking to improve the retrieved results. In this work, we propose a many-to-many local feature matching scheme for large-scale CBIR tasks. Unlike existing local-feature-based algorithms, which tend to extract large numbers of low-dimensional local features from each image, the proposed approach models a characteristic feature representation for each image, aiming to employ fewer but more expressive local features. Characteristic latent features are selected using k-means clustering and then fed into a similarity measure, without complex matching kernels or codebook references. Despite the straightforwardness of the proposed CBIR method, experimental results indicate state-of-the-art performance on several benchmark datasets.
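The abstract does not spell out the exact similarity measure, so the following is only a minimal sketch of the idea it describes: select a few "characteristic" local features per image with k-means, then score a query against a database image by many-to-many matching. The feature dimensionality, the number of clusters `k`, and the choice of cosine similarity with a best-match mean are all illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def kmeans(feats, k, iters=20, seed=0):
    """Plain Lloyd's-algorithm k-means; returns k centroids."""
    rng = np.random.default_rng(seed)
    centroids = feats[rng.choice(len(feats), k, replace=False)]
    for _ in range(iters):
        # assign each local feature to its nearest centroid
        dists = np.linalg.norm(feats[:, None] - centroids[None], axis=-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = feats[labels == j].mean(axis=0)
    return centroids

def characteristic_features(feats, k=8):
    """Summarise an image's many local features by k cluster centroids,
    L2-normalised so dot products are cosine similarities.
    (Illustrative stand-in for the paper's feature selection.)"""
    c = kmeans(feats, k)
    return c / np.linalg.norm(c, axis=1, keepdims=True)

def many_to_many_similarity(query_feats, db_feats):
    """Many-to-many match: every query feature is matched to its best
    database feature; the image score is the mean of those best matches.
    (Assumed scoring rule; the abstract only says 'similarity measure'.)"""
    sims = query_feats @ db_feats.T          # (k_q, k_db) cosine similarities
    return sims.max(axis=1).mean()

# Toy usage with random stand-in local features (e.g. from a CNN backbone).
rng = np.random.default_rng(0)
local_feats = rng.normal(size=(200, 64))     # 200 local descriptors, 64-dim
q = characteristic_features(local_feats, k=8)
score = many_to_many_similarity(q, q)        # self-match scores 1.0
```

Because no codebook is shared across images, each image's centroids are computed independently, which matches the abstract's claim of avoiding codebook training.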
Original language: English
Title of host publication: International Conference on Pattern Recognition (ICPR)
Place of publication: Montréal, Québec, Canada
Publisher: IEEE
Pages: 1386-1392
Publication status: Published - Jul 2022

Bibliographical note

© IEEE 2022. This is an author-produced version of the published paper, uploaded in accordance with the publisher's self-archiving policy. Further copying may not be permitted; contact the publisher for details.
