Deep Supervised Hashing using Symmetric Relative Entropy

Research output: Contribution to journal › Article

Publication details

Journal: Pattern Recognition Letters
Date: Accepted/In press - 10 Jul 2019
Date: Published (current) - 11 Jul 2019
Original language: English

Abstract

By virtue of their simplicity and efficiency, hashing algorithms have achieved significant success in large-scale approximate nearest neighbor search. Recently, many deep neural network based hashing methods have been proposed to improve search accuracy by simultaneously learning both the feature representation and the binary hash functions. Most deep hashing methods depend on supervised semantic label information to preserve the distance or similarity between local structures, which unfortunately ignores the global distribution of the learned hash codes. We propose a novel deep supervised hashing method that aims to minimize the information loss incurred during the embedding process. Specifically, the information loss is measured by the Jensen-Shannon divergence, which ensures that the compact hash codes follow a distribution similar to that of the original images. Experimental results show that our method outperforms current state-of-the-art approaches on two benchmark datasets.
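The "symmetric relative entropy" of the title is the Jensen-Shannon divergence used in the abstract as the information-loss measure. The Python snippet below is a minimal illustrative sketch of that divergence between two discrete distributions, not the authors' implementation; the example vectors p_images and q_codes are hypothetical stand-ins for the image-space and hash-code distributions.

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # Kullback-Leibler divergence KL(p || q) for discrete distributions;
    # eps avoids log(0) for zero-probability bins.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def js_divergence(p, q):
    # Jensen-Shannon divergence: KL symmetrised against the mixture m,
    # i.e. 0.5*KL(p || m) + 0.5*KL(q || m) with m = (p + q) / 2.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Hypothetical distributions for illustration only:
p_images = [0.10, 0.40, 0.30, 0.20]  # distribution in the original image space (assumed)
q_codes = [0.15, 0.35, 0.30, 0.20]   # distribution of the learned hash codes (assumed)
print(js_divergence(p_images, q_codes))

Because the Jensen-Shannon divergence is symmetric in its two arguments, it serves as a symmetric form of relative entropy, which is how the title's terminology relates to the abstract.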

Bibliographical note

© 2019 Published by Elsevier B.V. This is an author-produced version of the published paper. Uploaded in accordance with the publisher’s self-archiving policy.
