Exploring auxiliary context: Discrete semantic transfer hashing for scalable image retrieval

Publication Type:
Journal Article
Citation:
IEEE Transactions on Neural Networks and Learning Systems, 2018, 29 (11), pp. 5264 - 5276
Issue Date:
2018-11-01
File: 08291840.pdf (Published Version, Adobe PDF, 2.46 MB)
Abstract:
© 2018 IEEE. Unsupervised hashing can desirably support scalable content-based image retrieval thanks to its appealing advantages of semantic-label independence and memory and search efficiency. However, the learned hash codes carry only limited discriminative semantics because of the intrinsic limitations of image representation. To address this problem, in this paper we propose a novel hashing approach, dubbed discrete semantic transfer hashing (DSTH). The key idea is to directly augment the semantics of the discrete image hash codes by exploring auxiliary contextual modalities. To this end, a unified hashing framework is formulated to simultaneously preserve the visual similarities of images and perform semantic transfer from the contextual modalities. Furthermore, to guarantee direct semantic transfer and avoid information loss, we explicitly impose the discrete constraint, the bit-uncorrelation constraint, and the bit-balance constraint on the hash codes. A novel and effective discrete optimization method based on the augmented Lagrangian multiplier is developed to solve the optimization problem iteratively. The whole learning process has linear computational complexity and desirable scalability. Experiments on three benchmark data sets demonstrate the superiority of DSTH over several state-of-the-art approaches.
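
For readers unfamiliar with the three code constraints named in the abstract, the following is a minimal illustrative sketch of how such a constrained hashing objective is commonly written; it is not the paper's exact formulation, and the symbols B, n, r, lambda, and the two loss terms are assumptions introduced here for illustration.

\begin{align}
  \min_{B}\ \ & \mathcal{L}_{\mathrm{visual}}(B) \;+\; \lambda\,\mathcal{L}_{\mathrm{transfer}}(B) \\
  \text{s.t.}\ \ & B \in \{-1,+1\}^{n \times r} && \text{(discrete constraint: binary hash codes)} \nonumber\\
  & B^{\top}\mathbf{1}_{n} = \mathbf{0} && \text{(bit-balance: each bit is $+1$ and $-1$ equally often)} \nonumber\\
  & B^{\top}B = n\,I_{r} && \text{(bit-uncorrelation: bits are mutually decorrelated)} \nonumber
\end{align}

Here B would collect the r-bit codes of n images, the first term preserves visual similarities, and the second performs semantic transfer from the contextual modalities; the abstract's augmented-Lagrangian scheme is what allows such constraints to be handled iteratively while keeping the codes discrete, with the exact objective and solver given in the paper.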