Deep unsupervised discrete cross-modal hashing based on knowledge distillation
ZHANG Cheng, WAN Yuan, QIANG Haopeng
Journal of Computer Applications, 2021, 41(9): 2523-2531.
DOI: 10.11772/j.issn.1001-9081.2020111785
Cross-modal hashing has attracted much attention due to its low storage cost and high retrieval efficiency. Most existing cross-modal hashing methods require inter-instance association information provided by additional manual labels. However, the deep features learned by pre-trained deep unsupervised cross-modal hashing methods can provide similar information. In addition, the discrete constraints are relaxed during hash code learning, resulting in a large quantization loss. To solve these two issues, a Deep Unsupervised Discrete Cross-modal Hashing (DUDCH) method based on knowledge distillation was proposed. Firstly, following the idea of knowledge transfer in knowledge distillation, the latent association information of the pre-trained unsupervised teacher model was used to reconstruct a symmetric similarity matrix, replacing manual labels in the training of the supervised student model. Secondly, Discrete Cyclic Coordinate descent (DCC) was adopted to update the discrete hash codes iteratively, thereby reducing the quantization loss between the real-valued hash codes learned by the neural network and the discrete hash codes. Finally, an end-to-end neural network was adopted as the teacher model and an asymmetric neural network was constructed as the student model, reducing the time complexity of the combined model. Experimental results on two commonly used benchmark datasets, MIRFLICKR-25K and NUS-WIDE, show that compared with Deep Joint-Semantics Reconstructing Hashing (DJSRH), the proposed method improves the mean Average Precision (mAP) on image-to-text/text-to-image tasks by an average of 2.83/0.70 percentage points on MIRFLICKR-25K and 6.53/3.95 percentage points on NUS-WIDE, proving its effectiveness in large-scale cross-modal retrieval.
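
The following is a minimal sketch of the two core ideas described in the abstract: reconstructing a symmetric pseudo-similarity matrix from teacher features, and updating discrete hash codes bit by bit with DCC instead of relaxing the binary constraint. The cosine-similarity fusion rule, the objective used for the DCC step, and all function and variable names are assumptions for illustration only; the paper's exact formulations are not given in the abstract.

```python
import numpy as np

def reconstruct_similarity(img_feat, txt_feat):
    """Build a symmetric pseudo-similarity matrix from pre-trained
    (teacher) deep features, as a stand-in for manual labels.

    Assumption: cosine similarities of L2-normalised features,
    averaged across the two modalities and symmetrised.
    """
    img = img_feat / np.linalg.norm(img_feat, axis=1, keepdims=True)
    txt = txt_feat / np.linalg.norm(txt_feat, axis=1, keepdims=True)
    s_img = img @ img.T            # intra-modal affinity (images)
    s_txt = txt @ txt.T            # intra-modal affinity (texts)
    s = 0.5 * (s_img + s_txt)      # fuse the two modalities
    return 0.5 * (s + s.T)         # enforce symmetry

def dcc_update(B, F, Y, P, lam=1.0, n_sweeps=3):
    """Discrete Cyclic Coordinate descent (DCC): update the binary code
    matrix B (n x r, entries in {-1, +1}) one bit-column at a time with
    a closed-form sign update, keeping the discrete constraint.

    Illustrative objective: min_B ||Y - B @ P||^2 + lam * ||B - F||^2,
    where F holds the real-valued codes output by the network and Y is
    the (pseudo-)supervision, e.g. derived from the teacher similarity.
    """
    Q = Y @ P.T + lam * F          # n x r
    n, r = B.shape
    for _ in range(n_sweeps):
        for l in range(r):         # cycle over the r bits
            idx = [j for j in range(r) if j != l]
            B_rest = B[:, idx]                 # n x (r-1)
            P_rest = P[idx, :]                 # (r-1) x c
            p_l = P[l, :]                      # c
            b_l = np.sign(Q[:, l] - B_rest @ (P_rest @ p_l))
            b_l[b_l == 0] = 1                  # avoid zero entries
            B[:, l] = b_l
    return B
```

In this sketch the per-bit update has a closed form, so no continuous relaxation of B is needed, which is what keeps the quantization loss between the real-valued and discrete codes small.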