DSCEH: Dual-Stream Correlation-Enhanced Deep Hashing for Image Retrieval

Author

Listed:
  • Yulin Yang

    (School of Computer Science and Engineering, Central South University, Changsha 410083, China
    These authors contributed equally to this work.)

  • Huizhen Chen

    (School of Computer Science and Engineering, Central South University, Changsha 410083, China
    These authors contributed equally to this work.)

  • Rongkai Liu

    (School of Computer Science and Engineering, Central South University, Changsha 410083, China
    These authors contributed equally to this work.)

  • Shuning Liu

    (School of Computer Science and Engineering, Central South University, Changsha 410083, China
    These authors contributed equally to this work.)

  • Yu Zhan

    (China Telecom, Changsha 410083, China
    These authors contributed equally to this work.)

  • Chao Hu

    (School of Electronic Information, Central South University, Changsha 410083, China
    These authors contributed equally to this work.)

  • Ronghua Shi

    (School of Electronic Information, Central South University, Changsha 410083, China
    These authors contributed equally to this work.)

Abstract

Deep hashing is widely used in large-scale image retrieval to speed up the retrieval process. Current deep hashing methods are mainly based on Convolutional Neural Networks (CNNs) or Vision Transformers (ViTs): they map images to low-dimensional codes using only local or only global features, and they optimize the correlation between images only through similarity losses over pairs or triplets. The effectiveness of such methods is therefore limited. In this paper, we propose a Dual-Stream Correlation-Enhanced deep Hashing framework (DSCEH), which uses both the local and global features of an image for low-dimensional mapping and optimizes image correlation at the level of the model architecture. DSCEH consists of two main steps: model training and deep-hash-based retrieval. In the training phase, a dual-network structure comprising a CNN and a ViT is employed for feature extraction. Feature fusion is then achieved through a concatenation operation, followed by a similarity evaluation based on the class token produced by the ViT to establish edge relationships between images. A Graph Convolutional Network (GCN) is then used to strengthen correlation optimization across images, yielding high-quality hash codes; this stage produces the optimized hash model used for retrieval. In the retrieval stage, all database images and the query images are first mapped to hash codes with this model, and the retrieval results are then ranked by the Hamming distance between the hash codes. We conduct experiments on three datasets: CIFAR-10, MSCOCO, and NUS-WIDE. Experimental results show the superior performance of DSCEH, which enables fast and accurate image retrieval.
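The following PyTorch sketch illustrates the training-time architecture the abstract describes: a CNN stream for local features, a transformer stream whose class token carries global features, concatenation fusion, a similarity-based adjacency over the batch, and one graph-convolution step before the hash projection. All layer sizes, the cosine-similarity adjacency, and the single-layer GCN update are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of a dual-stream correlation-enhanced hash model.
# Backbones, dimensions, and the GCN step are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DualStreamHashNet(nn.Module):
    def __init__(self, dim=128, n_bits=64):
        super().__init__()
        # CNN stream: stands in for the paper's CNN backbone (local features).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Transformer stream: stands in for the ViT backbone; its class token
        # summarizes global image content.
        self.patch = nn.Conv2d(3, dim, kernel_size=16, stride=16)
        self.cls = nn.Parameter(torch.zeros(1, 1, dim))
        enc = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.vit = nn.TransformerEncoder(enc, num_layers=2)
        # One graph-convolution weight; the hash head maps fused features to bits.
        self.gcn_w = nn.Linear(2 * dim, 2 * dim)
        self.hash = nn.Linear(2 * dim, n_bits)

    def forward(self, x):
        local = self.cnn(x)                                 # (B, dim) local features
        tokens = self.patch(x).flatten(2).transpose(1, 2)   # (B, N, dim) patch tokens
        tokens = torch.cat([self.cls.expand(x.size(0), -1, -1), tokens], dim=1)
        cls_tok = self.vit(tokens)[:, 0]                    # (B, dim) class token
        fused = torch.cat([local, cls_tok], dim=1)          # concatenation fusion
        # Edge weights from class-token cosine similarity, then one GCN step
        # (row-normalized A @ X @ W) to propagate correlations across the batch.
        a = F.normalize(cls_tok, dim=1) @ F.normalize(cls_tok, dim=1).t()
        a_hat = a / a.sum(dim=1, keepdim=True).clamp_min(1e-8)
        refined = F.relu(self.gcn_w(a_hat @ fused))
        return torch.tanh(self.hash(refined))               # relaxed hash codes

codes = DualStreamHashNet()(torch.randn(8, 3, 64, 64))
binary = torch.sign(codes)  # binarized codes for storage and retrieval
```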
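The retrieval stage is straightforward once codes are binary. A small sketch under the same assumptions: with codes in {-1, +1}, the Hamming distance follows directly from the inner product, so ranking the database reduces to one matrix-vector product.

```python
# Hedged sketch of Hamming-distance retrieval over binarized hash codes.
import torch

def hamming_rank(query_code: torch.Tensor, db_codes: torch.Tensor) -> torch.Tensor:
    """query_code: (n_bits,) in {-1, +1}; db_codes: (N, n_bits) in {-1, +1}.
    For +-1 codes, Hamming distance = (n_bits - <q, d>) / 2."""
    dists = (db_codes.size(1) - db_codes @ query_code) / 2
    return torch.argsort(dists)  # database indices, nearest first

# Usage with random +-1 codes standing in for model outputs.
db = torch.sign(torch.randn(1000, 64))
q = torch.sign(torch.randn(64))
top10 = hamming_rank(q, db)[:10]
```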

Suggested Citation

  • Yulin Yang & Huizhen Chen & Rongkai Liu & Shuning Liu & Yu Zhan & Chao Hu & Ronghua Shi, 2024. "DSCEH: Dual-Stream Correlation-Enhanced Deep Hashing for Image Retrieval," Mathematics, MDPI, vol. 12(14), pages 1-16, July.
  • Handle: RePEc:gam:jmathe:v:12:y:2024:i:14:p:2221-:d:1436306
    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/12/14/2221/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/12/14/2221/
    Download Restriction: no
