Printed from https://ideas.repec.org/a/spr/annopr/v339y2024i1d10.1007_s10479-022-04788-z.html

An improved transformer model with multi-head attention and attention to attention for low-carbon multi-depot vehicle routing problem

Authors
  • Yang Zou (Nanjing University of Aeronautics and Astronautics)
  • Hecheng Wu (Nanjing University of Aeronautics and Astronautics)
  • Yunqiang Yin (University of Electronic Science and Technology of China)
  • Lalitha Dhamotharan (University of Exeter)
  • Daqiang Chen (Zhejiang Gongshang University)
  • Aviral Kumar Tiwari (Rajagiri Business School (RBS))

Abstract

Low-carbon logistics is an emerging, sustainability-oriented industry in the era of the low-carbon economy. End-to-end deep reinforcement learning (DRL) methods with an encoder-decoder framework have proven effective for solving logistics problems. In most cases, however, the encoders and decoders rely on recurrent neural networks (RNNs) and attention mechanisms, which may suffer from the long-range dependency problem and neglect the correlation between query vectors. To address this problem, we propose an improved transformer model (TAOA) with both a multi-head attention mechanism (MHA) and an attention-on-attention mechanism (AOA), and apply it to the low-carbon multi-depot vehicle routing problem (MDVRP). In this model, the MHA and AOA are used in the encoder and decoder to compute the selection probabilities of route nodes. The MHA processes different parts of the input sequence in parallel, while the AOA addresses the weak correlation between query results and query vectors in the MHA. An actor-critic framework based on policy gradient is constructed to train the model parameters, and a 2-opt operator is further applied to refine the resulting routes. Finally, extensive numerical studies verify the effectiveness and computational efficiency of the proposed TAOA; the results show that TAOA outperforms the traditional transformer model (Kools), a genetic algorithm (GA), and Google OR-Tools (Ortools) in solving the MDVRP.
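To make the mechanisms in the abstract concrete, here is a minimal PyTorch sketch (not the authors' code; the module name AOABlock, the dimensions, and the layer layout are illustrative assumptions) of an attention-on-attention step layered on multi-head attention: the attended vector is concatenated with the query, and an "information" projection is gated elementwise by a sigmoid "attention gate".

    import torch
    import torch.nn as nn

    class AOABlock(nn.Module):
        """Multi-head attention followed by attention-on-attention gating.

        A hypothetical sketch: the exact TAOA encoder/decoder layout is not
        reproduced here, only the MHA + AOA combination the abstract describes.
        """
        def __init__(self, d_model: int = 128, n_heads: int = 8):
            super().__init__()
            self.mha = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            # Both projections act on the concatenation [attended ; query].
            self.information = nn.Linear(2 * d_model, d_model)
            self.gate = nn.Linear(2 * d_model, d_model)

        def forward(self, query: torch.Tensor, nodes: torch.Tensor) -> torch.Tensor:
            # Standard multi-head attention over the route-node embeddings.
            attended, _ = self.mha(query, nodes, nodes)
            cat = torch.cat([attended, query], dim=-1)
            # AOA gating: re-weight the attention result by its relevance to
            # the query, mitigating the query/result correlation deficiency.
            return torch.sigmoid(self.gate(cat)) * self.information(cat)

    # Toy usage: batch of 4 instances, 20 candidate nodes, 128-dim embeddings.
    block = AOABlock()
    q = torch.randn(4, 1, 128)   # decoder query (current route context)
    h = torch.randn(4, 20, 128)  # encoder node embeddings
    print(block(q, h).shape)     # torch.Size([4, 1, 128])

The 2-opt refinement the abstract mentions can be sketched just as briefly; here dist is an assumed pairwise-distance function and route is a list of node indices with fixed endpoints:

    def two_opt(route, dist):
        """Reverse route segments while any reversal shortens the tour."""
        improved = True
        while improved:
            improved = False
            for i in range(1, len(route) - 2):
                for j in range(i + 1, len(route) - 1):
                    # Gain from swapping edges (i-1, i) and (j, j+1) for
                    # (i-1, j) and (i, j+1), i.e. reversing route[i:j+1].
                    delta = (dist(route[i - 1], route[j])
                             + dist(route[i], route[j + 1])
                             - dist(route[i - 1], route[i])
                             - dist(route[j], route[j + 1]))
                    if delta < -1e-9:
                        route[i:j + 1] = reversed(route[i:j + 1])
                        improved = True
        return route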

Suggested Citation

  • Yang Zou & Hecheng Wu & Yunqiang Yin & Lalitha Dhamotharan & Daqiang Chen & Aviral Kumar Tiwari, 2024. "An improved transformer model with multi-head attention and attention to attention for low-carbon multi-depot vehicle routing problem," Annals of Operations Research, Springer, vol. 339(1), pages 517-536, August.
  • Handle: RePEc:spr:annopr:v:339:y:2024:i:1:d:10.1007_s10479-022-04788-z
    DOI: 10.1007/s10479-022-04788-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10479-022-04788-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10479-022-04788-z?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:annopr:v:339:y:2024:i:1:d:10.1007_s10479-022-04788-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.