
Enhanced Momentum with Momentum Transformers

Authors

  • Max Mason
  • Waasi A Jagirdar
  • David Huang
  • Rahul Murugan

Abstract

The primary objective of this research is to build a Momentum Transformer that is expected to outperform benchmark time-series momentum and mean-reversion trading strategies. We extend the ideas introduced in the paper Trading with the Momentum Transformer: An Intelligent and Interpretable Architecture to individual equities, as the original paper focuses primarily on futures and equity indices. Unlike conventional Long Short-Term Memory (LSTM) models, which operate sequentially and are optimized for processing local patterns, an attention mechanism equips our architecture with direct access to all prior time steps in the training window. This hybrid design, combining attention with an LSTM, enables the model to capture long-term dependencies, enhance performance in scenarios that account for transaction costs, and adapt to evolving market conditions, such as those witnessed during the COVID-19 pandemic. We average 4.14% returns, similar to the original paper's results. Our Sharpe ratio is lower, averaging 1.12, because of much higher volatility, which may stem from individual stocks being inherently more volatile than futures and indices.
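For intuition, the sketch below illustrates the hybrid design the abstract describes: an LSTM encodes the input window sequentially, and a causally masked self-attention layer then gives each step direct access to all prior time steps. This is a minimal illustration under assumed choices, not the authors' code; the class name, feature count, layer sizes, and the Tanh position head are hypothetical, and PyTorch is assumed as the framework.

```python
import torch
import torch.nn as nn


class AttentionLSTM(nn.Module):
    """Illustrative LSTM + self-attention hybrid (assumed, not the paper's exact model)."""

    def __init__(self, n_features: int, hidden_size: int = 64, n_heads: int = 4):
        super().__init__()
        # The LSTM processes the window sequentially, capturing local patterns.
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Self-attention gives each step direct access to all prior steps.
        self.attn = nn.MultiheadAttention(hidden_size, n_heads, batch_first=True)
        # Map the final representation to a bounded long/short position.
        self.head = nn.Sequential(nn.Linear(hidden_size, 1), nn.Tanh())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_features), e.g. lagged returns and trend features.
        h, _ = self.lstm(x)
        # Causal mask: True entries are blocked, so step t attends only to steps <= t.
        seq_len = x.size(1)
        mask = torch.triu(
            torch.ones(seq_len, seq_len, dtype=torch.bool, device=x.device),
            diagonal=1,
        )
        a, _ = self.attn(h, h, h, attn_mask=mask)
        # Size the position from the last step's attended representation.
        return self.head(a[:, -1, :])


# Illustrative usage: 32 windows of 63 trading days with 8 features each.
model = AttentionLSTM(n_features=8)
positions = model(torch.randn(32, 63, 8))  # shape (32, 1), values in [-1, 1]
```

Bounding the output with Tanh mirrors the common practice in time-series momentum models of mapping a learned signal to a long/short position in [-1, 1]; the attention layer is what lets the model weigh distant past observations directly rather than only through the LSTM's recurrent state.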

Suggested Citation

  • Max Mason & Waasi A Jagirdar & David Huang & Rahul Murugan, 2024. "Enhanced Momentum with Momentum Transformers," Papers 2412.12516, arXiv.org.
  • Handle: RePEc:arx:papers:2412.12516

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2412.12516
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Kieran Wood & Sven Giegerich & Stephen Roberts & Stefan Zohren, 2021. "Trading with the Momentum Transformer: An Intelligent and Interpretable Architecture," Papers 2112.08534, arXiv.org, revised Nov 2022.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xingyue Pu & Stephen Roberts & Xiaowen Dong & Stefan Zohren, 2023. "Network Momentum across Asset Classes," Papers 2308.11294, arXiv.org.
    2. Joel Ong & Dorien Herremans, 2024. "DeepUnifiedMom: Unified Time-series Momentum Portfolio Construction via Multi-Task Learning with Multi-Gate Mixture of Experts," Papers 2406.08742, arXiv.org.
    3. Xingyue Pu & Stefan Zohren & Stephen Roberts & Xiaowen Dong, 2023. "Learning to Learn Financial Networks for Optimising Momentum Strategies," Papers 2308.12212, arXiv.org.
    4. Joel Ong & Dorien Herremans, 2023. "Constructing Time-Series Momentum Portfolios with Deep Multi-Task Learning," Papers 2306.13661, arXiv.org.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2412.12516. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.