
Enabling intelligent transferable energy management of series hybrid electric tracked vehicle across motion dimensions via soft actor-critic algorithm

Author

Listed:
  • He, Hongwen
  • Su, Qicong
  • Huang, Ruchen
  • Niu, Zegong

Abstract

Due to the complex driving conditions faced by hybrid electric tracked vehicles, energy management is crucial for improving fuel economy. However, developing an energy management strategy (EMS) is a time-consuming and labor-intensive task, and a strategy developed for one driving task is difficult to generalize to others. To solve this problem and shorten the development cycle of EMSs, this article proposes a novel transferable energy management framework for a series hybrid electric tracked vehicle (SHETV) across motion dimensions. To fully reuse the knowledge learned in longitudinal motion for combined longitudinal and lateral motion, the framework merges transfer learning (TL) into the state-of-the-art deep reinforcement learning (DRL) algorithm soft actor-critic (SAC) to formulate a novel deep transfer reinforcement learning (DTRL) method, transferring both the neural networks and the pre-trained experience replay buffer. Simulation results indicate that, compared with the baseline EMS, the proposed EMS accelerates convergence by 75.38%, enhances the learning ability by 19.05%, and improves fuel economy by 5.08%. This article contributes to correlating different energy management tasks and to reusing an existing EMS for the rapid development of new EMSs for hybrid electric tracked vehicles.
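The article itself does not include code, but the transfer mechanism summarized above can be illustrated with a minimal sketch. The snippet below, written against PyTorch, assumes that the source agent (pre-trained on the longitudinal-only task) and the target agent (for the combined longitudinal and lateral task) share the same state and action dimensions, so actor and critic weights can be copied directly; all names (make_mlp, SACAgent, transfer) are illustrative and not taken from the paper.

# Minimal sketch of the transfer idea (illustrative names, not the authors' code):
# the target SAC agent is initialized with the actor/critic weights and the
# experience replay buffer of a source agent pre-trained on another motion task.

import copy
from collections import deque

import torch.nn as nn


def make_mlp(in_dim: int, out_dim: int, hidden: int = 256) -> nn.Sequential:
    """Two-layer MLP used for both actor and critics in this sketch."""
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )


class SACAgent:
    """Bare-bones container for the networks and replay buffer of a SAC agent."""

    def __init__(self, state_dim: int, action_dim: int, buffer_size: int = 100_000):
        self.actor = make_mlp(state_dim, 2 * action_dim)    # mean and log-std per action
        self.critic_1 = make_mlp(state_dim + action_dim, 1)
        self.critic_2 = make_mlp(state_dim + action_dim, 1)
        self.replay_buffer = deque(maxlen=buffer_size)       # (s, a, r, s', done) tuples


def transfer(source: SACAgent, target: SACAgent) -> None:
    """Copy pre-trained weights and experiences from the source to the target agent."""
    target.actor.load_state_dict(source.actor.state_dict())
    target.critic_1.load_state_dict(source.critic_1.state_dict())
    target.critic_2.load_state_dict(source.critic_2.state_dict())
    # Seed the target's buffer with the source's transitions so early updates
    # draw on previously learned experience rather than random exploration.
    target.replay_buffer.extend(copy.deepcopy(source.replay_buffer))


if __name__ == "__main__":
    source_agent = SACAgent(state_dim=4, action_dim=2)   # e.g. longitudinal-only task
    target_agent = SACAgent(state_dim=4, action_dim=2)   # longitudinal + lateral task
    transfer(source_agent, target_agent)

Initializing the target agent this way means its first gradient updates are driven by transitions gathered during pre-training rather than purely random exploration, which is consistent with the faster convergence reported in the abstract.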

Suggested Citation

  • He, Hongwen & Su, Qicong & Huang, Ruchen & Niu, Zegong, 2024. "Enabling intelligent transferable energy management of series hybrid electric tracked vehicle across motion dimensions via soft actor-critic algorithm," Energy, Elsevier, vol. 294(C).
  • Handle: RePEc:eee:energy:v:294:y:2024:i:c:s0360544224007059
    DOI: 10.1016/j.energy.2024.130933

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544224007059
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2024.130933?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Wang, Huaqing & Xie, Zhuoshi & Pu, Lei & Ren, Zhongrui & Zhang, Yaoyu & Tan, Zhongfu, 2022. "Energy management strategy of hybrid energy storage based on Pareto optimality," Applied Energy, Elsevier, vol. 327(C).
    2. Wang, Hanchen & Ye, Yiming & Zhang, Jiangfeng & Xu, Bin, 2023. "A comparative study of 13 deep reinforcement learning based energy management methods for a hybrid electric vehicle," Energy, Elsevier, vol. 266(C).
    3. Julian Schrittwieser & Ioannis Antonoglou & Thomas Hubert & Karen Simonyan & Laurent Sifre & Simon Schmitt & Arthur Guez & Edward Lockhart & Demis Hassabis & Thore Graepel & Timothy Lillicrap & David Silver, 2020. "Mastering Atari, Go, chess and shogi by planning with a learned model," Nature, Nature, vol. 588(7839), pages 604-609, December.
    4. Hanchen Wang & Tianfan Fu & Yuanqi Du & Wenhao Gao & Kexin Huang & Ziming Liu & Payal Chandak & Shengchao Liu & Peter Katwyk & Andreea Deac & Anima Anandkumar & Karianne Bergen & Carla P. Gomes & Shir, 2023. "Scientific discovery in the age of artificial intelligence," Nature, Nature, vol. 620(7972), pages 47-60, August.
    5. Hanchen Wang & Tianfan Fu & Yuanqi Du & Wenhao Gao & Kexin Huang & Ziming Liu & Payal Chandak & Shengchao Liu & Peter Katwyk & Andreea Deac & Anima Anandkumar & Karianne Bergen & Carla P. Gomes & Shir, 2023. "Publisher Correction: Scientific discovery in the age of artificial intelligence," Nature, Nature, vol. 621(7978), pages 33-33, September.
    6. Cui, Wei & Cui, Naxin & Li, Tao & Cui, Zhongrui & Du, Yi & Zhang, Chenghui, 2022. "An efficient multi-objective hierarchical energy management strategy for plug-in hybrid electric vehicle in connected scenario," Energy, Elsevier, vol. 257(C).
    7. Sun, Wenjing & Zou, Yuan & Zhang, Xudong & Guo, Ningyuan & Zhang, Bin & Du, Guodong, 2022. "High robustness energy management strategy of hybrid electric vehicle based on improved soft actor-critic deep reinforcement learning," Energy, Elsevier, vol. 258(C).
    8. Shi, Junzhe & Xu, Bin & Shen, Yimin & Wu, Jingbo, 2022. "Energy management strategy for battery/supercapacitor hybrid electric city bus based on driving pattern recognition," Energy, Elsevier, vol. 243(C).
    9. Wang, Kang & Wang, Haixin & Yang, Zihao & Feng, Jiawei & Li, Yanzhen & Yang, Junyou & Chen, Zhe, 2023. "A transfer learning method for electric vehicles charging strategy based on deep reinforcement learning," Applied Energy, Elsevier, vol. 343(C).
    10. Zhou, Jianhao & Xue, Yuan & Xu, Da & Li, Chaoxiong & Zhao, Wanzhong, 2022. "Self-learning energy management strategy for hybrid electric vehicle via curiosity-inspired asynchronous deep reinforcement learning," Energy, Elsevier, vol. 242(C).
    11. Maino, Claudio & Misul, Daniela & Musa, Alessia & Spessa, Ezio, 2021. "Optimal mesh discretization of the dynamic programming for hybrid electric vehicles," Applied Energy, Elsevier, vol. 292(C).
    12. Coraci, Davide & Brandi, Silvio & Hong, Tianzhen & Capozzoli, Alfonso, 2023. "Online transfer learning strategy for enhancing the scalability and deployment of deep reinforcement learning control in smart buildings," Applied Energy, Elsevier, vol. 333(C).
    13. Huang, Ruchen & He, Hongwen & Gao, Miaojue, 2023. "Training-efficient and cost-optimal energy management for fuel cell hybrid electric bus based on a novel distributed deep reinforcement learning framework," Applied Energy, Elsevier, vol. 346(C).
    14. Yang, Ningkang & Han, Lijin & Bo, Lin & Liu, Baoshuai & Chen, Xiuqi & Liu, Hui & Xiang, Changle, 2023. "Real-time adaptive energy management for off-road hybrid electric vehicles based on decision-time planning," Energy, Elsevier, vol. 282(C).
    15. Xu, Bin & Rathod, Dhruvang & Zhang, Darui & Yebi, Adamu & Zhang, Xueyu & Li, Xiaoya & Filipi, Zoran, 2020. "Parametric study on reinforcement learning optimized energy management strategy for a hybrid electric vehicle," Applied Energy, Elsevier, vol. 259(C).
    16. He, Hongwen & Wang, Yunlong & Han, Ruoyan & Han, Mo & Bai, Yunfei & Liu, Qingwu, 2021. "An improved MPC-based energy management strategy for hybrid vehicles using V2V and V2I communications," Energy, Elsevier, vol. 225(C).
    17. Xiao, Boyi & Yang, Weiwei & Wu, Jiamin & Walker, Paul D. & Zhang, Nong, 2022. "Energy management strategy via maximum entropy reinforcement learning for an extended range logistics vehicle," Energy, Elsevier, vol. 253(C).
    18. Wang, Hong & Huang, Yanjun & Khajepour, Amir & Song, Qiang, 2016. "Model predictive control-based energy management strategy for a series hybrid electric tracked vehicle," Applied Energy, Elsevier, vol. 182(C), pages 105-114.
    19. Han, Xuefeng & He, Hongwen & Wu, Jingda & Peng, Jiankun & Li, Yuecheng, 2019. "Energy management based on reinforcement learning with double deep Q-learning for a hybrid electric tracked vehicle," Applied Energy, Elsevier, vol. 254(C).
    20. Huang, Ruchen & He, Hongwen & Zhao, Xuyang & Wang, Yunlong & Li, Menglin, 2022. "Battery health-aware and naturalistic data-driven energy management for hybrid electric bus based on TD3 deep reinforcement learning algorithm," Applied Energy, Elsevier, vol. 321(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Peng, Jiankun & Shen, Yang & Wu, ChangCheng & Wang, Chunhai & Yi, Fengyan & Ma, Chunye, 2023. "Research on energy-saving driving control of hydrogen fuel bus based on deep reinforcement learning in freeway ramp weaving area," Energy, Elsevier, vol. 285(C).
    2. Matteo Acquarone & Claudio Maino & Daniela Misul & Ezio Spessa & Antonio Mastropietro & Luca Sorrentino & Enrico Busto, 2023. "Influence of the Reward Function on the Selection of Reinforcement Learning Agents for Hybrid Electric Vehicles Real-Time Control," Energies, MDPI, vol. 16(6), pages 1-22, March.
    3. Marouane Adnane & Ahmed Khoumsi & João Pedro F. Trovão, 2023. "Efficient Management of Energy Consumption of Electric Vehicles Using Machine Learning—A Systematic and Comprehensive Survey," Energies, MDPI, vol. 16(13), pages 1-39, June.
    4. Alessia Musa & Pier Giuseppe Anselma & Giovanni Belingardi & Daniela Anna Misul, 2023. "Energy Management in Hybrid Electric Vehicles: A Q-Learning Solution for Enhanced Drivability and Energy Efficiency," Energies, MDPI, vol. 17(1), pages 1-20, December.
    5. Baodi Zhang & Sheng Guo & Xin Zhang & Qicheng Xue & Lan Teng, 2020. "Adaptive Smoothing Power Following Control Strategy Based on an Optimal Efficiency Map for a Hybrid Electric Tracked Vehicle," Energies, MDPI, vol. 13(8), pages 1-25, April.
    6. He, Hongwen & Meng, Xiangfei & Wang, Yong & Khajepour, Amir & An, Xiaowen & Wang, Renguang & Sun, Fengchun, 2024. "Deep reinforcement learning based energy management strategies for electrified vehicles: Recent advances and perspectives," Renewable and Sustainable Energy Reviews, Elsevier, vol. 192(C).
    7. Liang, Zhaowen & Ruan, Jiageng & Wang, Zhenpo & Liu, Kai & Li, Bin, 2024. "Soft actor-critic-based EMS design for dual motor battery electric bus," Energy, Elsevier, vol. 288(C).
    8. Sun, Alexander Y., 2020. "Optimal carbon storage reservoir management through deep reinforcement learning," Applied Energy, Elsevier, vol. 278(C).
    9. Yang, Ningkang & Han, Lijin & Xiang, Changle & Liu, Hui & Li, Xunmin, 2021. "An indirect reinforcement learning based real-time energy management strategy via high-order Markov Chain model for a hybrid electric vehicle," Energy, Elsevier, vol. 236(C).
    10. Evangelos Katsamakas & Oleg V. Pavlov & Ryan Saklad, 2024. "Artificial intelligence and the transformation of higher education institutions," Papers 2402.08143, arXiv.org.
    11. Fabian Dvorak & Regina Stumpf & Sebastian Fehrler & Urs Fischbacher, 2024. "Generative AI Triggers Welfare-Reducing Decisions in Humans," Papers 2401.12773, arXiv.org.
    12. Anselma, Pier Giuseppe, 2022. "Computationally efficient evaluation of fuel and electrical energy economy of plug-in hybrid electric vehicles with smooth driving constraints," Applied Energy, Elsevier, vol. 307(C).
    13. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    14. Koehler, Maximilian & Sauermann, Henry, 2024. "Algorithmic management in scientific research," Research Policy, Elsevier, vol. 53(4).
    15. Huang, Ruchen & He, Hongwen & Zhao, Xuyang & Wang, Yunlong & Li, Menglin, 2022. "Battery health-aware and naturalistic data-driven energy management for hybrid electric bus based on TD3 deep reinforcement learning algorithm," Applied Energy, Elsevier, vol. 321(C).
    16. Chen, Ruihu & Yang, Chao & Ma, Yue & Wang, Weida & Wang, Muyao & Du, Xuelong, 2022. "Online learning predictive power coordinated control strategy for off-road hybrid electric vehicles considering the dynamic response of engine generator set," Applied Energy, Elsevier, vol. 323(C).
    17. Mohseni, Morteza, 2023. "Deep learning in bifurcations of particle trajectories," Chaos, Solitons & Fractals, Elsevier, vol. 175(P1).
    18. Xue, Jiaqi & Jiao, Xiaohong & Yu, Danmei & Zhang, Yahui, 2023. "Predictive hierarchical eco-driving control involving speed planning and energy management for connected plug-in hybrid electric vehicles," Energy, Elsevier, vol. 283(C).
    19. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    20. Sani I. Abba & Mohamed A. Yassin & Auwalu Saleh Mubarak & Syed Muzzamil Hussain Shah & Jamilu Usman & Atheer Y. Oudah & Sujay Raghavendra Naganna & Isam H. Aljundi, 2023. "Drinking Water Resources Suitability Assessment Based on Pollution Index of Groundwater Using Improved Explainable Artificial Intelligence," Sustainability, MDPI, vol. 15(21), pages 1-21, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:294:y:2024:i:c:s0360544224007059. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.