
A comparative study of 13 deep reinforcement learning based energy management methods for a hybrid electric vehicle

Author

Listed:
  • Wang, Hanchen
  • Ye, Yiming
  • Zhang, Jiangfeng
  • Xu, Bin

Abstract

The energy management strategy (EMS) has a major impact on the energy efficiency of hybrid electric vehicles (HEVs). Recently, a fast-growing number of studies have applied different deep reinforcement learning (DRL) based EMSs to HEVs. However, a unified performance benchmark covering the most popular DRL algorithms is still lacking. In this study, 13 popular DRL algorithms are applied as HEV EMSs. The reward performance, computation cost, and learning convergence of the different DRL algorithms are discussed. In addition, the HEV environments are modified to fit both discrete and continuous action spaces. The results show that agent learning is more stable in the continuous action space than in the discrete action space. In the continuous action space, SAC achieves the highest reward and PPO has the lowest time cost; in the discrete action space, DQN has the lowest time cost and FQF achieves the highest reward. The comparison among SAC, FQF, rule-based, and equivalent consumption minimization strategies (ECMS) shows that the DRL EMSs run the engine more efficiently, thus reducing fuel consumption. The fuel consumption of FQF is 10.26% and 5.34% lower than that of the rule-based strategy and ECMS, respectively. The contribution of this paper is expected to speed up the adoption of DRL algorithms in HEV EMS applications.
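To make the setup concrete, the sketch below is a minimal, hypothetical Python illustration of the kind of environment such DRL EMSs are trained on: the state holds the battery state of charge (SOC), the current power demand, and the elapsed time; the action is the fraction of demand supplied by the engine; and the reward is the negative fuel use plus a charge-sustaining SOC penalty. The ToyHEVEnv class, its dynamics, coefficients, and drive cycle are placeholders invented for illustration, not the authors' vehicle model or code; a trained agent (SAC, PPO, DQN, FQF, ...) would replace the random policy in the rollout.

```python
# Illustrative sketch only: a toy charge-sustaining HEV energy-management
# environment. All dynamics, coefficients, and the drive cycle are made up.
import numpy as np

class ToyHEVEnv:
    """Minimal HEV EMS environment: SOC/demand/time state, engine-split action."""

    def __init__(self, cycle_length=200, battery_kwh=1.5, seed=0):
        self.rng = np.random.default_rng(seed)
        self.cycle_length = cycle_length      # number of 1-second steps
        self.battery_kwh = battery_kwh        # usable battery energy (hypothetical)
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = 0.6                        # battery state of charge in [0, 1]
        # Hypothetical drive cycle: smoothed random power demand in kW
        raw = np.clip(self.rng.normal(15.0, 10.0, self.cycle_length), 0.0, 60.0)
        self.demand = np.convolve(raw, np.ones(5) / 5.0, mode="same")
        return self._obs()

    def _obs(self):
        idx = min(self.t, self.cycle_length - 1)   # clamp at the terminal step
        return np.array([self.soc, self.demand[idx] / 60.0, self.t / self.cycle_length])

    def step(self, action):
        """action in [0, 1]: fraction of the power demand supplied by the engine."""
        action = float(np.clip(action, 0.0, 1.0))
        p_dem = self.demand[self.t]           # kW requested at the wheels
        p_eng = action * p_dem                # engine share
        p_bat = p_dem - p_eng                 # battery covers the rest
        # Battery energy balance over 1 s (kW * s -> kWh)
        self.soc = float(np.clip(self.soc - (p_bat / 3600.0) / self.battery_kwh, 0.0, 1.0))
        # Toy fuel model: idle cost plus a power-proportional term (units made up)
        fuel = (0.05 + 0.02 * p_eng) if p_eng > 0.0 else 0.0
        # Reward: negative fuel use plus a charge-sustaining penalty around SOC = 0.6
        reward = -fuel - 2.0 * (self.soc - 0.6) ** 2
        self.t += 1
        done = self.t >= self.cycle_length
        return self._obs(), reward, done


# Random-policy rollout as a stand-in for a trained DRL agent
env = ToyHEVEnv()
obs, episode_return, done = env.reset(), 0.0, False
while not done:
    obs, r, done = env.step(np.random.uniform(0.0, 1.0))
    episode_return += r
print(f"episode return: {episode_return:.2f}")
```

For the discrete-action algorithms compared in the paper (e.g., DQN or FQF), the same kind of environment would simply quantize the engine-power fraction into a fixed set of levels instead of accepting a continuous value.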

Suggested Citation

  • Wang, Hanchen & Ye, Yiming & Zhang, Jiangfeng & Xu, Bin, 2023. "A comparative study of 13 deep reinforcement learning based energy management methods for a hybrid electric vehicle," Energy, Elsevier, vol. 266(C).
  • Handle: RePEc:eee:energy:v:266:y:2023:i:c:s0360544222033837
    DOI: 10.1016/j.energy.2022.126497

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544222033837
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2022.126497?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Du, Guodong & Zou, Yuan & Zhang, Xudong & Liu, Teng & Wu, Jinlong & He, Dingbo, 2020. "Deep reinforcement learning based energy management for a hybrid electric vehicle," Energy, Elsevier, vol. 201(C).
    2. Zou, Yuan & Liu, Teng & Liu, Dexing & Sun, Fengchun, 2016. "Reinforcement learning-based real-time energy management for a hybrid tracked vehicle," Applied Energy, Elsevier, vol. 171(C), pages 372-382.
    3. Pérez, Laura V. & Bossio, Guillermo R. & Moitre, Diego & García, Guillermo O., 2006. "Optimization of power management in an hybrid electric vehicle using dynamic programming," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 73(1), pages 244-254.
    4. Xiong, Rui & Cao, Jiayi & Yu, Quanqing, 2018. "Reinforcement learning-based real-time power management for hybrid energy storage system in the plug-in hybrid electric vehicle," Applied Energy, Elsevier, vol. 211(C), pages 538-548.
    5. Du, Guodong & Zou, Yuan & Zhang, Xudong & Kong, Zehui & Wu, Jinlong & He, Dingbo, 2019. "Intelligent energy management for hybrid electric tracked vehicles using online reinforcement learning," Applied Energy, Elsevier, vol. 251(C), pages 1-1.
    6. Yue Hu & Weimin Li & Hui Xu & Guoqing Xu, 2015. "An Online Learning Control Strategy for Hybrid Electric Vehicle Based on Fuzzy Q-Learning," Energies, MDPI, vol. 8(10), pages 1-20, October.
    7. Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charle, 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
    8. Jinquan, Guo & Hongwen, He & Jiankun, Peng & Nana, Zhou, 2019. "A novel MPC-based adaptive energy management strategy in plug-in hybrid electric vehicles," Energy, Elsevier, vol. 175(C), pages 378-392.
    9. Al-Alawi, Baha M. & Bradley, Thomas H., 2013. "Review of hybrid, plug-in hybrid, and electric vehicle market modeling Studies," Renewable and Sustainable Energy Reviews, Elsevier, vol. 21(C), pages 190-203.
    10. Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
    11. Zhou, Jianhao & Xue, Siwu & Xue, Yuan & Liao, Yuhui & Liu, Jun & Zhao, Wanzhong, 2021. "A novel energy management strategy of hybrid electric vehicle via an improved TD3 deep reinforcement learning," Energy, Elsevier, vol. 224(C).
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Peng, Jiankun & Shen, Yang & Wu, ChangCheng & Wang, Chunhai & Yi, Fengyan & Ma, Chunye, 2023. "Research on energy-saving driving control of hydrogen fuel bus based on deep reinforcement learning in freeway ramp weaving area," Energy, Elsevier, vol. 285(C).
    2. Liang, Zhaowen & Ruan, Jiageng & Wang, Zhenpo & Liu, Kai & Li, Bin, 2024. "Soft actor-critic-based EMS design for dual motor battery electric bus," Energy, Elsevier, vol. 288(C).
    3. Wilberforce, Tabbi & Anser, Afaaq & Swamy, Jangam Aishwarya & Opoku, Richard, 2023. "An investigation into hybrid energy storage system control and power distribution for hybrid electric vehicles," Energy, Elsevier, vol. 279(C).
    4. Nafiseh Mazaheri & Daniel Santamargarita & Emilio Bueno & Daniel Pizarro & Santiago Cobreces, 2024. "A Deep Reinforcement Learning Approach to DC-DC Power Electronic Converter Control with Practical Considerations," Energies, MDPI, vol. 17(14), pages 1-22, July.
    5. Angel Recalde & Ricardo Cajo & Washington Velasquez & Manuel S. Alvarez-Alvarado, 2024. "Machine Learning and Optimization in Energy Management Systems for Plug-In Hybrid Electric Vehicles: A Comprehensive Review," Energies, MDPI, vol. 17(13), pages 1-39, June.
    6. He, Hongwen & Su, Qicong & Huang, Ruchen & Niu, Zegong, 2024. "Enabling intelligent transferable energy management of series hybrid electric tracked vehicle across motion dimensions via soft actor-critic algorithm," Energy, Elsevier, vol. 294(C).
    7. Alessia Musa & Pier Giuseppe Anselma & Giovanni Belingardi & Daniela Anna Misul, 2023. "Energy Management in Hybrid Electric Vehicles: A Q-Learning Solution for Enhanced Drivability and Energy Efficiency," Energies, MDPI, vol. 17(1), pages 1-20, December.
    8. Zhou, Yujie & Huang, Yin & Mao, Xuping & Kang, Zehao & Huang, Xuejin & Xuan, Dongji, 2024. "Research on energy management strategy of fuel cell hybrid power via an improved TD3 deep reinforcement learning," Energy, Elsevier, vol. 293(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    2. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    3. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    4. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    5. Xu, Bin & Rathod, Dhruvang & Zhang, Darui & Yebi, Adamu & Zhang, Xueyu & Li, Xiaoya & Filipi, Zoran, 2020. "Parametric study on reinforcement learning optimized energy management strategy for a hybrid electric vehicle," Applied Energy, Elsevier, vol. 259(C).
    6. Wang, Hanchen & Arjmandzadeh, Ziba & Ye, Yiming & Zhang, Jiangfeng & Xu, Bin, 2024. "FlexNet: A warm start method for deep reinforcement learning in hybrid electric vehicle energy management applications," Energy, Elsevier, vol. 288(C).
    7. Yang, Ningkang & Han, Lijin & Xiang, Changle & Liu, Hui & Li, Xunmin, 2021. "An indirect reinforcement learning based real-time energy management strategy via high-order Markov Chain model for a hybrid electric vehicle," Energy, Elsevier, vol. 236(C).
    8. Zhou, Jianhao & Xue, Siwu & Xue, Yuan & Liao, Yuhui & Liu, Jun & Zhao, Wanzhong, 2021. "A novel energy management strategy of hybrid electric vehicle via an improved TD3 deep reinforcement learning," Energy, Elsevier, vol. 224(C).
    9. Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    10. Ye, Yiming & Wang, Hanchen & Xu, Bin & Zhang, Jiangfeng, 2023. "An imitation learning-based energy management strategy for electric vehicles considering battery aging," Energy, Elsevier, vol. 283(C).
    11. Du, Guodong & Zou, Yuan & Zhang, Xudong & Liu, Teng & Wu, Jinlong & He, Dingbo, 2020. "Deep reinforcement learning based energy management for a hybrid electric vehicle," Energy, Elsevier, vol. 201(C).
    12. Shi, Wenzhuo & Huangfu, Yigeng & Xu, Liangcai & Pang, Shengzhao, 2022. "Online energy management strategy considering fuel cell fault for multi-stack fuel cell hybrid vehicle based on multi-agent reinforcement learning," Applied Energy, Elsevier, vol. 328(C).
    13. Hu, Dong & Xie, Hui & Song, Kang & Zhang, Yuanyuan & Yan, Long, 2023. "An apprenticeship-reinforcement learning scheme based on expert demonstrations for energy management strategy of hybrid electric vehicles," Applied Energy, Elsevier, vol. 342(C).
    14. Baodi Zhang & Sheng Guo & Xin Zhang & Qicheng Xue & Lan Teng, 2020. "Adaptive Smoothing Power Following Control Strategy Based on an Optimal Efficiency Map for a Hybrid Electric Tracked Vehicle," Energies, MDPI, vol. 13(8), pages 1-25, April.
    15. Wei, Hongqian & Zhang, Nan & Liang, Jun & Ai, Qiang & Zhao, Wenqiang & Huang, Tianyi & Zhang, Youtong, 2022. "Deep reinforcement learning based direct torque control strategy for distributed drive electric vehicles considering active safety and energy saving performance," Energy, Elsevier, vol. 238(PB).
    16. Tang, Xiaolin & Zhou, Haitao & Wang, Feng & Wang, Weida & Lin, Xianke, 2022. "Longevity-conscious energy management strategy of fuel cell hybrid electric Vehicle Based on deep reinforcement learning," Energy, Elsevier, vol. 238(PA).
    17. Tran, Dai-Duong & Vafaeipour, Majid & El Baghdadi, Mohamed & Barrero, Ricardo & Van Mierlo, Joeri & Hegazy, Omar, 2020. "Thorough state-of-the-art analysis of electric and hybrid vehicle powertrains: Topologies and integrated energy management strategies," Renewable and Sustainable Energy Reviews, Elsevier, vol. 119(C).
    18. Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
    19. Wang, Yue & Li, Keqiang & Zeng, Xiaohua & Gao, Bolin & Hong, Jichao, 2023. "Investigation of novel intelligent energy management strategies for connected HEB considering global planning of fixed-route information," Energy, Elsevier, vol. 263(PB).
    20. Liu, Yonggang & Wu, Yitao & Wang, Xiangyu & Li, Liang & Zhang, Yuanjian & Chen, Zheng, 2023. "Energy management for hybrid electric vehicles based on imitation reinforcement learning," Energy, Elsevier, vol. 263(PC).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:266:y:2023:i:c:s0360544222033837. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.