Printed from https://ideas.repec.org/a/eee/appene/v251y2019ic67.html

Intelligent energy management for hybrid electric tracked vehicles using online reinforcement learning

Author

Listed:
  • Du, Guodong
  • Zou, Yuan
  • Zhang, Xudong
  • Kong, Zehui
  • Wu, Jinlong
  • He, Dingbo

Abstract

Energy management for hybrid electric vehicles has the potential to mitigate the growing energy crisis and environmental pollution by reducing fuel consumption. This paper proposes an online updating energy management strategy to improve the fuel economy of hybrid electric tracked vehicles. As the basis of the research, an overall model of the hybrid electric tracked vehicle is built in detail and validated against field experiments. To accelerate the convergence of the control policy calculation, a novel reinforcement learning algorithm called fast Q-learning is applied, which improves computational speed by 16%. Cloud computing is introduced to carry the main computational burden, enabling the online updating energy management strategy on a hardware-in-the-loop simulation bench. A Kullback-Leibler divergence rate is designed to trigger updates of the control strategy and is implemented on the same bench. The simulation results show that the fuel consumption of the fast-Q-learning-based online updating strategy is 4.6% lower than that of the stationary strategy, and is close to that of the dynamic programming strategy. Moreover, the computation time of the proposed method is only 1.35 s, much shorter than that of the dynamic-programming-based method. The results indicate that the proposed energy management strategy can greatly improve fuel economy and shows potential for real-time application. Its adaptability is further validated on three realistic driving schedules.
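The abstract describes two ingredients: a tabular Q-learning backup for the control policy, and a Kullback-Leibler divergence measure on the driving-condition statistics that triggers a policy update when conditions drift. A minimal sketch of that pattern is shown below; the state/action sizes, learning rate, and threshold are illustrative assumptions, not the paper's values, and the plain Q-learning backup stands in for the paper's accelerated "fast Q-learning" variant.

```python
import numpy as np

# Illustrative hyperparameters (assumed, not from the paper)
N_STATES, N_ACTIONS = 4, 2
ALPHA, GAMMA = 0.1, 0.95      # learning rate, discount factor
KL_THRESHOLD = 0.05           # drift level that triggers a policy update

def kl_divergence(p, q, eps=1e-9):
    """KL(p || q) between two discrete transition distributions."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def q_update(Q, s, a, r, s_next):
    """One tabular Q-learning backup toward the TD target."""
    td_target = r + GAMMA * Q[s_next].max()
    Q[s, a] += ALPHA * (td_target - Q[s, a])
    return Q

def should_update_policy(old_counts, new_counts):
    """Trigger a (cloud-side) policy recomputation when the empirical
    transition distribution drifts beyond the KL threshold."""
    return kl_divergence(new_counts, old_counts) > KL_THRESHOLD
```

In this arrangement the onboard controller only executes the current policy and accumulates transition counts; when `should_update_policy` fires, the heavier Q-learning iteration would run on the cloud side, matching the division of labor the abstract describes.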

Suggested Citation

  • Du, Guodong & Zou, Yuan & Zhang, Xudong & Kong, Zehui & Wu, Jinlong & He, Dingbo, 2019. "Intelligent energy management for hybrid electric tracked vehicles using online reinforcement learning," Applied Energy, Elsevier, vol. 251(C), pages 1-1.
  • Handle: RePEc:eee:appene:v:251:y:2019:i:c:67
    DOI: 10.1016/j.apenergy.2019.113388

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261919310621
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2019.113388?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Pandian Vasant & Utku Kose & Junzo Watada, 2017. "Metaheuristic Techniques in Enhancing the Efficiency and Performance of Thermo-Electric Cooling Devices," Energies, MDPI, vol. 10(11), pages 1-50, October.
    2. Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
    3. Trovão, João P. & Pereirinha, Paulo G. & Jorge, Humberto M. & Antunes, Carlos Henggeler, 2013. "A multi-level energy management system for multi-source electric vehicles – An integrated rule-based meta-heuristic approach," Applied Energy, Elsevier, vol. 105(C), pages 304-318.
    4. M. Sabri, M.F. & Danapalasingam, K.A. & Rahmat, M.F., 2016. "A review on hybrid electric vehicles architecture and energy management strategies," Renewable and Sustainable Energy Reviews, Elsevier, vol. 53(C), pages 1433-1442.
    5. Peng, Jiankun & He, Hongwen & Xiong, Rui, 2017. "Rule based energy management strategy for a series–parallel plug-in hybrid electric bus optimized by dynamic programming," Applied Energy, Elsevier, vol. 185(P2), pages 1633-1643.
    6. Zou, Yuan & Liu, Teng & Liu, Dexing & Sun, Fengchun, 2016. "Reinforcement learning-based real-time energy management for a hybrid tracked vehicle," Applied Energy, Elsevier, vol. 171(C), pages 372-382.
    7. M. Hadi Amini & Orkun Karabasoglu, 2018. "Optimal Operation of Interdependent Power Systems and Electrified Transportation Networks," Energies, MDPI, vol. 11(1), pages 1-25, January.
    8. Xiang, Changle & Ding, Feng & Wang, Weida & He, Wei, 2017. "Energy management of a dual-mode power-split hybrid electric vehicle based on velocity prediction and nonlinear model predictive control," Applied Energy, Elsevier, vol. 189(C), pages 640-653.
    9. Yuan Zou & Fengchun Sun & Xiaosong Hu & Lino Guzzella & Huei Peng, 2012. "Combined Optimal Sizing and Control for a Hybrid Tracked Vehicle," Energies, MDPI, vol. 5(11), pages 1-14, November.
    10. Qin, Zhaobo & Luo, Yugong & Zhuang, Weichao & Pan, Ziheng & Li, Keqiang & Peng, Huei, 2018. "Simultaneous optimization of topology, control and size for multi-mode hybrid tracked vehicles," Applied Energy, Elsevier, vol. 212(C), pages 1627-1641.
    11. Zeyu Chen & Rui Xiong & Kunyu Wang & Bin Jiao, 2015. "Optimal Energy Management Strategy of a Plug-in Hybrid Electric Vehicle Based on a Particle Swarm Optimization Algorithm," Energies, MDPI, vol. 8(5), pages 1-18, April.
    12. Tang, Xiaolin & Zhang, Dejiu & Liu, Teng & Khajepour, Amir & Yu, Haisheng & Wang, Hong, 2019. "Research on the energy control of a dual-motor hybrid vehicle during engine start-stop process," Energy, Elsevier, vol. 166(C), pages 1181-1193.
    13. Liu, Teng & Wang, Bo & Yang, Chenglang, 2018. "Online Markov Chain-based energy management for a hybrid tracked vehicle with speedy Q-learning," Energy, Elsevier, vol. 160(C), pages 544-555.
    14. Zehui Kong & Yuan Zou & Teng Liu, 2017. "Implementation of real-time energy management strategy based on reinforcement learning for hybrid electric vehicles and simulation validation," PLOS ONE, Public Library of Science, vol. 12(7), pages 1-16, July.
    15. Teng Liu & Yuan Zou & Dexing Liu & Fengchun Sun, 2015. "Reinforcement Learning–Based Energy Management Strategy for a Hybrid Electric Tracked Vehicle," Energies, MDPI, vol. 8(7), pages 1-18, July.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    2. Du, Guodong & Zou, Yuan & Zhang, Xudong & Liu, Teng & Wu, Jinlong & He, Dingbo, 2020. "Deep reinforcement learning based energy management for a hybrid electric vehicle," Energy, Elsevier, vol. 201(C).
    3. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    4. Du, Guodong & Zou, Yuan & Zhang, Xudong & Guo, Lingxiong & Guo, Ningyuan, 2022. "Energy management for a hybrid electric vehicle based on prioritized deep reinforcement learning framework," Energy, Elsevier, vol. 241(C).
    5. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    6. Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Da wang, & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).
    7. Zhang, Yahui & Wang, Zimeng & Tian, Yang & Wang, Zhong & Kang, Mingxin & Xie, Fangxi & Wen, Guilin, 2024. "Pre-optimization-assisted deep reinforcement learning-based energy management strategy for a series–parallel hybrid electric truck," Energy, Elsevier, vol. 302(C).
    8. Chen, Zheng & Hu, Hengjie & Wu, Yitao & Zhang, Yuanjian & Li, Guang & Liu, Yonggang, 2020. "Stochastic model predictive control for energy management of power-split plug-in hybrid electric vehicles based on reinforcement learning," Energy, Elsevier, vol. 211(C).
    9. Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
    10. Liu, Yonggang & Liu, Junjun & Zhang, Yuanjian & Wu, Yitao & Chen, Zheng & Ye, Ming, 2020. "Rule learning based energy management strategy of fuel cell hybrid vehicles considering multi-objective optimization," Energy, Elsevier, vol. 207(C).
    11. Zhou, Jianhao & Xue, Siwu & Xue, Yuan & Liao, Yuhui & Liu, Jun & Zhao, Wanzhong, 2021. "A novel energy management strategy of hybrid electric vehicle via an improved TD3 deep reinforcement learning," Energy, Elsevier, vol. 224(C).
    12. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    13. Shi, Wenzhuo & Huangfu, Yigeng & Xu, Liangcai & Pang, Shengzhao, 2022. "Online energy management strategy considering fuel cell fault for multi-stack fuel cell hybrid vehicle based on multi-agent reinforcement learning," Applied Energy, Elsevier, vol. 328(C).
    14. Huang, Xuejin & Zhang, Jingyi & Ou, Kai & Huang, Yin & Kang, Zehao & Mao, Xuping & Zhou, Yujie & Xuan, Dongji, 2024. "Deep reinforcement learning-based health-conscious energy management for fuel cell hybrid electric vehicles in model predictive control framework," Energy, Elsevier, vol. 304(C).
    15. Hu, Dong & Xie, Hui & Song, Kang & Zhang, Yuanyuan & Yan, Long, 2023. "An apprenticeship-reinforcement learning scheme based on expert demonstrations for energy management strategy of hybrid electric vehicles," Applied Energy, Elsevier, vol. 342(C).
    16. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    17. Yang, Ningkang & Ruan, Shumin & Han, Lijin & Liu, Hui & Guo, Lingxiong & Xiang, Changle, 2023. "Reinforcement learning-based real-time intelligent energy management for hybrid electric vehicles in a model predictive control framework," Energy, Elsevier, vol. 270(C).
    18. Alessia Musa & Pier Giuseppe Anselma & Giovanni Belingardi & Daniela Anna Misul, 2023. "Energy Management in Hybrid Electric Vehicles: A Q-Learning Solution for Enhanced Drivability and Energy Efficiency," Energies, MDPI, vol. 17(1), pages 1-20, December.
    19. Qi, Chunyang & Song, Chuanxue & Xiao, Feng & Song, Shixin, 2022. "Generalization ability of hybrid electric vehicle energy management strategy based on reinforcement learning method," Energy, Elsevier, vol. 250(C).
    20. Han, Xuefeng & He, Hongwen & Wu, Jingda & Peng, Jiankun & Li, Yuecheng, 2019. "Energy management based on reinforcement learning with double deep Q-learning for a hybrid electric tracked vehicle," Applied Energy, Elsevier, vol. 254(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:251:y:2019:i:c:67. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.