
Real-time adaptive energy management for off-road hybrid electric vehicles based on decision-time planning

Author

Listed:
  • Yang, Ningkang
  • Han, Lijin
  • Bo, Lin
  • Liu, Baoshuai
  • Chen, Xiuqi
  • Liu, Hui
  • Xiang, Changle

Abstract

The unknown and changeable driving conditions of an off-road hybrid electric vehicle (HEV) challenge its energy management strategy (EMS). To tackle this issue, this paper develops a real-time adaptive strategy for off-road HEVs through decision-time planning (DTP), a distinctive method of model-based reinforcement learning (MBRL). First, the MBRL framework for the energy management problem is established, comprising an RL-oriented model and the DTP algorithm. The RL model consists of a deterministic nonlinear state-space model and a stochastic recursive Markov chain (MC); the latter is constructed online and updated continually from new observations, so it reflects the driving condition precisely. Then, the DTP algorithm is detailed and applied. Instead of learning an overall policy for an entire driving cycle, it seeks the optimal action for each encountered vehicle state, which improves learning efficiency and realizes a real-time adaptive EMS. In simulation, assuming no prior information about the driving conditions, the proposed EMS consumes only about 1–3% more fuel and about 10% more battery life than dynamic programming in both off-road driving conditions and standard road cycles. It also significantly outperforms traditional Q-learning and a rule-based strategy, verifying its optimization capability and adaptability.
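To make the abstract's approach concrete, below is a minimal, hypothetical Python sketch of the two ingredients it describes: a recursive Markov chain over discretized power demand that is updated online from each new observation, and a decision-time planning step that evaluates candidate engine-power actions by short Monte-Carlo rollouts of the learned model, choosing an action only for the state actually encountered. All names, the discretization, the toy fuel/SOC model, and the cost weights are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the paper's code): an online-updated Markov chain
# plus rollout-based decision-time planning for a simplified HEV power split.
import numpy as np

P_LEVELS = np.linspace(0.0, 60.0, 13)   # discretized power demand (kW), hypothetical grid

class RecursiveMarkovChain:
    """Transition counts over discretized power demand, updated after every observation."""
    def __init__(self, n_states, forgetting=0.99):
        self.counts = np.ones((n_states, n_states))  # Laplace prior avoids zero rows
        self.forgetting = forgetting                  # discounts old data so the chain tracks new conditions

    def update(self, prev_idx, curr_idx):
        self.counts *= self.forgetting
        self.counts[prev_idx, curr_idx] += 1.0

    def transition_probs(self, idx):
        row = self.counts[idx]
        return row / row.sum()

def step_model(soc, p_demand, p_engine, dt=1.0):
    """Toy deterministic state-space model: the battery covers the residual power.
    Returns the next SOC and an assumed fuel-rate surrogate (quadratic in engine power)."""
    capacity_kwh = 10.0                               # hypothetical battery capacity
    p_batt = p_demand - p_engine                      # kW drawn from (or pushed into) the battery
    soc_next = np.clip(soc - p_batt * dt / 3600.0 / capacity_kwh, 0.0, 1.0)
    fuel = (0.2 * p_engine + 0.004 * p_engine ** 2) * dt
    return soc_next, fuel

def decision_time_plan(mc, soc, p_idx, actions, horizon=10, n_rollouts=30, soc_ref=0.6):
    """Score each candidate engine-power action by Monte-Carlo rollouts of the
    Markov chain + state-space model; return the action with the lowest expected cost."""
    rng = np.random.default_rng(0)
    best_action, best_cost = None, np.inf
    for a in actions:
        total = 0.0
        for _ in range(n_rollouts):
            s, idx, cost, act = soc, p_idx, 0.0, a
            for _ in range(horizon):
                s, fuel = step_model(s, P_LEVELS[idx], act)
                cost += fuel + 50.0 * (s - soc_ref) ** 2     # fuel + SOC-sustaining penalty (weights assumed)
                idx = rng.choice(len(P_LEVELS), p=mc.transition_probs(idx))
                act = min(a, P_LEVELS[idx])                  # simple heuristic for later steps
            total += cost
        if total / n_rollouts < best_cost:
            best_cost, best_action = total / n_rollouts, a
    return best_action

# Usage: at each control step, update the chain with the newly observed power demand,
# then plan only for the state actually encountered.
mc = RecursiveMarkovChain(len(P_LEVELS))
soc, prev_idx = 0.6, 4
for t in range(5):                                    # stand-in for a driving cycle
    curr_idx = min(prev_idx + 1, len(P_LEVELS) - 1)   # fake observation stream
    mc.update(prev_idx, curr_idx)
    engine_power = decision_time_plan(mc, soc, curr_idx, actions=P_LEVELS[::3])
    soc, _ = step_model(soc, P_LEVELS[curr_idx], engine_power)
    prev_idx = curr_idx
```

The design point the sketch tries to mirror is that no global policy is ever stored: the model (here, the Markov chain) is refreshed continuously, and the planner spends its computation only on the state the vehicle is currently in.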

Suggested Citation

  • Yang, Ningkang & Han, Lijin & Bo, Lin & Liu, Baoshuai & Chen, Xiuqi & Liu, Hui & Xiang, Changle, 2023. "Real-time adaptive energy management for off-road hybrid electric vehicles based on decision-time planning," Energy, Elsevier, vol. 282(C).
  • Handle: RePEc:eee:energy:v:282:y:2023:i:c:s0360544223022260
    DOI: 10.1016/j.energy.2023.128832

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544223022260
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2023.128832?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Xu, Bin & Rathod, Dhruvang & Zhang, Darui & Yebi, Adamu & Zhang, Xueyu & Li, Xiaoya & Filipi, Zoran, 2020. "Parametric study on reinforcement learning optimized energy management strategy for a hybrid electric vehicle," Applied Energy, Elsevier, vol. 259(C).
    2. Zou, Runnan & Fan, Likang & Dong, Yanrui & Zheng, Siyu & Hu, Chenxing, 2021. "DQL energy management: An online-updated algorithm and its application in fix-line hybrid electric vehicle," Energy, Elsevier, vol. 225(C).
    3. Yang, Ningkang & Han, Lijin & Xiang, Changle & Liu, Hui & Li, Xunmin, 2021. "An indirect reinforcement learning based real-time energy management strategy via high-order Markov Chain model for a hybrid electric vehicle," Energy, Elsevier, vol. 236(C).
    4. Zhou, Quan & Li, Ji & Shuai, Bin & Williams, Huw & He, Yinglong & Li, Ziyang & Xu, Hongming & Yan, Fuwu, 2019. "Multi-step reinforcement learning for model-free predictive energy management of an electrified off-highway vehicle," Applied Energy, Elsevier, vol. 255(C).
    5. Du, Guodong & Zou, Yuan & Zhang, Xudong & Kong, Zehui & Wu, Jinlong & He, Dingbo, 2019. "Intelligent energy management for hybrid electric tracked vehicles using online reinforcement learning," Applied Energy, Elsevier, vol. 251(C), pages 1-1.
    6. Yang, Ningkang & Ruan, Shumin & Han, Lijin & Liu, Hui & Guo, Lingxiong & Xiang, Changle, 2023. "Reinforcement learning-based real-time intelligent energy management for hybrid electric vehicles in a model predictive control framework," Energy, Elsevier, vol. 270(C).
    7. Li, Yuecheng & He, Hongwen & Khajepour, Amir & Wang, Hong & Peng, Jiankun, 2019. "Energy management for a power-split hybrid electric bus via deep reinforcement learning with terrain information," Applied Energy, Elsevier, vol. 255(C).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. He, Hongwen & Su, Qicong & Huang, Ruchen & Niu, Zegong, 2024. "Enabling intelligent transferable energy management of series hybrid electric tracked vehicle across motion dimensions via soft actor-critic algorithm," Energy, Elsevier, vol. 294(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    2. Yang, Ningkang & Han, Lijin & Xiang, Changle & Liu, Hui & Li, Xunmin, 2021. "An indirect reinforcement learning based real-time energy management strategy via high-order Markov Chain model for a hybrid electric vehicle," Energy, Elsevier, vol. 236(C).
    3. Louback, Eduardo & Biswas, Atriya & Machado, Fabricio & Emadi, Ali, 2024. "A review of the design process of energy management systems for dual-motor battery electric vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 193(C).
    4. Li, Guozhen & Zhang, Zhenyu & Shi, Wankai & Li, Wenyong, 2023. "Energy management strategy and simulation analysis of a hybrid train based on a comprehensive efficiency optimization," Applied Energy, Elsevier, vol. 349(C).
    5. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    6. Wu, Yitao & Zhang, Yuanjian & Li, Guang & Shen, Jiangwei & Chen, Zheng & Liu, Yonggang, 2020. "A predictive energy management strategy for multi-mode plug-in hybrid electric vehicles based on multi neural networks," Energy, Elsevier, vol. 208(C).
    7. Liu, Huanlong & Chen, Guanpeng & Li, Dafa & Wang, Jiawei & Zhou, Jianyi, 2021. "Energy active adjustment and bidirectional transfer management strategy of the electro-hydrostatic hydraulic hybrid powertrain for battery bus," Energy, Elsevier, vol. 230(C).
    8. Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    9. Chen, Jiaxin & Shu, Hong & Tang, Xiaolin & Liu, Teng & Wang, Weida, 2022. "Deep reinforcement learning-based multi-objective control of hybrid power system combined with road recognition under time-varying environment," Energy, Elsevier, vol. 239(PC).
    10. Zhang, Hao & Chen, Boli & Lei, Nuo & Li, Bingbing & Chen, Chaoyi & Wang, Zhi, 2024. "Coupled velocity and energy management optimization of connected hybrid electric vehicles for maximum collective efficiency," Applied Energy, Elsevier, vol. 360(C).
    11. Chen, Ruihu & Yang, Chao & Ma, Yue & Wang, Weida & Wang, Muyao & Du, Xuelong, 2022. "Online learning predictive power coordinated control strategy for off-road hybrid electric vehicles considering the dynamic response of engine generator set," Applied Energy, Elsevier, vol. 323(C).
    12. Marouane Adnane & Ahmed Khoumsi & João Pedro F. Trovão, 2023. "Efficient Management of Energy Consumption of Electric Vehicles Using Machine Learning—A Systematic and Comprehensive Survey," Energies, MDPI, vol. 16(13), pages 1-39, June.
    13. Hu, Dong & Xie, Hui & Song, Kang & Zhang, Yuanyuan & Yan, Long, 2023. "An apprenticeship-reinforcement learning scheme based on expert demonstrations for energy management strategy of hybrid electric vehicles," Applied Energy, Elsevier, vol. 342(C).
    14. Yang, Ningkang & Ruan, Shumin & Han, Lijin & Liu, Hui & Guo, Lingxiong & Xiang, Changle, 2023. "Reinforcement learning-based real-time intelligent energy management for hybrid electric vehicles in a model predictive control framework," Energy, Elsevier, vol. 270(C).
    15. Baodi Zhang & Sheng Guo & Xin Zhang & Qicheng Xue & Lan Teng, 2020. "Adaptive Smoothing Power Following Control Strategy Based on an Optimal Efficiency Map for a Hybrid Electric Tracked Vehicle," Energies, MDPI, vol. 13(8), pages 1-25, April.
    16. Chang, Chengcheng & Zhao, Wanzhong & Wang, Chunyan & Luan, Zhongkai, 2023. "An energy management strategy of deep reinforcement learning based on multi-agent architecture under self-generating conditions," Energy, Elsevier, vol. 283(C).
    17. Sercan Yalçın & Münür Sacit Herdem, 2024. "Optimizing EV Battery Management: Advanced Hybrid Reinforcement Learning Models for Efficient Charging and Discharging," Energies, MDPI, vol. 17(12), pages 1-21, June.
    18. Zhang, Hao & Fan, Qinhao & Liu, Shang & Li, Shengbo Eben & Huang, Jin & Wang, Zhi, 2021. "Hierarchical energy management strategy for plug-in hybrid electric powertrain integrated with dual-mode combustion engine," Applied Energy, Elsevier, vol. 304(C).
    19. Zhang, Wei & Wang, Jixin & Xu, Zhenyu & Shen, Yuying & Gao, Guangzong, 2022. "A generalized energy management framework for hybrid construction vehicles via model-based reinforcement learning," Energy, Elsevier, vol. 260(C).
    20. Han, Lijin & Yang, Ke & Ma, Tian & Yang, Ningkang & Liu, Hui & Guo, Lingxiong, 2022. "Battery life constrained real-time energy management strategy for hybrid electric vehicles based on reinforcement learning," Energy, Elsevier, vol. 259(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:282:y:2023:i:c:s0360544223022260. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.