Printed from https://ideas.repec.org/a/eee/energy/v290y2024ics0360544223034916.html

A transfer-based reinforcement learning collaborative energy management strategy for extended-range electric buses with cabin temperature comfort consideration

Author

Listed:
  • Hu, Dong
  • Huang, Chao
  • Yin, Guodong
  • Li, Yangmin
  • Huang, Yue
  • Huang, Hailong
  • Wu, Jingda
  • Li, Wenfei
  • Xie, Hui

Abstract

Electric vehicles (EVs) have received extensive attention as an environmentally friendly and sustainable mode of transportation. To address "range anxiety," extended-range electric vehicles (EREVs) have gradually gained popularity as a solution. However, current research on energy management strategies (EMS) for EVs often overlooks the energy consumption of the air conditioning (AC) system, resulting in suboptimal energy allocation. This study therefore focuses on the extended-range electric bus (EREbus) and incorporates the AC system into its EMS, enabling coordinated optimization with the powertrain system. First, the study embeds a control-oriented cabin thermal management model based on the powertrain model. Next, representation transfer-based reinforcement learning (RTRL) transfers the learned policy representations from the AC-off state to the EMS in the AC-on state. Furthermore, the study analyzes the impact of different representation transfers on powertrain performance and thermal comfort. The results demonstrate that the proposed EMS improves the convergence rate and stability of training. Compared with direct learning methods, RTRL shows clear advantages in reducing operating costs and improving cabin thermal comfort, achieving operating-cost reductions of 8.3%–12.6% and 5.2%–27.0% across driving modes with different battery levels. Moreover, setting the transfer layer appropriately promotes the utilization of the global optimal potential of RTRL. This research provides support for energy management and holds potential for promoting the development of EVs.
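The representation transfer described in the abstract — reusing the policy representations learned in the AC-off state to initialize the AC-on EMS, with the depth of the "transfer layer" as a tuning knob — can be sketched as copying the first few weight matrices of a trained policy network into a fresh one. This is a minimal illustrative sketch only: the class `PolicyNet`, the function `transfer_representation`, and all layer sizes are made-up assumptions, not the paper's implementation.

```python
import numpy as np

class PolicyNet:
    """Minimal MLP policy; layer sizes are illustrative, not from the paper."""
    def __init__(self, sizes, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix per layer pair, small random initialization.
        self.weights = [rng.standard_normal((i, o)) * 0.1
                        for i, o in zip(sizes[:-1], sizes[1:])]

    def forward(self, x):
        for w in self.weights[:-1]:
            x = np.tanh(x @ w)          # hidden representations
        return x @ self.weights[-1]     # raw action scores

def transfer_representation(source, target, n_layers):
    """Copy the first n_layers weight matrices (the learned state
    representation) from the source policy into the target policy;
    the remaining layers keep their fresh initialization."""
    for k in range(n_layers):
        target.weights[k] = source.weights[k].copy()
    return target

# A (stand-in for a trained) AC-off policy and a fresh AC-on policy with
# matching input/hidden sizes; only the later layers are left to relearn.
ac_off = PolicyNet([4, 32, 32, 2], seed=1)
ac_on = PolicyNet([4, 32, 32, 2], seed=2)
transfer_representation(ac_off, ac_on, n_layers=2)  # transfer-layer depth = 2
```

Varying `n_layers` mimics the paper's finding that the choice of transfer layer affects how much of the source policy's representation the new task inherits versus relearns.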

Suggested Citation

  • Hu, Dong & Huang, Chao & Yin, Guodong & Li, Yangmin & Huang, Yue & Huang, Hailong & Wu, Jingda & Li, Wenfei & Xie, Hui, 2024. "A transfer-based reinforcement learning collaborative energy management strategy for extended-range electric buses with cabin temperature comfort consideration," Energy, Elsevier, vol. 290(C).
  • Handle: RePEc:eee:energy:v:290:y:2024:i:c:s0360544223034916
    DOI: 10.1016/j.energy.2023.130097

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544223034916
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2023.130097?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Li, Xunming & Han, Lijin & Liu, Hui & Wang, Weida & Xiang, Changle, 2019. "Real-time optimal energy management strategy for a dual-mode power-split hybrid electric vehicle based on an explicit model predictive control algorithm," Energy, Elsevier, vol. 172(C), pages 1161-1178.
    2. He, Hongwen & Wang, Yunlong & Han, Ruoyan & Han, Mo & Bai, Yunfei & Liu, Qingwu, 2021. "An improved MPC-based energy management strategy for hybrid vehicles using V2V and V2I communications," Energy, Elsevier, vol. 225(C).
    3. Ganesh, Akhil Hannegudda & Xu, Bin, 2022. "A review of reinforcement learning based energy management systems for electrified powertrains: Progress, challenge, and potential solution," Renewable and Sustainable Energy Reviews, Elsevier, vol. 154(C).
    4. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    5. Liu, Yonggang & Wu, Yitao & Wang, Xiangyu & Li, Liang & Zhang, Yuanjian & Chen, Zheng, 2023. "Energy management for hybrid electric vehicles based on imitation reinforcement learning," Energy, Elsevier, vol. 263(PC).
    6. Wang, Hao & He, Hongwen & Bai, Yunfei & Yue, Hongwei, 2022. "Parameterized deep Q-network based energy management with balanced energy economy and battery life for hybrid electric vehicles," Applied Energy, Elsevier, vol. 320(C).
    7. Hua, Min & Zhang, Cetengfei & Zhang, Fanggang & Li, Zhi & Yu, Xiaoli & Xu, Hongming & Zhou, Quan, 2023. "Energy management of multi-mode plug-in hybrid electric vehicle using multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 348(C).
    8. Zhang, Zhenying & Wang, Jiayu & Feng, Xu & Chang, Li & Chen, Yanhua & Wang, Xingguo, 2018. "The solutions to electric vehicle air conditioning systems: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 91(C), pages 443-463.
    9. Shabbir, Wassif & Evangelou, Simos A., 2019. "Threshold-changing control strategy for series hybrid electric vehicles," Applied Energy, Elsevier, vol. 235(C), pages 761-775.
    10. Ahmed M. Ali & Dirk Söffker, 2018. "Towards Optimal Power Management of Hybrid Electric Vehicles in Real-Time: A Review on Methods, Challenges, and State-Of-The-Art Solutions," Energies, MDPI, vol. 11(3), pages 1-24, February.
    11. Hou, Jun & Song, Ziyou, 2020. "A hierarchical energy management strategy for hybrid energy storage via vehicle-to-cloud connectivity," Applied Energy, Elsevier, vol. 257(C).
    12. Hu, Dong & Xie, Hui & Song, Kang & Zhang, Yuanyuan & Yan, Long, 2023. "An apprenticeship-reinforcement learning scheme based on expert demonstrations for energy management strategy of hybrid electric vehicles," Applied Energy, Elsevier, vol. 342(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    2. Wang, Yaxin & Lou, Diming & Xu, Ning & Fang, Liang & Tan, Piqiang, 2021. "Energy management and emission control for range extended electric vehicles," Energy, Elsevier, vol. 236(C).
    3. He, Hongwen & Meng, Xiangfei & Wang, Yong & Khajepour, Amir & An, Xiaowen & Wang, Renguang & Sun, Fengchun, 2024. "Deep reinforcement learning based energy management strategies for electrified vehicles: Recent advances and perspectives," Renewable and Sustainable Energy Reviews, Elsevier, vol. 192(C).
    4. Ma, Zhikai & Huo, Qian & Wang, Wei & Zhang, Tao, 2023. "Voltage-temperature aware thermal runaway alarming framework for electric vehicles via deep learning with attention mechanism in time-frequency domain," Energy, Elsevier, vol. 278(C).
    5. Zhu, Tao & Wills, Richard G.A. & Lot, Roberto & Ruan, Haijun & Jiang, Zhihao, 2021. "Adaptive energy management of a battery-supercapacitor energy storage system for electric vehicles based on flexible perception and neural network fitting," Applied Energy, Elsevier, vol. 292(C).
    6. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    7. Penghui Qiang & Peng Wu & Tao Pan & Huaiquan Zang, 2021. "Real-Time Approximate Equivalent Consumption Minimization Strategy Based on the Single-Shaft Parallel Hybrid Powertrain," Energies, MDPI, vol. 14(23), pages 1-22, November.
    8. Zhang, Hao & Chen, Boli & Lei, Nuo & Li, Bingbing & Chen, Chaoyi & Wang, Zhi, 2024. "Coupled velocity and energy management optimization of connected hybrid electric vehicles for maximum collective efficiency," Applied Energy, Elsevier, vol. 360(C).
    9. Guo, Ningyuan & Zhang, Xudong & Zou, Yuan & Guo, Lingxiong & Du, Guodong, 2021. "Real-time predictive energy management of plug-in hybrid electric vehicles for coordination of fuel economy and battery degradation," Energy, Elsevier, vol. 214(C).
    10. Cui, Wei & Cui, Naxin & Li, Tao & Cui, Zhongrui & Du, Yi & Zhang, Chenghui, 2022. "An efficient multi-objective hierarchical energy management strategy for plug-in hybrid electric vehicle in connected scenario," Energy, Elsevier, vol. 257(C).
    11. Ruan, Shumin & Ma, Yue & Yang, Ningkang & Xiang, Changle & Li, Xunming, 2022. "Real-time energy-saving control for HEVs in car-following scenario with a double explicit MPC approach," Energy, Elsevier, vol. 247(C).
    12. Hu, Dong & Xie, Hui & Song, Kang & Zhang, Yuanyuan & Yan, Long, 2023. "An apprenticeship-reinforcement learning scheme based on expert demonstrations for energy management strategy of hybrid electric vehicles," Applied Energy, Elsevier, vol. 342(C).
    13. Gao, Qinxiang & Lei, Tao & Yao, Wenli & Zhang, Xingyu & Zhang, Xiaobin, 2023. "A health-aware energy management strategy for fuel cell hybrid electric UAVs based on safe reinforcement learning," Energy, Elsevier, vol. 283(C).
    14. Mudhafar Al-Saadi & Maher Al-Greer & Michael Short, 2023. "Reinforcement Learning-Based Intelligent Control Strategies for Optimal Power Management in Advanced Power Distribution Systems: A Survey," Energies, MDPI, vol. 16(4), pages 1-38, February.
    15. Qi, Chunyang & Song, Chuanxue & Xiao, Feng & Song, Shixin, 2022. "Generalization ability of hybrid electric vehicle energy management strategy based on reinforcement learning method," Energy, Elsevier, vol. 250(C).
    16. Cui, Wei & Cui, Naxin & Li, Tao & Du, Yi & Zhang, Chenghui, 2024. "Multi-objective hierarchical energy management for connected plug-in hybrid electric vehicle with cyber–physical interaction," Applied Energy, Elsevier, vol. 360(C).
    17. Huang, Ruchen & He, Hongwen & Gao, Miaojue, 2023. "Training-efficient and cost-optimal energy management for fuel cell hybrid electric bus based on a novel distributed deep reinforcement learning framework," Applied Energy, Elsevier, vol. 346(C).
    18. Jiang, Yue & Meng, Hao & Chen, Guanpeng & Yang, Congnan & Xu, Xiaojun & Zhang, Lei & Xu, Haijun, 2022. "Differential-steering based path tracking control and energy-saving torque distribution strategy of 6WID unmanned ground vehicle," Energy, Elsevier, vol. 254(PA).
    19. Ju, Fei & Zhuang, Weichao & Wang, Liangmo & Zhang, Zhe, 2020. "Comparison of four-wheel-drive hybrid powertrain configurations," Energy, Elsevier, vol. 209(C).
    20. Fuwu Yan & Jinhai Wang & Changqing Du & Min Hua, 2022. "Multi-Objective Energy Management Strategy for Hybrid Electric Vehicles Based on TD3 with Non-Parametric Reward Function," Energies, MDPI, vol. 16(1), pages 1-17, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:290:y:2024:i:c:s0360544223034916. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.