
Type- and task-crossing energy management for fuel cell vehicles with longevity consideration: A heterogeneous deep transfer reinforcement learning framework

Author

Listed:
  • Huang, Ruchen
  • He, Hongwen
  • Su, Qicong
  • Härtl, Martin
  • Jaensch, Malte

Abstract

The recent advancements in artificial intelligence have promoted deep reinforcement learning (DRL) as the preferred method for developing energy management strategies (EMSs) for fuel cell vehicles (FCVs). However, developing DRL-based EMSs is time-consuming, requiring repetitive training for each new vehicle type or learning task. To surmount this technical barrier, this paper develops a transferable EMS rooted in heterogeneous deep transfer reinforcement learning (DTRL) across both FCV types and optimization tasks. First, a simple source EMS based on the soft actor-critic (SAC) algorithm is pre-trained for a fuel cell sedan, focusing solely on hydrogen saving. Next, a heterogeneous DTRL framework is developed by integrating SAC with transfer learning, through which both heterogeneous deep neural networks and experience replay buffers can be transferred. Subsequently, the source EMS is transferred to and reused in the target EMS of a fuel cell bus (FCB), with additional consideration of fuel cell (FC) longevity. Experimental simulations reveal that the heterogeneous DTRL framework expedites the development of the new EMS for the FCB by 90.28%. Moreover, the new EMS achieves a 7.93% reduction in hydrogen consumption and suppresses FC degradation by 63.21%. By correlating different energy management tasks of FCVs, this article both expedites the development and facilitates the generalized application of DRL-based EMSs.
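The core idea in the abstract — reusing a source agent's networks and replay buffer for a target vehicle with a different (heterogeneous) state space — can be illustrated with a toy sketch. This is not the authors' implementation; the network shapes, state dimensions, and helper names below are hypothetical, chosen only to show how shape-compatible layers and padded transitions might be carried over:

```python
import random

def make_net(layer_dims, seed=0):
    """Create a toy MLP as a list of weight matrices (rows x cols)."""
    rng = random.Random(seed)
    return [[[rng.gauss(0, 0.1) for _ in range(cols)] for _ in range(rows)]
            for rows, cols in zip(layer_dims[1:], layer_dims[:-1])]

def transfer_weights(source, target):
    """Copy every source layer whose shape matches the target layer.

    Layers with mismatched shapes (e.g. the wider state input of the bus)
    keep their fresh initialization and are learned from scratch.
    """
    reused = 0
    for i, (src, tgt) in enumerate(zip(source, target)):
        if len(src) == len(tgt) and len(src[0]) == len(tgt[0]):
            target[i] = [row[:] for row in src]  # deep-copy the matched layer
            reused += 1
    return reused

def transfer_buffer(source_buffer, pad_state):
    """Reuse source transitions, padding sedan states to the bus state size."""
    return [(pad_state(s), a, r, pad_state(s2), done)
            for (s, a, r, s2, done) in source_buffer]

# Hypothetical dimensions: sedan EMS observes a 3-dim state; the bus EMS
# observes 5 dims (e.g. two extra FC-degradation features).
sedan_actor = make_net([3, 32, 32, 1], seed=1)
bus_actor = make_net([5, 32, 32, 1], seed=2)
n_reused = transfer_weights(sedan_actor, bus_actor)  # hidden/output layers reused

pad = lambda s: s + [0.0] * (5 - len(s))  # zero-pad missing state features
sedan_buffer = [([0.1, 0.2, 0.3], 0.5, -1.0, [0.2, 0.3, 0.4], False)]
bus_buffer = transfer_buffer(sedan_buffer, pad)
```

In this sketch only the input layer differs in shape, so the two deeper layers transfer directly; the padded replay buffer lets the target agent start learning from the source agent's experience instead of an empty buffer, which is one plausible reading of the training-time reduction the abstract reports.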

Suggested Citation

  • Huang, Ruchen & He, Hongwen & Su, Qicong & Härtl, Martin & Jaensch, Malte, 2025. "Type- and task-crossing energy management for fuel cell vehicles with longevity consideration: A heterogeneous deep transfer reinforcement learning framework," Applied Energy, Elsevier, vol. 377(PC).
  • Handle: RePEc:eee:appene:v:377:y:2025:i:pc:s0306261924019779
    DOI: 10.1016/j.apenergy.2024.124594

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261924019779
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2024.124594?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Irani, Fatemeh Negar & Soleimani, Mohammadjavad & Yadegar, Meysam & Meskin, Nader, 2024. "Deep transfer learning strategy in intelligent fault diagnosis of gas turbines based on the Koopman operator," Applied Energy, Elsevier, vol. 365(C).
    2. Ajagekar, Akshay & Decardi-Nelson, Benjamin & You, Fengqi, 2024. "Energy management for demand response in networked greenhouses with multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 355(C).
    3. He, Hongwen & Wang, Yunlong & Han, Ruoyan & Han, Mo & Bai, Yunfei & Liu, Qingwu, 2021. "An improved MPC-based energy management strategy for hybrid vehicles using V2V and V2I communications," Energy, Elsevier, vol. 225(C).
    4. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    5. Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "Towards a fossil-free urban transport system: An intelligent cross-type transferable energy management framework based on deep transfer reinforcement learning," Applied Energy, Elsevier, vol. 363(C).
    6. Lin, Xinyou & Xu, Xinhao & Wang, Zhaorui, 2022. "Deep Q-learning network based trip pattern adaptive battery longevity-conscious strategy of plug-in fuel cell hybrid electric vehicle," Applied Energy, Elsevier, vol. 321(C).
    7. Peng, Jiankun & He, Hongwen & Xiong, Rui, 2017. "Rule based energy management strategy for a series–parallel plug-in hybrid electric bus optimized by dynamic programming," Applied Energy, Elsevier, vol. 185(P2), pages 1633-1643.
    8. Liu, Bo & Sun, Chao & Wang, Bo & Liang, Weiqiang & Ren, Qiang & Li, Junqiu & Sun, Fengchun, 2022. "Bi-level convex optimization of eco-driving for connected Fuel Cell Hybrid Electric Vehicles through signalized intersections," Energy, Elsevier, vol. 252(C).
    9. Xu, Nan & Kong, Yan & Yan, Jinyue & Zhang, Yuanjian & Sui, Yan & Ju, Hao & Liu, Heng & Xu, Zhe, 2022. "Global optimization energy management for multi-energy source vehicles based on “Information layer - Physical layer - Energy layer - Dynamic programming” (IPE-DP)," Applied Energy, Elsevier, vol. 312(C).
    10. Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "Smart energy management for hybrid electric bus via improved soft actor-critic algorithm in a heuristic learning framework," Energy, Elsevier, vol. 309(C).
    11. Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "An intelligent full-knowledge transferable collaborative eco-driving framework based on improved soft actor-critic algorithm," Applied Energy, Elsevier, vol. 375(C).
    12. Wang, Hanchen & Ye, Yiming & Zhang, Jiangfeng & Xu, Bin, 2023. "A comparative study of 13 deep reinforcement learning based energy management methods for a hybrid electric vehicle," Energy, Elsevier, vol. 266(C).
    13. Xiao, Boyi & Yang, Weiwei & Wu, Jiamin & Walker, Paul D. & Zhang, Nong, 2022. "Energy management strategy via maximum entropy reinforcement learning for an extended range logistics vehicle," Energy, Elsevier, vol. 253(C).
    14. Wang, Qiao & Ye, Min & Cai, Xue & Sauer, Dirk Uwe & Li, Weihan, 2023. "Transferable data-driven capacity estimation for lithium-ion batteries with deep learning: A case study from laboratory to field applications," Applied Energy, Elsevier, vol. 350(C).
    15. Li, Yapeng & Tang, Xiaolin & Lin, Xianke & Grzesiak, Lech & Hu, Xiaosong, 2022. "The role and application of convex modeling and optimization in electrified vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 153(C).
    16. Zhou, Yujie & Huang, Yin & Mao, Xuping & Kang, Zehao & Huang, Xuejin & Xuan, Dongji, 2024. "Research on energy management strategy of fuel cell hybrid power via an improved TD3 deep reinforcement learning," Energy, Elsevier, vol. 293(C).
    17. Huang, Ruchen & He, Hongwen & Su, Qicong & Härtl, Martin & Jaensch, Malte, 2024. "Enabling cross-type full-knowledge transferable energy management for hybrid electric vehicles via deep transfer reinforcement learning," Energy, Elsevier, vol. 305(C).
    18. Tang, Xiaolin & Zhou, Haitao & Wang, Feng & Wang, Weida & Lin, Xianke, 2022. "Longevity-conscious energy management strategy of fuel cell hybrid electric Vehicle Based on deep reinforcement learning," Energy, Elsevier, vol. 238(PA).
    19. Zou, Weitao & Li, Jianwei & Yang, Qingqing & Wan, Xinming & He, Yuntang & Lan, Hao, 2023. "A real-time energy management approach with fuel cell and battery competition-synergy control for the fuel cell vehicle," Applied Energy, Elsevier, vol. 334(C).
    20. Huang, Ruchen & He, Hongwen & Gao, Miaojue, 2023. "Training-efficient and cost-optimal energy management for fuel cell hybrid electric bus based on a novel distributed deep reinforcement learning framework," Applied Energy, Elsevier, vol. 346(C).
    21. Guo, Ningyuan & Zhang, Wencan & Li, Junqiu & Chen, Zheng & Li, Jianwei & Sun, Chao, 2024. "Predictive energy management of fuel cell plug-in hybrid electric vehicles: A co-state boundaries-oriented PMP optimization approach," Applied Energy, Elsevier, vol. 362(C).
    22. Li, Yuecheng & He, Hongwen & Khajepour, Amir & Wang, Hong & Peng, Jiankun, 2019. "Energy management for a power-split hybrid electric bus via deep reinforcement learning with terrain information," Applied Energy, Elsevier, vol. 255(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "Smart energy management for hybrid electric bus via improved soft actor-critic algorithm in a heuristic learning framework," Energy, Elsevier, vol. 309(C).
    2. Huang, Ruchen & He, Hongwen & Su, Qicong, 2024. "An intelligent full-knowledge transferable collaborative eco-driving framework based on improved soft actor-critic algorithm," Applied Energy, Elsevier, vol. 375(C).
    3. Peng, Jiankun & Shen, Yang & Wu, ChangCheng & Wang, Chunhai & Yi, Fengyan & Ma, Chunye, 2023. "Research on energy-saving driving control of hydrogen fuel bus based on deep reinforcement learning in freeway ramp weaving area," Energy, Elsevier, vol. 285(C).
    4. Liu, Zemin Eitan & Li, Yong & Zhou, Quan & Shuai, Bin & Hua, Min & Xu, Hongming & Xu, Lubing & Tan, Guikun & Li, Yanfei, 2025. "Real-time energy management for HEV combining naturalistic driving data and deep reinforcement learning with high generalization," Applied Energy, Elsevier, vol. 377(PA).
    5. Niu, Zegong & He, Hongwen, 2024. "A data-driven solution for intelligent power allocation of connected hybrid electric vehicles inspired by offline deep reinforcement learning in V2X scenario," Applied Energy, Elsevier, vol. 372(C).
    6. He, Hongwen & Su, Qicong & Huang, Ruchen & Niu, Zegong, 2024. "Enabling intelligent transferable energy management of series hybrid electric tracked vehicle across motion dimensions via soft actor-critic algorithm," Energy, Elsevier, vol. 294(C).
    7. Huang, Ruchen & He, Hongwen & Su, Qicong & Härtl, Martin & Jaensch, Malte, 2024. "Enabling cross-type full-knowledge transferable energy management for hybrid electric vehicles via deep transfer reinforcement learning," Energy, Elsevier, vol. 305(C).
    8. Tang, Tianfeng & Peng, Qianlong & Shi, Qing & Peng, Qingguo & Zhao, Jin & Chen, Chaoyi & Wang, Guangwei, 2024. "Energy management of fuel cell hybrid electric bus in mountainous regions: A deep reinforcement learning approach considering terrain characteristics," Energy, Elsevier, vol. 311(C).
    9. Chen, Jiaxin & Shu, Hong & Tang, Xiaolin & Liu, Teng & Wang, Weida, 2022. "Deep reinforcement learning-based multi-objective control of hybrid power system combined with road recognition under time-varying environment," Energy, Elsevier, vol. 239(PC).
    10. Tan, Yingqi & Xu, Jingyi & Ma, Junyi & Li, Zirui & Chen, Huiyan & Xi, Junqiang & Liu, Haiou, 2024. "A transferable perception-guided EMS for series hybrid electric unmanned tracked vehicles," Energy, Elsevier, vol. 306(C).
    11. Shi, Dehua & Xu, Han & Wang, Shaohua & Hu, Jia & Chen, Long & Yin, Chunfang, 2024. "Deep reinforcement learning based adaptive energy management for plug-in hybrid electric vehicle with double deep Q-network," Energy, Elsevier, vol. 305(C).
    12. Huang, Ruchen & He, Hongwen & Zhao, Xuyang & Wang, Yunlong & Li, Menglin, 2022. "Battery health-aware and naturalistic data-driven energy management for hybrid electric bus based on TD3 deep reinforcement learning algorithm," Applied Energy, Elsevier, vol. 321(C).
    13. Liu, Huimin & Lin, Cheng & Yu, Xiao & Tao, Zhenyi & Xu, Jiaqi, 2024. "Variable horizon multivariate driving pattern recognition framework based on vehicle-road two-dimensional information for electric vehicle," Applied Energy, Elsevier, vol. 365(C).
    14. Wang, Yue & Li, Keqiang & Zeng, Xiaohua & Gao, Bolin & Hong, Jichao, 2023. "Investigation of novel intelligent energy management strategies for connected HEB considering global planning of fixed-route information," Energy, Elsevier, vol. 263(PB).
    15. Ren, Xiaoxia & Ye, Jinze & Xie, Liping & Lin, Xinyou, 2024. "Battery longevity-conscious energy management predictive control strategy optimized by using deep reinforcement learning algorithm for a fuel cell hybrid electric vehicle," Energy, Elsevier, vol. 286(C).
    16. Tao, Fazhan & Fu, Zhigao & Gong, Huixian & Ji, Baofeng & Zhou, Yao, 2023. "Twin delayed deep deterministic policy gradient based energy management strategy for fuel cell/battery/ultracapacitor hybrid electric vehicles considering predicted terrain information," Energy, Elsevier, vol. 283(C).
    17. He, Hongwen & Meng, Xiangfei & Wang, Yong & Khajepour, Amir & An, Xiaowen & Wang, Renguang & Sun, Fengchun, 2024. "Deep reinforcement learning based energy management strategies for electrified vehicles: Recent advances and perspectives," Renewable and Sustainable Energy Reviews, Elsevier, vol. 192(C).
    18. Xiaodong Liu & Hongqiang Guo & Xingqun Cheng & Juan Du & Jian Ma, 2022. "A Robust Design of the Model-Free-Adaptive-Control-Based Energy Management for Plug-In Hybrid Electric Vehicle," Energies, MDPI, vol. 15(20), pages 1-24, October.
    19. Xu, Nan & Kong, Yan & Zhang, Yuanjian & Yue, Fenglai & Sui, Yan & Li, Xiaohan & Liu, Heng & Xu, Zhe, 2022. "Determination of vehicle working modes for global optimization energy management and evaluation of the economic performance for a certain control strategy," Energy, Elsevier, vol. 251(C).
    20. Zhang, Yahui & Wang, Zimeng & Tian, Yang & Wang, Zhong & Kang, Mingxin & Xie, Fangxi & Wen, Guilin, 2024. "Pre-optimization-assisted deep reinforcement learning-based energy management strategy for a series–parallel hybrid electric truck," Energy, Elsevier, vol. 302(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:377:y:2025:i:pc:s0306261924019779. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.