Printed from https://ideas.repec.org/a/eee/energy/v283y2023ics0360544223024866.html

A health-aware energy management strategy for fuel cell hybrid electric UAVs based on safe reinforcement learning

Author

Listed:
  • Gao, Qinxiang
  • Lei, Tao
  • Yao, Wenli
  • Zhang, Xingyu
  • Zhang, Xiaobin

Abstract

Energy management strategies (EMSs) are crucial for the hydrogen economy and energy component lifetimes of fuel cell hybrid electric unmanned aerial vehicles (UAVs). Reinforcement learning (RL)-based schemes have become a hotspot for EMSs, but most RL-based EMSs focus on energy-saving performance and rarely consider energy component durability and safe exploration. This paper proposes a health-aware energy management strategy based on a safe RL framework to minimize the overall flight cost and achieve safe operation of UAVs. In this framework, a universal three-dimensional environment that integrates the UAV kinematics and dynamics model is developed. In addition, wind disturbances and random loading of the mission payload during flight are considered for robust training. The energy management problem is formulated as a constrained Markov decision process, where both hydrogen consumption and energy component degradation are incorporated in the multi-objective reward function. A safety optimizer is then designed to satisfy operating constraints by correcting the action through analytical optimization. The results indicate that the safety of the explored action is guaranteed, maintaining zero constraint violations in both training and real-time control scenarios. Compared with other RL-based methods, the proposed method shows better convergence and a shorter training time. Furthermore, simulations show that the proposed method reduces the total flight cost and fuel cell degradation by 14.6% and 15.3%, respectively, compared with the online benchmark method.
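The safety optimizer described in the abstract corrects the RL agent's raw action so that operating constraints hold at every step. As an illustrative sketch only (the paper's exact formulation is not reproduced here), projecting a raw fuel-cell power command onto box constraints — output limits, a ramp-rate limit, and the battery-power bound implied by the power balance — admits a closed-form clipping solution. All names and limit values below are hypothetical:

```python
def safe_action(p_fc_raw, p_fc_prev, p_demand,
                p_fc_min=0.0, p_fc_max=1000.0,
                dp_max=50.0, p_bat_max=400.0):
    """Project a raw fuel-cell power command onto the feasible set.

    Constraints (all values hypothetical, units in watts):
      - output limits:     p_fc_min <= p_fc <= p_fc_max
      - ramp-rate limit:   |p_fc - p_fc_prev| <= dp_max
      - battery bound via power balance p_bat = p_demand - p_fc:
                           |p_demand - p_fc| <= p_bat_max

    Minimizing (p_fc - p_fc_raw)^2 over the intersection of these
    box constraints reduces to clipping to the tightest bounds.
    """
    lo = max(p_fc_min, p_fc_prev - dp_max, p_demand - p_bat_max)
    hi = min(p_fc_max, p_fc_prev + dp_max, p_demand + p_bat_max)
    return min(max(p_fc_raw, lo), hi)
```

Because every constraint here is a box on a scalar action, the projection is analytical: the agent's exploratory command is replaced by the nearest feasible value, which is how a safety layer can guarantee zero constraint violations during training without modifying the learning algorithm itself.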

Suggested Citation

  • Gao, Qinxiang & Lei, Tao & Yao, Wenli & Zhang, Xingyu & Zhang, Xiaobin, 2023. "A health-aware energy management strategy for fuel cell hybrid electric UAVs based on safe reinforcement learning," Energy, Elsevier, vol. 283(C).
  • Handle: RePEc:eee:energy:v:283:y:2023:i:c:s0360544223024866
    DOI: 10.1016/j.energy.2023.129092

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544223024866
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2023.129092?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    2. Erdinc, O. & Uzunoglu, M., 2010. "Recent trends in PEM fuel cell-powered hybrid systems: Investigation of application areas, design architectures and energy management approaches," Renewable and Sustainable Energy Reviews, Elsevier, vol. 14(9), pages 2874-2884, December.
    3. Juhui Gim & Minsu Kim & Changsun Ahn, 2022. "Energy Management Control Strategy for Saving Trip Costs of Fuel Cell/Battery Electric Vehicles," Energies, MDPI, vol. 15(6), pages 1-15, March.
    4. Chen, Huicui & Pei, Pucheng & Song, Mancun, 2015. "Lifetime prediction and the economic lifetime of Proton Exchange Membrane fuel cells," Applied Energy, Elsevier, vol. 142(C), pages 154-163.
    5. Song, Ke & Wang, Xiaodi & Li, Feiqiang & Sorrentino, Marco & Zheng, Bailin, 2020. "Pontryagin’s minimum principle-based real-time energy management strategy for fuel cell hybrid electric vehicle considering both fuel economy and power source durability," Energy, Elsevier, vol. 205(C).
    6. Sun, Wenjing & Zou, Yuan & Zhang, Xudong & Guo, Ningyuan & Zhang, Bin & Du, Guodong, 2022. "High robustness energy management strategy of hybrid electric vehicle based on improved soft actor-critic deep reinforcement learning," Energy, Elsevier, vol. 258(C).
    7. Kristen A. Severson & Peter M. Attia & Norman Jin & Nicholas Perkins & Benben Jiang & Zi Yang & Michael H. Chen & Muratahan Aykol & Patrick K. Herring & Dimitrios Fraggedakis & Martin Z. Bazant & Step, 2019. "Data-driven prediction of battery cycle life before capacity degradation," Nature Energy, Nature, vol. 4(5), pages 383-391, May.
    8. Ganesh, Akhil Hannegudda & Xu, Bin, 2022. "A review of reinforcement learning based energy management systems for electrified powertrains: Progress, challenge, and potential solution," Renewable and Sustainable Energy Reviews, Elsevier, vol. 154(C).
    9. Boukoberine, Mohamed Nadir & Zhou, Zhibin & Benbouzid, Mohamed, 2019. "A critical review on unmanned aerial vehicles power supply and energy management: Solutions, strategies, and prospects," Applied Energy, Elsevier, vol. 255(C).
    10. Fathy, Ahmed & Rezk, Hegazy & Nassef, Ahmed M., 2019. "Robust hydrogen-consumption-minimization strategy based salp swarm algorithm for energy management of fuel cell/supercapacitor/batteries in highly fluctuated load condition," Renewable Energy, Elsevier, vol. 139(C), pages 147-160.
    11. Wu, Jingda & He, Hongwen & Peng, Jiankun & Li, Yuecheng & Li, Zhanjiang, 2018. "Continuous reinforcement learning of energy management with deep Q network for a power split hybrid electric bus," Applied Energy, Elsevier, vol. 222(C), pages 799-811.
    12. Hua, Zhiguang & Zheng, Zhixue & Péra, Marie-Cécile & Gao, Fei, 2020. "Remaining useful life prediction of PEMFC systems based on the multi-input echo state network," Applied Energy, Elsevier, vol. 265(C).
    13. Du, Guodong & Zou, Yuan & Zhang, Xudong & Guo, Lingxiong & Guo, Ningyuan, 2022. "Energy management for a hybrid electric vehicle based on prioritized deep reinforcement learning framework," Energy, Elsevier, vol. 241(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    2. Mudhafar Al-Saadi & Maher Al-Greer & Michael Short, 2023. "Reinforcement Learning-Based Intelligent Control Strategies for Optimal Power Management in Advanced Power Distribution Systems: A Survey," Energies, MDPI, vol. 16(4), pages 1-38, February.
    3. Wu, Jinglai & Zhang, Yunqing & Ruan, Jiageng & Liang, Zhaowen & Liu, Kai, 2023. "Rule and optimization combined real-time energy management strategy for minimizing cost of fuel cell hybrid electric vehicles," Energy, Elsevier, vol. 285(C).
    4. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    5. Zuo, Jian & Steiner, Nadia Yousfi & Li, Zhongliang & Hissel, Daniel, 2024. "Health management review for fuel cells: Focus on action phase," Renewable and Sustainable Energy Reviews, Elsevier, vol. 201(C).
    6. He, Hongwen & Meng, Xiangfei & Wang, Yong & Khajepour, Amir & An, Xiaowen & Wang, Renguang & Sun, Fengchun, 2024. "Deep reinforcement learning based energy management strategies for electrified vehicles: Recent advances and perspectives," Renewable and Sustainable Energy Reviews, Elsevier, vol. 192(C).
    7. Tang, Xiaolin & Zhou, Haitao & Wang, Feng & Wang, Weida & Lin, Xianke, 2022. "Longevity-conscious energy management strategy of fuel cell hybrid electric Vehicle Based on deep reinforcement learning," Energy, Elsevier, vol. 238(PA).
    8. Matteo Acquarone & Claudio Maino & Daniela Misul & Ezio Spessa & Antonio Mastropietro & Luca Sorrentino & Enrico Busto, 2023. "Influence of the Reward Function on the Selection of Reinforcement Learning Agents for Hybrid Electric Vehicles Real-Time Control," Energies, MDPI, vol. 16(6), pages 1-22, March.
    9. Huang, Ruchen & He, Hongwen & Gao, Miaojue, 2023. "Training-efficient and cost-optimal energy management for fuel cell hybrid electric bus based on a novel distributed deep reinforcement learning framework," Applied Energy, Elsevier, vol. 346(C).
    10. Miranda, Matheus H.R. & Silva, Fabrício L. & Lourenço, Maria A.M. & Eckert, Jony J. & Silva, Ludmila C.A., 2022. "Vehicle drivetrain and fuzzy controller optimization using a planar dynamics simulation based on a real-world driving cycle," Energy, Elsevier, vol. 257(C).
    11. Pang, Kexin & Zhou, Jian & Tsianikas, Stamatis & Coit, David W. & Ma, Yizhong, 2024. "Long-term microgrid expansion planning with resilience and environmental benefits using deep reinforcement learning," Renewable and Sustainable Energy Reviews, Elsevier, vol. 191(C).
    12. Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Wang, Da & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).
    13. Hu, Dong & Huang, Chao & Yin, Guodong & Li, Yangmin & Huang, Yue & Huang, Hailong & Wu, Jingda & Li, Wenfei & Xie, Hui, 2024. "A transfer-based reinforcement learning collaborative energy management strategy for extended-range electric buses with cabin temperature comfort consideration," Energy, Elsevier, vol. 290(C).
    14. Hu, Jianjun & Wang, Yangguang & Zou, Lingbo & Wang, Zhouxin, 2023. "Adaptive rule control strategy for composite energy storage fuel cell vehicle based on vehicle operating state recognition," Renewable Energy, Elsevier, vol. 204(C), pages 166-175.
    15. Macias, A. & Kandidayeni, M. & Boulon, L. & Trovão, J.P., 2021. "Fuel cell-supercapacitor topologies benchmark for a three-wheel electric vehicle powertrain," Energy, Elsevier, vol. 224(C).
    16. He, Wenbin & Liu, Ting & Ming, Wuyi & Li, Zongze & Du, Jinguang & Li, Xiaoke & Guo, Xudong & Sun, Peiyan, 2024. "Progress in prediction of remaining useful life of hydrogen fuel cells based on deep learning," Renewable and Sustainable Energy Reviews, Elsevier, vol. 192(C).
    17. Zhengyu Yao & Hwan-Sik Yoon & Yang-Ki Hong, 2023. "Control of Hybrid Electric Vehicle Powertrain Using Offline-Online Hybrid Reinforcement Learning," Energies, MDPI, vol. 16(2), pages 1-18, January.
    18. Zhou, Jianhao & Xue, Yuan & Xu, Da & Li, Chaoxiong & Zhao, Wanzhong, 2022. "Self-learning energy management strategy for hybrid electric vehicle via curiosity-inspired asynchronous deep reinforcement learning," Energy, Elsevier, vol. 242(C).
    19. Wu, Jingda & Huang, Chao & He, Hongwen & Huang, Hailong, 2024. "Confidence-aware reinforcement learning for energy management of electrified vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 191(C).
    20. Fuwu Yan & Jinhai Wang & Changqing Du & Min Hua, 2022. "Multi-Objective Energy Management Strategy for Hybrid Electric Vehicles Based on TD3 with Non-Parametric Reward Function," Energies, MDPI, vol. 16(1), pages 1-17, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:283:y:2023:i:c:s0360544223024866. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.