
A health-aware energy management strategy for fuel cell hybrid electric UAVs based on safe reinforcement learning

Authors

  • Gao, Qinxiang
  • Lei, Tao
  • Yao, Wenli
  • Zhang, Xingyu
  • Zhang, Xiaobin

Abstract

Energy management strategies (EMSs) are crucial for the hydrogen economy and energy component lifetimes of fuel cell hybrid electric unmanned aerial vehicles (UAVs). Reinforcement learning (RL)-based schemes have become a research hotspot for EMSs, but most RL-based EMSs focus on energy-saving performance and rarely consider energy component durability and safe exploration. This paper proposes a health-aware energy management strategy based on a safe RL framework to minimize the overall flight cost and achieve safe operation of UAVs. In this framework, a universal three-dimensional environment that integrates the UAV kinematics and dynamics model is developed. In addition, wind disturbances and random loading of the mission payload during flight are considered for robust training. The energy management problem is formulated as a constrained Markov decision process, in which both hydrogen consumption and energy component degradation are incorporated into the multi-objective reward function. A safety optimizer is then designed to satisfy operation constraints by correcting the action through analytical optimization. The results indicate that the safety of the explored actions is guaranteed, with zero constraint violations in both training and real-time control scenarios. Compared with other RL-based methods, the proposed method converges faster and requires less training time. Furthermore, simulations show that the proposed method reduces the total flight cost and fuel cell degradation by 14.6% and 15.3%, respectively, compared with the online benchmark method.
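
The abstract describes the EMS as a constrained Markov decision process with a multi-objective reward (hydrogen consumption plus energy component degradation) and a safety layer that analytically corrects the agent's action before it is applied. The Python sketch below illustrates that general structure under assumed variable names, weights, and constraint forms; it is not the paper's implementation, whose cost models and safety optimizer are defined in the full text.

    import numpy as np

    # Illustrative per-step reward for a fuel cell hybrid UAV EMS.
    # Weights and degradation increments are placeholders, not the
    # paper's calibrated values.
    def step_reward(m_h2, delta_soh_fc, delta_soh_bat,
                    w_h2=1.0, w_fc=1.0, w_bat=1.0):
        """Negative of the weighted flight cost accumulated in one step:
        hydrogen mass consumed plus fuel cell and battery degradation."""
        cost = w_h2 * m_h2 + w_fc * delta_soh_fc + w_bat * delta_soh_bat
        return -cost

    # Hypothetical safety layer: project the raw fuel cell power command
    # onto output limits and a ramp-rate constraint; the battery covers
    # the residual power demand. This is a simple analytical clipping
    # step, analogous in spirit to the paper's safety optimizer but not
    # its exact formulation.
    def safe_action(p_fc_raw, p_demand, p_fc_min, p_fc_max, dp_max, p_fc_prev):
        lo = max(p_fc_min, p_fc_prev - dp_max)
        hi = min(p_fc_max, p_fc_prev + dp_max)
        p_fc = float(np.clip(p_fc_raw, lo, hi))
        p_bat = p_demand - p_fc  # battery supplies the remainder
        return p_fc, p_bat

In a design of this kind, the reward penalizes the per-step flight cost, while the projection step keeps the fuel cell power command inside its feasible operating envelope, which is what makes zero constraint violations during exploration possible.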

Suggested Citation

  • Gao, Qinxiang & Lei, Tao & Yao, Wenli & Zhang, Xingyu & Zhang, Xiaobin, 2023. "A health-aware energy management strategy for fuel cell hybrid electric UAVs based on safe reinforcement learning," Energy, Elsevier, vol. 283(C).
  • Handle: RePEc:eee:energy:v:283:y:2023:i:c:s0360544223024866
    DOI: 10.1016/j.energy.2023.129092

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0360544223024866
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.energy.2023.129092?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Kristen A. Severson & Peter M. Attia & Norman Jin & Nicholas Perkins & Benben Jiang & Zi Yang & Michael H. Chen & Muratahan Aykol & Patrick K. Herring & Dimitrios Fraggedakis & Martin Z. Bazant & Step, 2019. "Data-driven prediction of battery cycle life before capacity degradation," Nature Energy, Nature, vol. 4(5), pages 383-391, May.
    2. Chen, Huicui & Pei, Pucheng & Song, Mancun, 2015. "Lifetime prediction and the economic lifetime of Proton Exchange Membrane fuel cells," Applied Energy, Elsevier, vol. 142(C), pages 154-163.
    3. Ganesh, Akhil Hannegudda & Xu, Bin, 2022. "A review of reinforcement learning based energy management systems for electrified powertrains: Progress, challenge, and potential solution," Renewable and Sustainable Energy Reviews, Elsevier, vol. 154(C).
    4. Lian, Renzong & Peng, Jiankun & Wu, Yuankai & Tan, Huachun & Zhang, Hailong, 2020. "Rule-interposing deep reinforcement learning based energy management strategy for power-split hybrid electric vehicle," Energy, Elsevier, vol. 197(C).
    5. Boukoberine, Mohamed Nadir & Zhou, Zhibin & Benbouzid, Mohamed, 2019. "A critical review on unmanned aerial vehicles power supply and energy management: Solutions, strategies, and prospects," Applied Energy, Elsevier, vol. 255(C).
    6. Song, Ke & Wang, Xiaodi & Li, Feiqiang & Sorrentino, Marco & Zheng, Bailin, 2020. "Pontryagin’s minimum principle-based real-time energy management strategy for fuel cell hybrid electric vehicle considering both fuel economy and power source durability," Energy, Elsevier, vol. 205(C).
    7. Fathy, Ahmed & Rezk, Hegazy & Nassef, Ahmed M., 2019. "Robust hydrogen-consumption-minimization strategy based salp swarm algorithm for energy management of fuel cell/supercapacitor/batteries in highly fluctuated load condition," Renewable Energy, Elsevier, vol. 139(C), pages 147-160.
    8. Wu, Jingda & He, Hongwen & Peng, Jiankun & Li, Yuecheng & Li, Zhanjiang, 2018. "Continuous reinforcement learning of energy management with deep Q network for a power split hybrid electric bus," Applied Energy, Elsevier, vol. 222(C), pages 799-811.
    9. Erdinc, O. & Uzunoglu, M., 2010. "Recent trends in PEM fuel cell-powered hybrid systems: Investigation of application areas, design architectures and energy management approaches," Renewable and Sustainable Energy Reviews, Elsevier, vol. 14(9), pages 2874-2884, December.
    10. Du, Guodong & Zou, Yuan & Zhang, Xudong & Guo, Lingxiong & Guo, Ningyuan, 2022. "Energy management for a hybrid electric vehicle based on prioritized deep reinforcement learning framework," Energy, Elsevier, vol. 241(C).
    11. Hua, Zhiguang & Zheng, Zhixue & Péra, Marie-Cécile & Gao, Fei, 2020. "Remaining useful life prediction of PEMFC systems based on the multi-input echo state network," Applied Energy, Elsevier, vol. 265(C).
    12. Juhui Gim & Minsu Kim & Changsun Ahn, 2022. "Energy Management Control Strategy for Saving Trip Costs of Fuel Cell/Battery Electric Vehicles," Energies, MDPI, vol. 15(6), pages 1-15, March.
    13. Sun, Wenjing & Zou, Yuan & Zhang, Xudong & Guo, Ningyuan & Zhang, Bin & Du, Guodong, 2022. "High robustness energy management strategy of hybrid electric vehicle based on improved soft actor-critic deep reinforcement learning," Energy, Elsevier, vol. 258(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
    2. Mudhafar Al-Saadi & Maher Al-Greer & Michael Short, 2023. "Reinforcement Learning-Based Intelligent Control Strategies for Optimal Power Management in Advanced Power Distribution Systems: A Survey," Energies, MDPI, vol. 16(4), pages 1-38, February.
    3. Wu, Jinglai & Zhang, Yunqing & Ruan, Jiageng & Liang, Zhaowen & Liu, Kai, 2023. "Rule and optimization combined real-time energy management strategy for minimizing cost of fuel cell hybrid electric vehicles," Energy, Elsevier, vol. 285(C).
    4. Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    5. Tang, Xiaolin & Zhou, Haitao & Wang, Feng & Wang, Weida & Lin, Xianke, 2022. "Longevity-conscious energy management strategy of fuel cell hybrid electric Vehicle Based on deep reinforcement learning," Energy, Elsevier, vol. 238(PA).
    6. He, Hongwen & Meng, Xiangfei & Wang, Yong & Khajepour, Amir & An, Xiaowen & Wang, Renguang & Sun, Fengchun, 2024. "Deep reinforcement learning based energy management strategies for electrified vehicles: Recent advances and perspectives," Renewable and Sustainable Energy Reviews, Elsevier, vol. 192(C).
    7. Matteo Acquarone & Claudio Maino & Daniela Misul & Ezio Spessa & Antonio Mastropietro & Luca Sorrentino & Enrico Busto, 2023. "Influence of the Reward Function on the Selection of Reinforcement Learning Agents for Hybrid Electric Vehicles Real-Time Control," Energies, MDPI, vol. 16(6), pages 1-22, March.
    8. Miranda, Matheus H.R. & Silva, Fabrício L. & Lourenço, Maria A.M. & Eckert, Jony J. & Silva, Ludmila C.A., 2022. "Vehicle drivetrain and fuzzy controller optimization using a planar dynamics simulation based on a real-world driving cycle," Energy, Elsevier, vol. 257(C).
    9. Pang, Kexin & Zhou, Jian & Tsianikas, Stamatis & Coit, David W. & Ma, Yizhong, 2024. "Long-term microgrid expansion planning with resilience and environmental benefits using deep reinforcement learning," Renewable and Sustainable Energy Reviews, Elsevier, vol. 191(C).
    10. Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Wang, Da & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).
    11. Hu, Dong & Huang, Chao & Yin, Guodong & Li, Yangmin & Huang, Yue & Huang, Hailong & Wu, Jingda & Li, Wenfei & Xie, Hui, 2024. "A transfer-based reinforcement learning collaborative energy management strategy for extended-range electric buses with cabin temperature comfort consideration," Energy, Elsevier, vol. 290(C).
    12. Macias, A. & Kandidayeni, M. & Boulon, L. & Trovão, J.P., 2021. "Fuel cell-supercapacitor topologies benchmark for a three-wheel electric vehicle powertrain," Energy, Elsevier, vol. 224(C).
    13. Zhengyu Yao & Hwan-Sik Yoon & Yang-Ki Hong, 2023. "Control of Hybrid Electric Vehicle Powertrain Using Offline-Online Hybrid Reinforcement Learning," Energies, MDPI, vol. 16(2), pages 1-18, January.
    14. Zhou, Jianhao & Xue, Yuan & Xu, Da & Li, Chaoxiong & Zhao, Wanzhong, 2022. "Self-learning energy management strategy for hybrid electric vehicle via curiosity-inspired asynchronous deep reinforcement learning," Energy, Elsevier, vol. 242(C).
    15. Wu, Jingda & Huang, Chao & He, Hongwen & Huang, Hailong, 2024. "Confidence-aware reinforcement learning for energy management of electrified vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 191(C).
    16. Bo, Lin & Han, Lijin & Xiang, Changle & Liu, Hui & Ma, Tian, 2022. "A Q-learning fuzzy inference system based online energy management strategy for off-road hybrid electric vehicles," Energy, Elsevier, vol. 252(C).
    17. Yao, Yongming & Wang, Jie & Zhou, Zhicong & Li, Hang & Liu, Huiying & Li, Tianyu, 2023. "Grey Markov prediction-based hierarchical model predictive control energy management for fuel cell/battery hybrid unmanned aerial vehicles," Energy, Elsevier, vol. 262(PA).
    18. Xiao, B. & Ruan, J. & Yang, W. & Walker, P.D. & Zhang, N., 2021. "A review of pivotal energy management strategies for extended range electric vehicles," Renewable and Sustainable Energy Reviews, Elsevier, vol. 149(C).
    19. Li, Cheng & Xu, Xiangyang & Zhu, Helong & Gan, Jiongpeng & Chen, Zhige & Tang, Xiaolin, 2024. "Research on car-following control and energy management strategy of hybrid electric vehicles in connected scene," Energy, Elsevier, vol. 293(C).
    20. Marouane Adnane & Ahmed Khoumsi & João Pedro F. Trovão, 2023. "Efficient Management of Energy Consumption of Electric Vehicles Using Machine Learning—A Systematic and Comprehensive Survey," Energies, MDPI, vol. 16(13), pages 1-39, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:energy:v:283:y:2023:i:c:s0360544223024866. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.