
Intelligent hydrogen-ammonia combined energy storage system with deep reinforcement learning

Author

Listed:
  • Lan, Penghang
  • Chen, She
  • Li, Qihang
  • Li, Kelin
  • Wang, Feng
  • Zhao, Yaoxun

Abstract

To achieve carbon neutrality, hydrogen and ammonia are considered promising energy carriers for renewable energy, and efficient use of these resources has become a critical research focus. Here we propose an intelligent hydrogen-ammonia combined energy storage system. To maximize net present value (NPV), deep reinforcement learning (DRL) is employed for the energy management strategy, dynamically adjusting the priority between hydrogen and ammonia. The results indicate that the DRL pathway achieves the highest NPV of 1.38 M$, which is 194 % of the benchmark pathway. The DRL pathway also uses energy resources more efficiently: its share of grid dependency is lower than that of the benchmark pathway, falling below 0.8 % in November. Compared with conventional production routes, the DRL pathway achieves a zero carbon footprint, equivalent to reducing CO2 emissions by 4819 tons, 17,715 tons and 94,944 tons for ammonia, hydrogen and electricity production, respectively. Under a carbon tax policy, this pathway could save up to 5.87 M$ annually.
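To give a sense of how a reinforcement-learning agent might arbitrate between the two carriers, the sketch below illustrates the decision loop in miniature. It is not the authors' implementation: the paper employs deep reinforcement learning, whereas this toy substitutes tabular Q-learning over a hypothetical discretized state (renewable surplus level, storage level), and all prices, efficiencies, and transition rules are invented for illustration. The NPV objective itself follows the standard definition NPV = Σ_t CF_t / (1 + r)^t, with CF_t the cash flow in year t and r the discount rate.

```python
import numpy as np

# Illustrative sketch only. The paper uses deep RL; here a tabular
# Q-learning agent stands in for the deep network. Every number below
# (values, efficiencies, bin counts, transitions) is hypothetical.

rng = np.random.default_rng(0)

N_SURPLUS_BINS = 5          # discretized renewable surplus level
N_STORAGE_BINS = 5          # discretized combined storage level
ACTIONS = (0, 1)            # 0 = prioritize hydrogen, 1 = prioritize ammonia

Q = np.zeros((N_SURPLUS_BINS, N_STORAGE_BINS, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.1   # learning rate, discount, exploration

def step(surplus, storage, action):
    """Toy environment: reward approximates the revenue contribution of
    routing surplus energy to the chosen carrier (assumed numbers)."""
    h2_value, nh3_value = 1.0, 1.2      # assumed relative carrier values
    h2_eff, nh3_eff = 0.70, 0.55        # assumed round-trip efficiencies
    reward = surplus * (h2_value * h2_eff if action == 0 else nh3_value * nh3_eff)
    next_surplus = int(rng.integers(N_SURPLUS_BINS))  # random renewable output
    next_storage = min(storage + (1 if surplus > 2 else 0), N_STORAGE_BINS - 1)
    return reward, next_surplus, next_storage

surplus, storage = int(rng.integers(N_SURPLUS_BINS)), 0
for _ in range(50_000):
    # epsilon-greedy choice between the two storage priorities
    if rng.random() < eps:
        action = int(rng.integers(len(ACTIONS)))
    else:
        action = int(np.argmax(Q[surplus, storage]))
    reward, next_surplus, next_storage = step(surplus, storage, action)
    # one-step Q-learning update toward the bootstrapped target
    target = reward + gamma * Q[next_surplus, next_storage].max()
    Q[surplus, storage, action] += alpha * (target - Q[surplus, storage, action])
    surplus, storage = next_surplus, next_storage

print("Learned priority (0=H2, 1=NH3) per (surplus, storage) state:")
print(np.argmax(Q, axis=2))
```

A deep variant in the spirit of the paper would replace the Q table with a neural network and this toy environment with a full techno-economic simulation of the combined storage system.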

Suggested Citation

  • Lan, Penghang & Chen, She & Li, Qihang & Li, Kelin & Wang, Feng & Zhao, Yaoxun, 2024. "Intelligent hydrogen-ammonia combined energy storage system with deep reinforcement learning," Renewable Energy, Elsevier, vol. 237(PB).
  • Handle: RePEc:eee:renene:v:237:y:2024:i:pb:s0960148124017932
    DOI: 10.1016/j.renene.2024.121725

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0960148124017932
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.renene.2024.121725?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wang, Yi & Qiu, Dawei & Sun, Mingyang & Strbac, Goran & Gao, Zhiwei, 2023. "Secure energy management of multi-energy microgrid: A physical-informed safe reinforcement learning approach," Applied Energy, Elsevier, vol. 335(C).
    2. Tawalbeh, Muhammad & Murtaza, Sana Z.M. & Al-Othman, Amani & Alami, Abdul Hai & Singh, Karnail & Olabi, Abdul Ghani, 2022. "Ammonia: A versatile candidate for the use in energy storage systems," Renewable Energy, Elsevier, vol. 194(C), pages 955-977.
    3. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    4. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    5. Lu, Yu & Xiang, Yue & Huang, Yuan & Yu, Bin & Weng, Liguo & Liu, Junyong, 2023. "Deep reinforcement learning based optimal scheduling of active distribution system considering distributed generation, energy storage and flexible load," Energy, Elsevier, vol. 271(C).
    6. Junior Diamant Ngando Ebba & Mamadou Baïlo Camara & Mamadou Lamine Doumbia & Brayima Dakyo & Joseph Song-Manguelle, 2023. "Large-Scale Hydrogen Production Systems Using Marine Renewable Energies: State-of-the-Art," Energies, MDPI, vol. 17(1), pages 1-23, December.
    7. Li, Yanxue & Wang, Zixuan & Xu, Wenya & Gao, Weijun & Xu, Yang & Xiao, Fu, 2023. "Modeling and energy dynamic control for a ZEH via hybrid model-based deep reinforcement learning," Energy, Elsevier, vol. 277(C).
    8. Harrold, Daniel J.B. & Cao, Jun & Fan, Zhong, 2022. "Data-driven battery operation for energy arbitrage using rainbow deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
    9. Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).
    10. Pinciroli, Luca & Baraldi, Piero & Compare, Michele & Zio, Enrico, 2023. "Optimal operation and maintenance of energy storage systems in grid-connected microgrids by deep reinforcement learning," Applied Energy, Elsevier, vol. 352(C).
    11. Guo, Yuxiang & Qu, Shengli & Wang, Chuang & Xing, Ziwen & Duan, Kaiwen, 2024. "Optimal dynamic thermal management for data center via soft actor-critic algorithm with dynamic control interval and combined-value state space," Applied Energy, Elsevier, vol. 373(C).
    12. Bio Gassi, Karim & Baysal, Mustafa, 2023. "Improving real-time energy decision-making model with an actor-critic agent in modern microgrids with energy storage devices," Energy, Elsevier, vol. 263(PE).
    13. Zhu, Dafeng & Yang, Bo & Liu, Yuxiang & Wang, Zhaojian & Ma, Kai & Guan, Xinping, 2022. "Energy management based on multi-agent deep reinforcement learning for a multi-energy industrial park," Applied Energy, Elsevier, vol. 311(C).
    14. He, Yi & Guo, Su & Zhou, Jianxu & Ye, Jilei & Huang, Jing & Zheng, Kun & Du, Xinru, 2022. "Multi-objective planning-operation co-optimization of renewable energy system with hybrid energy storages," Renewable Energy, Elsevier, vol. 184(C), pages 776-790.
    15. Bhandari, Ramchandra, 2022. "Green hydrogen production potential in West Africa – Case of Niger," Renewable Energy, Elsevier, vol. 196(C), pages 800-811.
    16. Cephas Samende & Zhong Fan & Jun Cao & Renzo Fabián & Gregory N. Baltas & Pedro Rodriguez, 2023. "Battery and Hydrogen Energy Storage Control in a Smart Energy Network with Flexible Energy Demand Using Deep Reinforcement Learning," Energies, MDPI, vol. 16(19), pages 1-20, September.
    17. Yin, Linfei & Li, Yu, 2022. "Hybrid multi-agent emotional deep Q network for generation control of multi-area integrated energy systems," Applied Energy, Elsevier, vol. 324(C).
    18. Barra, P.H.A. & de Carvalho, W.C. & Menezes, T.S. & Fernandes, R.A.S. & Coury, D.V., 2021. "A review on wind power smoothing using high-power energy storage systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    19. Biemann, Marco & Scheller, Fabian & Liu, Xiufeng & Huang, Lizhen, 2021. "Experimental evaluation of model-free reinforcement learning algorithms for continuous HVAC control," Applied Energy, Elsevier, vol. 298(C).
    20. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:renene:v:237:y:2024:i:pb:s0960148124017932. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.journals.elsevier.com/renewable-energy.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.