Printed from https://ideas.repec.org/a/gam/jeners/v17y2024i21p5307-d1506486.html

Reinforcement Learning Model-Based and Model-Free Paradigms for Optimal Control Problems in Power Systems: Comprehensive Review and Future Directions

Authors

Listed:
  • Elinor Ginzburg-Ganz

    (The Andrew and Erna Viterbi Faculty of Electrical and Computer Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel)

  • Itay Segev

    (The Andrew and Erna Viterbi Faculty of Electrical and Computer Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel)

  • Alexander Balabanov

    (The Andrew and Erna Viterbi Faculty of Electrical and Computer Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel)

  • Elior Segev

    (The Andrew and Erna Viterbi Faculty of Electrical and Computer Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel)

  • Sivan Kaully Naveh

    (The Andrew and Erna Viterbi Faculty of Electrical and Computer Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel)

  • Ram Machlev

    (The Andrew and Erna Viterbi Faculty of Electrical and Computer Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel)

  • Juri Belikov

    (Department of Software Science, Tallinn University of Technology, Akadeemia tee 15a, 12618 Tallinn, Estonia)

  • Liran Katzir

    (Advanced Energy Industries, Caesarea 38900, Israel)

  • Sarah Keren

    (Faculty of Computer Science, Technion—Israel Institute of Technology, Haifa 3200003, Israel)

  • Yoash Levron

    (The Andrew and Erna Viterbi Faculty of Electrical and Computer Engineering, Technion—Israel Institute of Technology, Haifa 3200003, Israel)

Abstract

This paper reviews recent work on applications of reinforcement learning to optimal control problems in power systems. Based on an extensive analysis of the recent literature, we attempt to better understand the gap between reinforcement learning methods that rely on complete or incomplete knowledge of the model dynamics and purely data-driven reinforcement learning approaches. More specifically, we ask how such models change with the application or the algorithm, what the currently open theoretical and numerical challenges are in each of the leading applications, and which reinforcement-learning-based control strategies are likely to gain prominence in the coming years. The reviewed works are divided into “model-based” and “model-free” methods in order to highlight the current developments and trends within each group. The optimal control problems reviewed cover energy markets, grid stability and control, energy management in buildings, electric vehicles, and energy storage.
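To make the review's central distinction concrete: a minimal, illustrative sketch (not code from the paper) contrasting a model-based method, value iteration, which sweeps known transition dynamics directly, with a model-free method, tabular Q-learning, which learns only from sampled transitions. The toy two-state MDP below is a hypothetical example chosen for brevity.

```python
import random

# Toy 2-state, 2-action MDP (illustrative only, not from the reviewed paper).
# P[s][a] = list of (probability, next_state, reward) transitions.
P = {
    0: {0: [(1.0, 0, 0.0)], 1: [(1.0, 1, 1.0)]},
    1: {0: [(1.0, 0, 0.0)], 1: [(1.0, 1, 1.0)]},
}
GAMMA = 0.9  # discount factor

def value_iteration(P, gamma, iters=200):
    """Model-based: uses the known dynamics P to sweep Bellman backups."""
    V = {s: 0.0 for s in P}
    for _ in range(iters):
        V = {s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in P[s][a])
                    for a in P[s])
             for s in P}
    return V

def q_learning(P, gamma, steps=5000, alpha=0.1, eps=0.1):
    """Model-free: updates Q from one sampled transition at a time,
    never reading P's probabilities as a model (P acts only as the simulator)."""
    Q = {s: {a: 0.0 for a in P[s]} for s in P}
    s = 0
    for _ in range(steps):
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.choice(list(Q[s]))
        else:
            a = max(Q[s], key=Q[s].get)
        # environment step: sample a transition (deterministic here)
        _, s2, r = P[s][a][0]
        # temporal-difference update
        Q[s][a] += alpha * (r + gamma * max(Q[s2].values()) - Q[s][a])
        s = s2
    return Q
```

With these deterministic dynamics the optimal value of either state is 1/(1 − 0.9) = 10; value iteration reaches it exactly from the model, while Q-learning approaches it from experience alone, which is the trade-off (model knowledge versus data) the review organizes the literature around.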

Suggested Citation

  • Elinor Ginzburg-Ganz & Itay Segev & Alexander Balabanov & Elior Segev & Sivan Kaully Naveh & Ram Machlev & Juri Belikov & Liran Katzir & Sarah Keren & Yoash Levron, 2024. "Reinforcement Learning Model-Based and Model-Free Paradigms for Optimal Control Problems in Power Systems: Comprehensive Review and Future Directions," Energies, MDPI, vol. 17(21), pages 1-54, October.
  • Handle: RePEc:gam:jeners:v:17:y:2024:i:21:p:5307-:d:1506486
    Download full text from publisher

    File URL: https://www.mdpi.com/1996-1073/17/21/5307/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1996-1073/17/21/5307/
    Download Restriction: no

    References listed on IDEAS

    1. Comello, Stephen & Reichelstein, Stefan J. & Sahoo, Anshuman, 2018. "The Road ahead for Solar PV Power," Research Papers 3620, Stanford University, Graduate School of Business.
    2. Eitan Altman, 1998. "Constrained Markov decision processes with total cost criteria: Lagrangian approach and dual linear program," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 48(3), pages 387-417, December.
    3. Bayón, L. & Grau, J.M. & Ruiz, M.M. & Suárez, P.M., 2016. "A comparative economic study of two configurations of hydro-wind power plants," Energy, Elsevier, vol. 112(C), pages 8-16.
    4. Harrold, Daniel J.B. & Cao, Jun & Fan, Zhong, 2022. "Renewable energy integration and microgrid energy trading using multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 318(C).
    5. Drgoňa, Ján & Picard, Damien & Kvasnica, Michal & Helsen, Lieve, 2018. "Approximate model predictive building control via machine learning," Applied Energy, Elsevier, vol. 218(C), pages 199-216.
    6. Cao, Di & Zhao, Junbo & Hu, Weihao & Ding, Fei & Yu, Nanpeng & Huang, Qi & Chen, Zhe, 2022. "Model-free voltage control of active distribution system with PVs using surrogate model-based deep reinforcement learning," Applied Energy, Elsevier, vol. 306(PA).
    7. Yaqoot, Mohammed & Diwan, Parag & Kandpal, Tara C., 2016. "Review of barriers to the dissemination of decentralized renewable energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 58(C), pages 477-490.
    8. Dorokhova, Marina & Martinson, Yann & Ballif, Christophe & Wyrsch, Nicolas, 2021. "Deep reinforcement learning control of electric vehicle charging in the presence of photovoltaic generation," Applied Energy, Elsevier, vol. 301(C).
    9. Heldeweg, Michiel A. & Saintier, Séverine, 2020. "Renewable energy communities as ‘socio-legal institutions’: A normative frame for energy decentralization?," Renewable and Sustainable Energy Reviews, Elsevier, vol. 119(C).
    10. Arroyo, Javier & Manna, Carlo & Spiessens, Fred & Helsen, Lieve, 2022. "Reinforced model predictive control (RL-MPC) for building energy management," Applied Energy, Elsevier, vol. 309(C).
    11. Mudhafar Al-Saadi & Maher Al-Greer & Michael Short, 2023. "Reinforcement Learning-Based Intelligent Control Strategies for Optimal Power Management in Advanced Power Distribution Systems: A Survey," Energies, MDPI, vol. 16(4), pages 1-38, February.
    12. Zehui Kong & Yuan Zou & Teng Liu, 2017. "Implementation of real-time energy management strategy based on reinforcement learning for hybrid electric vehicles and simulation validation," PLOS ONE, Public Library of Science, vol. 12(7), pages 1-16, July.
    13. Teng Liu & Yuan Zou & Dexing Liu & Fengchun Sun, 2015. "Reinforcement Learning–Based Energy Management Strategy for a Hybrid Electric Tracked Vehicle," Energies, MDPI, vol. 8(7), pages 1-18, July.
    14. Li, Xiangyu & Luo, Fengji & Li, Chaojie, 2024. "Multi-agent deep reinforcement learning-based autonomous decision-making framework for community virtual power plants," Applied Energy, Elsevier, vol. 360(C).
    15. Cui, Li & Wang, Qingyuan & Qu, Hongquan & Wang, Mingshen & Wu, Yile & Ge, Le, 2023. "Dynamic pricing for fast charging stations with deep reinforcement learning," Applied Energy, Elsevier, vol. 346(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhao, Jing & Yang, Zilan & Shi, Linyu & Liu, Dehan & Li, Haonan & Mi, Yumiao & Wang, Hongbin & Feng, Meili & Hutagaol, Timothy Joseph, 2024. "Photovoltaic capacity dynamic tracking model predictive control strategy of air-conditioning systems with consideration of flexible loads," Applied Energy, Elsevier, vol. 356(C).
    2. Du, Guodong & Zou, Yuan & Zhang, Xudong & Kong, Zehui & Wu, Jinlong & He, Dingbo, 2019. "Intelligent energy management for hybrid electric tracked vehicles using online reinforcement learning," Applied Energy, Elsevier, vol. 251(C), pages 1-1.
    3. Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
    4. Mingliang Bai & Wenjiang Yang & Dongbin Song & Marek Kosuda & Stanislav Szabo & Pavol Lipovsky & Afshar Kasaei, 2020. "Research on Energy Management of Hybrid Unmanned Aerial Vehicles to Improve Energy-Saving and Emission Reduction Performance," IJERPH, MDPI, vol. 17(8), pages 1-24, April.
    5. Vaidyanathan, Geeta & Sankaranarayanan, Ramani & Yap, Nonita T., 2019. "Bridging the chasm – Diffusion of energy innovations in poor infrastructure starved communities," Renewable and Sustainable Energy Reviews, Elsevier, vol. 99(C), pages 243-255.
    6. Ahmed, R. & Sreeram, V. & Mishra, Y. & Arif, M.D., 2020. "A review and evaluation of the state-of-the-art in PV solar power forecasting: Techniques and optimization," Renewable and Sustainable Energy Reviews, Elsevier, vol. 124(C).
    7. Yang, Shiyu & Wan, Man Pun & Ng, Bing Feng & Dubey, Swapnil & Henze, Gregor P. & Chen, Wanyu & Baskaran, Krishnamoorthy, 2020. "Experimental study of model predictive control for an air-conditioning system with dedicated outdoor air system," Applied Energy, Elsevier, vol. 257(C).
    8. Zhu, Xingxu & Hou, Xiangchen & Li, Junhui & Yan, Gangui & Li, Cuiping & Wang, Dongbo, 2023. "Distributed online prediction optimization algorithm for distributed energy resources considering the multi-periods optimal operation," Applied Energy, Elsevier, vol. 348(C).
    9. Andrzej Ożadowicz & Gabriela Walczyk, 2023. "Energy Performance and Control Strategy for Dynamic Façade with Perovskite PV Panels—Technical Analysis and Case Study," Energies, MDPI, vol. 16(9), pages 1-23, April.
    10. Baruah, Debendra Chandra & Enweremadu, Christopher Chintua, 2019. "Prospects of decentralized renewable energy to improve energy access: A resource-inventory-based analysis of South Africa," Renewable and Sustainable Energy Reviews, Elsevier, vol. 103(C), pages 328-341.
    11. Yin, Linfei & He, Xiaoyu, 2023. "Artificial emotional deep Q learning for real-time smart voltage control of cyber-physical social power systems," Energy, Elsevier, vol. 273(C).
    12. Fachrizal, Reza & Shepero, Mahmoud & Åberg, Magnus & Munkhammar, Joakim, 2022. "Optimal PV-EV sizing at solar powered workplace charging stations with smart charging schemes considering self-consumption and self-sufficiency balance," Applied Energy, Elsevier, vol. 307(C).
    13. Feng, Zhanyu & Zhang, Jian & Jiang, Han & Yao, Xuejian & Qian, Yu & Zhang, Haiyan, 2024. "Energy consumption prediction strategy for electric vehicle based on LSTM-transformer framework," Energy, Elsevier, vol. 302(C).
    14. Murshed, Muntasir, 2019. "Trade Liberalization Policies and Renewable Energy Transition in Low and Middle-Income Countries? An Instrumental Variable Approach," MPRA Paper 97075, University Library of Munich, Germany.
    15. Hamed, Mohammad M. & Mohammed, Ali & Olabi, Abdul Ghani, 2023. "Renewable energy adoption decisions in Jordan's industrial sector: Statistical analysis with unobserved heterogeneity," Renewable and Sustainable Energy Reviews, Elsevier, vol. 184(C).
    16. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    17. Fouladvand, Javanshir & Aranguren Rojas, Maria & Hoppe, Thomas & Ghorbani, Amineh, 2022. "Simulating thermal energy community formation: Institutional enablers outplaying technological choice," Applied Energy, Elsevier, vol. 306(PA).
    18. Zhao, Yincheng & Zhang, Guozhou & Hu, Weihao & Huang, Qi & Chen, Zhe & Blaabjerg, Frede, 2023. "Meta-learning based voltage control strategy for emergency faults of active distribution networks," Applied Energy, Elsevier, vol. 349(C).
    19. Joshi, Lalita & Choudhary, Deepak & Kumar, Praveen & Venkateswaran, Jayendran & Solanki, Chetan S., 2019. "Does involvement of local community ensure sustained energy access? A critical review of a solar PV technology intervention in rural India," World Development, Elsevier, vol. 122(C), pages 272-281.
    20. Talaat, M. & Farahat, M.A. & Elkholy, M.H., 2019. "Renewable power integration: Experimental and simulation study to investigate the ability of integrating wave, solar and wind energies," Energy, Elsevier, vol. 170(C), pages 668-682.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jeners:v:17:y:2024:i:21:p:5307-:d:1506486. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.