
Deep reinforcement learning control of electric vehicle charging in the presence of photovoltaic generation

Author

Listed:
  • Dorokhova, Marina
  • Martinson, Yann
  • Ballif, Christophe
  • Wyrsch, Nicolas

Abstract

In recent years, the importance of electric mobility has increased in response to climate change. The fast-growing deployment of electric vehicles (EVs) worldwide is expected to decrease transportation-related CO2 emissions, facilitate the integration of renewables, and support the grid through demand–response services. Simultaneously, inadequate EV charging patterns can lead to undesirable effects in grid operation, such as high peak loads or low self-consumption of solar electricity, thus calling for novel methods of control. This work focuses on applying deep reinforcement learning (RL) to the EV charging control problem, with the objectives of increasing photovoltaic self-consumption and the EV state of charge at departure. In particular, we propose mathematical formulations of environments with discrete, continuous, and parametrized action spaces, together with the respective deep RL algorithms to resolve them. Benchmarking the deep RL control against naive, rule-based, deterministic-optimization, and model-predictive control demonstrates that the suggested methodology can produce consistent and employable EV charging strategies, while its performance holds great promise for real-time implementations.
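The control problem described in the abstract can be illustrated with a minimal sketch of an EV charging environment with a discrete action space. This is not the authors' formulation; all names, constants, and the reward shape below are illustrative assumptions. The agent picks a charging-power level each timestep, and the reward trades off photovoltaic self-consumption against grid import, with a terminal bonus for the state of charge at departure:

```python
# Minimal sketch of an EV charging RL environment (illustrative only):
# discrete action space, reward favouring PV self-consumption, and a
# terminal bonus for the state of charge (SoC) at departure.

class EVChargingEnv:
    def __init__(self, pv_forecast, horizon=24, capacity_kwh=40.0,
                 max_power_kw=11.0, soc_weight=10.0):
        self.pv = pv_forecast            # available PV power per step (kW)
        self.horizon = horizon           # steps until the EV departs
        self.capacity = capacity_kwh    # battery capacity (kWh)
        self.max_power = max_power_kw   # charger power limit (kW)
        self.soc_weight = soc_weight    # weight of the final-SoC bonus
        self.actions = [0.0, 0.5, 1.0]  # fraction of max charging power
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = 0.2                  # initial state of charge (fraction)
        return self._state()

    def _state(self):
        pv_now = self.pv[min(self.t, len(self.pv) - 1)]
        return (self.t, self.soc, pv_now)

    def step(self, action_idx):
        power = self.actions[action_idx] * self.max_power     # kW drawn
        pv_now = self.pv[min(self.t, len(self.pv) - 1)]
        self_consumed = min(power, pv_now)                    # covered by PV
        grid_import = power - self_consumed                   # drawn from grid
        self.soc = min(1.0, self.soc + power / self.capacity)  # 1 h per step
        reward = self_consumed - grid_import                  # favour PV charging
        self.t += 1
        done = self.t >= self.horizon
        if done:
            reward += self.soc_weight * self.soc              # departure bonus
        return self._state(), reward, done
```

A deep RL agent would map the state tuple to one of the three actions; the parametrized-action variants mentioned in the abstract would instead combine a discrete mode choice with a continuous power setpoint.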

Suggested Citation

  • Dorokhova, Marina & Martinson, Yann & Ballif, Christophe & Wyrsch, Nicolas, 2021. "Deep reinforcement learning control of electric vehicle charging in the presence of photovoltaic generation," Applied Energy, Elsevier, vol. 301(C).
  • Handle: RePEc:eee:appene:v:301:y:2021:i:c:s0306261921008874
    DOI: 10.1016/j.apenergy.2021.117504

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261921008874
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2021.117504?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Luthander, Rasmus & Widén, Joakim & Nilsson, Daniel & Palm, Jenny, 2015. "Photovoltaic self-consumption in buildings: A review," Applied Energy, Elsevier, vol. 142(C), pages 80-94.
    2. Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
    3. Jaehyun Lee & Eunjung Lee & Jinho Kim, 2020. "Electric Vehicle Charging and Discharging Algorithm Based on Reinforcement Learning with Data-Driven Approach in Dynamic Pricing Scheme," Energies, MDPI, vol. 13(8), pages 1-18, April.
    4. Kathirgamanathan, Anjukan & De Rosa, Mattia & Mangina, Eleni & Finn, Donal P., 2021. "Data-driven predictive control for unlocking building energy flexibility: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 135(C).
    5. Sunyong Kim & Hyuk Lim, 2018. "Reinforcement Learning Based Energy Management Algorithm for Smart Energy Buildings," Energies, MDPI, vol. 11(8), pages 1-19, August.
    6. Xiaohan Fang & Jinkuan Wang & Guanru Song & Yinghua Han & Qiang Zhao & Zhiao Cao, 2019. "Multi-Agent Reinforcement Learning Approach for Residential Microgrid Energy Scheduling," Energies, MDPI, vol. 13(1), pages 1-26, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    2. Guo, Yurun & Wang, Shugang & Wang, Jihong & Zhang, Tengfei & Ma, Zhenjun & Jiang, Shuang, 2024. "Key district heating technologies for building energy flexibility: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 189(PB).
    3. Gokhale, Gargya & Claessens, Bert & Develder, Chris, 2022. "Physics informed neural networks for control oriented thermal modeling of buildings," Applied Energy, Elsevier, vol. 314(C).
    4. Vašak, Mario & Banjac, Anita & Hure, Nikola & Novak, Hrvoje & Kovačević, Marko, 2023. "Predictive control based assessment of building demand flexibility in fixed time windows," Applied Energy, Elsevier, vol. 329(C).
    5. Hernandez-Matheus, Alejandro & Löschenbrand, Markus & Berg, Kjersti & Fuchs, Ida & Aragüés-Peñalba, Mònica & Bullich-Massagué, Eduard & Sumper, Andreas, 2022. "A systematic review of machine learning techniques related to local energy communities," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
    6. Langer, Lissy & Volling, Thomas, 2020. "An optimal home energy management system for modulating heat pumps and photovoltaic systems," Applied Energy, Elsevier, vol. 278(C).
    7. Lilia Tightiz & Joon Yoo, 2022. "A Review on a Data-Driven Microgrid Management System Integrating an Active Distribution Network: Challenges, Issues, and New Trends," Energies, MDPI, vol. 15(22), pages 1-24, November.
    8. Svetozarevic, B. & Baumann, C. & Muntwiler, S. & Di Natale, L. & Zeilinger, M.N. & Heer, P., 2022. "Data-driven control of room temperature and bidirectional EV charging using deep reinforcement learning: Simulations and experiments," Applied Energy, Elsevier, vol. 307(C).
    9. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    10. Van-Hai Bui & Akhtar Hussain & Hak-Man Kim, 2019. "Q-Learning-Based Operation Strategy for Community Battery Energy Storage System (CBESS) in Microgrid System," Energies, MDPI, vol. 12(9), pages 1-17, May.
    11. Kim, Sunwoo & Choi, Yechan & Park, Joungho & Adams, Derrick & Heo, Seongmin & Lee, Jay H., 2024. "Multi-period, multi-timescale stochastic optimization model for simultaneous capacity investment and energy management decisions for hybrid Micro-Grids with green hydrogen production under uncertainty," Renewable and Sustainable Energy Reviews, Elsevier, vol. 190(PA).
    12. Jin, Ruiyang & Zhou, Yuke & Lu, Chao & Song, Jie, 2022. "Deep reinforcement learning-based strategy for charging station participating in demand response," Applied Energy, Elsevier, vol. 328(C).
    13. Federica Cucchiella & Idiano D’Adamo & Paolo Rosa, 2015. "Industrial Photovoltaic Systems: An Economic Analysis in Non-Subsidized Electricity Markets," Energies, MDPI, vol. 8(11), pages 1-16, November.
    14. Byungsung Lee & Haesung Lee & Hyun Ahn, 2020. "Improving Load Forecasting of Electric Vehicle Charging Stations Through Missing Data Imputation," Energies, MDPI, vol. 13(18), pages 1-15, September.
    15. Harri Aaltonen & Seppo Sierla & Rakshith Subramanya & Valeriy Vyatkin, 2021. "A Simulation Environment for Training a Reinforcement Learning Agent Trading a Battery Storage," Energies, MDPI, vol. 14(17), pages 1-20, September.
    16. Imen Azzouz & Wiem Fekih Hassen, 2023. "Optimization of Electric Vehicles Charging Scheduling Based on Deep Reinforcement Learning: A Decentralized Approach," Energies, MDPI, vol. 16(24), pages 1-18, December.
    17. Martín Pensado-Mariño & Lara Febrero-Garrido & Pablo Eguía-Oller & Enrique Granada-Álvarez, 2021. "Feasibility of Different Weather Data Sources Applied to Building Indoor Temperature Estimation Using LSTM Neural Networks," Sustainability, MDPI, vol. 13(24), pages 1-15, December.
    18. Shafqat Jawad & Junyong Liu, 2020. "Electrical Vehicle Charging Services Planning and Operation with Interdependent Power Networks and Transportation Networks: A Review of the Current Scenario and Future Trends," Energies, MDPI, vol. 13(13), pages 1-24, July.
    19. Alqahtani, Mohammed & Hu, Mengqi, 2022. "Dynamic energy scheduling and routing of multiple electric vehicles using deep reinforcement learning," Energy, Elsevier, vol. 244(PA).
    20. Reza Fachrizal & Joakim Munkhammar, 2020. "Improved Photovoltaic Self-Consumption in Residential Buildings with Distributed and Centralized Smart Charging of Electric Vehicles," Energies, MDPI, vol. 13(5), pages 1-19, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:301:y:2021:i:c:s0306261921008874. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.