
Deep reinforcement learning-based strategy for charging station participating in demand response

Author

Listed:
  • Jin, Ruiyang
  • Zhou, Yuke
  • Lu, Chao
  • Song, Jie

Abstract

The trend toward zero-carbonization has accelerated the adoption of electric vehicles (EVs) owing to their low carbon emissions and high energy efficiency. However, the stochastic and high charging load of EVs poses a non-negligible challenge, as it may overload the grid. A promising approach is for charging stations to participate in demand response as load aggregators by coordinating the charging power of electric vehicles. Improper coordination of the charging load, however, may leave charging demand unfulfilled and cause dissatisfaction on the demand side. In this study, an incentive-based, time-varying demand response mechanism is considered when charging stations coordinate the charging of multiple EVs. A decentralized decision-making framework is applied to determine the charging power of each EV. The charging process is modeled as a Markov decision process, and a virtual price is designed to guide the charging-power decision. Deep reinforcement learning algorithms, such as deep deterministic policy gradient, are applied to determine the charging strategy of multiple, heterogeneous EVs. Numerical experiments validate the effectiveness of the proposed method. A comparison with an optimal charging strategy and a heuristic rule-based method shows that the proposed method trades off demand-response revenue against user satisfaction while also reducing the peak load of the charging station. Furthermore, a test with inaccurate departure information indicates the robustness of the proposed method.
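
The abstract outlines the method at a high level only, so the following is a minimal, self-contained sketch (in Python) of the kind of Markov decision process involved, not the authors' implementation: a single charging slot with a continuous charging-power action (the setting deep deterministic policy gradient is designed for), a time-varying demand-response incentive, and an end-of-episode penalty for unmet charging demand standing in for user satisfaction. The state variables, the incentive-price shape, the penalty weight, and the class name EVChargingEnv are all assumptions made for illustration.

import numpy as np

class EVChargingEnv:
    """Toy MDP for one EV charging slot over a fixed horizon of hourly steps.

    State  : (state of charge, fraction of time left until departure, DR incentive price)
    Action : charging power in [0, p_max], continuous
    Reward : incentive revenue for reducing load below p_max, minus a terminal
             penalty for unmet charging demand (a stand-in for user satisfaction).
    """

    def __init__(self, horizon=24, p_max=7.0, battery_kwh=40.0, seed=0):
        self.rng = np.random.default_rng(seed)
        self.horizon = horizon          # number of hourly decision steps
        self.p_max = p_max              # maximum charging power in kW
        self.battery_kwh = battery_kwh  # usable battery capacity in kWh
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = self.rng.uniform(0.2, 0.5)  # initial state of charge
        self.target_soc = 0.9                  # charging demand at departure
        self.price = self._dr_price(self.t)
        return self._obs()

    def _dr_price(self, t):
        # Assumed time-varying incentive: higher around an evening peak.
        return 0.10 + 0.15 * np.exp(-((t - 18) ** 2) / 8.0)

    def _obs(self):
        return np.array([self.soc, (self.horizon - self.t) / self.horizon, self.price],
                        dtype=np.float32)

    def step(self, power_kw):
        power_kw = float(np.clip(power_kw, 0.0, self.p_max))
        # Illustrative reward: curtailing load during high-incentive periods earns revenue.
        reward = self.price * (self.p_max - power_kw)
        self.soc = min(1.0, self.soc + power_kw / self.battery_kwh)  # 1 h per step assumed
        self.t += 1
        done = self.t >= self.horizon
        if done:
            # Penalise unmet charging demand at departure (user dissatisfaction proxy).
            reward -= 5.0 * max(0.0, self.target_soc - self.soc)
        self.price = self._dr_price(min(self.t, self.horizon - 1))
        return self._obs(), reward, done

if __name__ == "__main__":
    env = EVChargingEnv()
    obs, done, total = env.reset(), False, 0.0
    while not done:
        # Placeholder heuristic: charge faster when the battery is emptier.
        # In the paper's setting, a trained DDPG actor would map the observation
        # to the charging power instead.
        action = env.p_max * (1.0 - obs[0])
        obs, reward, done = env.step(action)
        total += float(reward)
    print(f"episode return: {total:.2f}")

In the paper's decentralized framework, many such per-EV decision processes would be coordinated through the designed virtual price, with a DDPG agent replacing the proportional heuristic above; the reward terms here merely illustrate the revenue-versus-satisfaction trade-off the abstract describes.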

Suggested Citation

  • Jin, Ruiyang & Zhou, Yuke & Lu, Chao & Song, Jie, 2022. "Deep reinforcement learning-based strategy for charging station participating in demand response," Applied Energy, Elsevier, vol. 328(C).
  • Handle: RePEc:eee:appene:v:328:y:2022:i:c:s0306261922013976
    DOI: 10.1016/j.apenergy.2022.120140

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261922013976
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2022.120140?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Suganya, S. & Raja, S. Charles & Venkatesh, P., 2017. "Simultaneous coordination of distinct plug-in Hybrid Electric Vehicle charging stations: A modified Particle Swarm Optimization approach," Energy, Elsevier, vol. 138(C), pages 92-102.
    2. Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
    3. Tuchnitz, Felix & Ebell, Niklas & Schlund, Jonas & Pruckner, Marco, 2021. "Development and Evaluation of a Smart Charging Strategy for an Electric Vehicle Fleet Based on Reinforcement Learning," Applied Energy, Elsevier, vol. 285(C).
    4. Škugor, Branimir & Deur, Joško, 2015. "Dynamic programming-based optimisation of charging an electric vehicle fleet system represented by an aggregate battery model," Energy, Elsevier, vol. 92(P3), pages 456-465.
    5. Elma, Onur, 2020. "A dynamic charging strategy with hybrid fast charging station for electric vehicles," Energy, Elsevier, vol. 202(C).
    6. Xu, Zhiwei & Hu, Zechun & Song, Yonghua & Zhao, Wei & Zhang, Yongwang, 2014. "Coordination of PEVs charging across multiple aggregators," Applied Energy, Elsevier, vol. 136(C), pages 582-589.
    7. Lijesen, Mark G., 2007. "The real-time price elasticity of electricity," Energy Economics, Elsevier, vol. 29(2), pages 249-258, March.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Truong, Van Binh & Le, Long Bao, 2024. "Electric vehicle charging design: The factored action based reinforcement learning approach," Applied Energy, Elsevier, vol. 359(C).
    2. Xie, Jiahan & Ajagekar, Akshay & You, Fengqi, 2023. "Multi-Agent attention-based deep reinforcement learning for demand response in grid-responsive buildings," Applied Energy, Elsevier, vol. 342(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pavlos S. Georgilakis, 2020. "Review of Computational Intelligence Methods for Local Energy Markets at the Power Distribution Level to Facilitate the Integration of Distributed Energy Resources: State-of-the-art and Future Researc," Energies, MDPI, vol. 13(1), pages 1-37, January.
    2. Paudel, Diwas & Das, Tapas K., 2023. "A deep reinforcement learning approach for power management of battery-assisted fast-charging EV hubs participating in day-ahead and real-time electricity markets," Energy, Elsevier, vol. 283(C).
    3. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    4. Fanrong Kong & Jianhui Jiang & Zhigang Ding & Junjie Hu & Weian Guo & Lei Wang, 2017. "A Personalized Rolling Optimal Charging Schedule for Plug-In Hybrid Electric Vehicle Based on Statistical Energy Demand Analysis and Heuristic Algorithm," Energies, MDPI, vol. 10(9), pages 1-18, September.
    5. Pegah Alaee & Julius Bems & Amjad Anvari-Moghaddam, 2023. "A Review of the Latest Trends in Technical and Economic Aspects of EV Charging Management," Energies, MDPI, vol. 16(9), pages 1-28, April.
    6. Wang, Yi & Qiu, Dawei & Strbac, Goran, 2022. "Multi-agent deep reinforcement learning for resilience-driven routing and scheduling of mobile energy storage systems," Applied Energy, Elsevier, vol. 310(C).
    7. Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
    8. Kim, Sunwoo & Choi, Yechan & Park, Joungho & Adams, Derrick & Heo, Seongmin & Lee, Jay H., 2024. "Multi-period, multi-timescale stochastic optimization model for simultaneous capacity investment and energy management decisions for hybrid Micro-Grids with green hydrogen production under uncertainty," Renewable and Sustainable Energy Reviews, Elsevier, vol. 190(PA).
    9. Guo, Yurun & Wang, Shugang & Wang, Jihong & Zhang, Tengfei & Ma, Zhenjun & Jiang, Shuang, 2024. "Key district heating technologies for building energy flexibility: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 189(PB).
    10. Eicke, Anselm & Ruhnau, Oliver & Hirth, Lion, 2021. "Electricity balancing as a market equilibrium," EconStor Preprints 233852, ZBW - Leibniz Information Centre for Economics.
    11. Shafqat Jawad & Junyong Liu, 2020. "Electrical Vehicle Charging Services Planning and Operation with Interdependent Power Networks and Transportation Networks: A Review of the Current Scenario and Future Trends," Energies, MDPI, vol. 13(13), pages 1-24, July.
    12. Sgouridis, Sgouris & Kennedy, Scott, 2010. "Tangible and fungible energy: Hybrid energy market and currency system for total energy management. A Masdar City case study," Energy Policy, Elsevier, vol. 38(4), pages 1749-1758, April.
    13. Gokhale, Gargya & Claessens, Bert & Develder, Chris, 2022. "Physics informed neural networks for control oriented thermal modeling of buildings," Applied Energy, Elsevier, vol. 314(C).
    14. Espinosa Acuña, Óscar A. & Vaca González, Paola A. & Avila Forero, Raúl A., 2013. "Elasticidades de demanda por electricidad e impactos macroeconómicos del precio de la energía eléctrica en Colombia || Elasticity of Electricity Demand and Macroeconomics Impacts of Electricity Price," Revista de Métodos Cuantitativos para la Economía y la Empresa = Journal of Quantitative Methods for Economics and Business Administration, Universidad Pablo de Olavide, Department of Quantitative Methods for Economics and Business Administration, vol. 16(1), pages 216-249, December.
    15. Sun, Hongchang & Niu, Yanlei & Li, Chengdong & Zhou, Changgeng & Zhai, Wenwen & Chen, Zhe & Wu, Hao & Niu, Lanqiang, 2022. "Energy consumption optimization of building air conditioning system via combining the parallel temporal convolutional neural network and adaptive opposition-learning chimp algorithm," Energy, Elsevier, vol. 259(C).
    16. Raja S, Charles & Kumar N M, Vijaya & J, Senthil kumar & Nesamalar J, Jeslin Drusila, 2021. "Enhancing system reliability by optimally integrating PHEV charging station and renewable distributed generators: A Bi-Level programming approach," Energy, Elsevier, vol. 229(C).
    17. Langer, Lissy & Volling, Thomas, 2022. "A reinforcement learning approach to home energy management for modulating heat pumps and photovoltaic systems," Applied Energy, Elsevier, vol. 327(C).
    18. Pineau, Pierre-Olivier & Rasata, Hasina & Zaccour, Georges, 2011. "Impact of some parameters on investments in oligopolistic electricity markets," European Journal of Operational Research, Elsevier, vol. 213(1), pages 180-195, August.
    19. Jianglong Li & Zhi Li, 2018. "Understanding the role of economic transition in enlarging energy price elasticity," The Economics of Transition, The European Bank for Reconstruction and Development, vol. 26(2), pages 253-281, April.
    20. Mulder, Machiel & Zeng, Yuyu, 2018. "Exploring interaction effects of climate policies: A model analysis of the power market," Resource and Energy Economics, Elsevier, vol. 54(C), pages 165-185.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:328:y:2022:i:c:s0306261922013976. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.