Reinforcement learning for electric vehicle charging scheduling: A systematic review
DOI: 10.1016/j.tre.2024.103698
References listed on IDEAS
- Ivan Kristianto Singgih & Byung-In Kim, 2020. "Multi-type electric vehicle relocation problem considering required battery-charging time," European Journal of Industrial Engineering, Inderscience Enterprises Ltd, vol. 14(3), pages 335-368.
- Zhang, Shulei & Jia, Runda & Pan, Hengxin & Cao, Yankai, 2023. "A safe reinforcement learning-based charging strategy for electric vehicles in residential microgrid," Applied Energy, Elsevier, vol. 348(C).
- Li, Chengzhe & Zhang, Libo & Ou, Zihan & Wang, Qunwei & Zhou, Dequn & Ma, Jiayu, 2022. "Robust model of electric vehicle charging station location considering renewable energy and storage equipment," Energy, Elsevier, vol. 238(PA).
- Basso, Rafael & Kulcsár, Balázs & Sanchez-Diaz, Ivan & Qu, Xiaobo, 2022. "Dynamic stochastic electric vehicle routing with safe reinforcement learning," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 157(C).
- Jaehyun Lee & Eunjung Lee & Jinho Kim, 2020. "Electric Vehicle Charging and Discharging Algorithm Based on Reinforcement Learning with Data-Driven Approach in Dynamic Pricing Scheme," Energies, MDPI, vol. 13(8), pages 1-18, April.
- Wang, Ning & Tang, Linhao & Zhang, Wenjian & Guo, Jiahui, 2019. "How to face the challenges caused by the abolishment of subsidies for electric vehicles in China?," Energy, Elsevier, vol. 166(C), pages 359-372.
- Metais, M.O. & Jouini, O. & Perez, Y. & Berrada, J. & Suomalainen, E., 2022. "Too much or not enough? Planning electric vehicle charging infrastructure: A review of modeling options," Renewable and Sustainable Energy Reviews, Elsevier, vol. 153(C).
- Paudel, Diwas & Das, Tapas K., 2023. "A deep reinforcement learning approach for power management of battery-assisted fast-charging EV hubs participating in day-ahead and real-time electricity markets," Energy, Elsevier, vol. 283(C).
- Dorokhova, Marina & Martinson, Yann & Ballif, Christophe & Wyrsch, Nicolas, 2021. "Deep reinforcement learning control of electric vehicle charging in the presence of photovoltaic generation," Applied Energy, Elsevier, vol. 301(C).
- Tuchnitz, Felix & Ebell, Niklas & Schlund, Jonas & Pruckner, Marco, 2021. "Development and Evaluation of a Smart Charging Strategy for an Electric Vehicle Fleet Based on Reinforcement Learning," Applied Energy, Elsevier, vol. 285(C).
- Sperling, Dan & Collantes, Gustavo O, 2008. "The origin of California’s zero emission vehicle mandate," Institute of Transportation Studies, Working Paper Series qt9pd8m8gs, Institute of Transportation Studies, UC Davis.
- Alqahtani, Mohammed & Hu, Mengqi, 2022. "Dynamic energy scheduling and routing of multiple electric vehicles using deep reinforcement learning," Energy, Elsevier, vol. 244(PA).
- He, Yi & Liu, Zhaocai & Song, Ziqi, 2020. "Optimal charging scheduling and management for a fast-charging battery electric bus system," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 142(C).
- Park, Keonwoo & Moon, Ilkyeong, 2022. "Multi-agent deep reinforcement learning approach for EV charging scheduling in a smart grid," Applied Energy, Elsevier, vol. 328(C).
- Yang, Woosuk, 2018. "A user-choice model for locating congested fast charging stations," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 110(C), pages 189-213.
- Imen Azzouz & Wiem Fekih Hassen, 2023. "Optimization of Electric Vehicles Charging Scheduling Based on Deep Reinforcement Learning: A Decentralized Approach," Energies, MDPI, vol. 16(24), pages 1-18, December.
- Wang, Kang & Wang, Haixin & Yang, Zihao & Feng, Jiawei & Li, Yanzhen & Yang, Junyou & Chen, Zhe, 2023. "A transfer learning method for electric vehicles charging strategy based on deep reinforcement learning," Applied Energy, Elsevier, vol. 343(C).
- Seong Wook Hwang & Sunghoon Lim, 2022. "The charging infrastructure design problem with electric taxi demand prediction using convolutional LSTM," European Journal of Industrial Engineering, Inderscience Enterprises Ltd, vol. 16(6), pages 651-678.
- Lee, Sangyoon & Choi, Dae-Hyun, 2021. "Dynamic pricing and energy management for profit maximization in multiple smart electric vehicle charging stations: A privacy-preserving deep reinforcement learning approach," Applied Energy, Elsevier, vol. 304(C).
- Park, Junseok & Moon, Ilkyeong, 2023. "A facility location problem in a mixed duopoly on networks," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 175(C).
- Xiong, Rui & Cao, Jiayi & Yu, Quanqing, 2018. "Reinforcement learning-based real-time power management for hybrid energy storage system in the plug-in hybrid electric vehicle," Applied Energy, Elsevier, vol. 211(C), pages 538-548.
- Jin, Ruiyang & Zhou, Yuke & Lu, Chao & Song, Jie, 2022. "Deep reinforcement learning-based strategy for charging station participating in demand response," Applied Energy, Elsevier, vol. 328(C).
- Ruisheng Wang & Zhong Chen & Qiang Xing & Ziqi Zhang & Tian Zhang, 2022. "A Modified Rainbow-Based Deep Reinforcement Learning Method for Optimal Scheduling of Charging Station," Sustainability, MDPI, vol. 14(3), pages 1-14, February.
- Luo, Yugong & Zhu, Tao & Wan, Shuang & Zhang, Shuwei & Li, Keqiang, 2016. "Optimal charging scheduling for large-scale EV (electric vehicle) deployment based on the interaction of the smart-grid and intelligent-transport systems," Energy, Elsevier, vol. 97(C), pages 359-368.
- Hu, Xu & Yang, Zhaojun & Sun, Jun & Zhang, Yali, 2023. "Optimal pricing strategy for electric vehicle battery swapping: Pay-per-swap or subscription?," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 171(C).
- Bansal, Vishal & Kumar, Deepak Prakash & Roy, Debjit & Subramanian, Shankar C., 2022. "Performance evaluation and optimization of design parameters for electric vehicle-sharing platforms by considering vehicle dynamics," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 166(C).
- Junchi Ma & Yuan Zhang & Zongtao Duan & Lei Tang, 2023. "PROLIFIC: Deep Reinforcement Learning for Efficient EV Fleet Scheduling and Charging," Sustainability, MDPI, vol. 15(18), pages 1-22, September.
- Zhou, Kaile & Cheng, Lexin & Wen, Lulu & Lu, Xinhui & Ding, Tao, 2020. "A coordinated charging scheduling method for electric vehicles considering different charging demands," Energy, Elsevier, vol. 213(C).
- Liu, Haoxiang & Zou, Yuncheng & Chen, Ya & Long, Jiancheng, 2021. "Optimal locations and electricity prices for dynamic wireless charging links of electric vehicles for sustainable transportation," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 152(C).
- Jenn, Alan & Springel, Katalin & Gopal, Anand R., 2018. "Effectiveness of electric vehicle incentives in the United States," Energy Policy, Elsevier, vol. 119(C), pages 349-356.
- Liu, Yang & Wu, Fanyou & Lyu, Cheng & Li, Shen & Ye, Jieping & Qu, Xiaobo, 2022. "Deep dispatching: A deep reinforcement learning approach for vehicle dispatching on online ride-hailing platform," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 161(C).
- Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charles Beattie & et al., 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
- Majidpour, Mostafa & Qiu, Charlie & Chu, Peter & Pota, Hemanshu R. & Gadh, Rajit, 2016. "Forecasting the EV charging load based on customer profile or station measurement?," Applied Energy, Elsevier, vol. 163(C), pages 134-141.
- Srivastava, Abhishek & Kumar, Rajeev Ranjan & Chakraborty, Abhishek & Mateen, Arqum & Narayanamurthy, Gopalakrishnan, 2022. "Design and selection of government policies for electric vehicles adoption: A global perspective," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 161(C).
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Qiu, Dawei & Wang, Yi & Hua, Weiqi & Strbac, Goran, 2023. "Reinforcement learning for electric vehicle applications in power systems: A critical review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 173(C).
- Park, Junseok & Moon, Ilkyeong, 2023. "A facility location problem in a mixed duopoly on networks," Transportation Research Part E: Logistics and Transportation Review, Elsevier, vol. 175(C).
- Zhao, Zhonghao & Lee, Carman K.M. & Ren, Jingzheng, 2024. "A two-level charging scheduling method for public electric vehicle charging stations considering heterogeneous demand and nonlinear charging profile," Applied Energy, Elsevier, vol. 355(C).
- Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
- Fescioglu-Unver, Nilgun & Yıldız Aktaş, Melike, 2023. "Electric vehicle charging service operations: A review of machine learning applications for infrastructure planning, control, pricing and routing," Renewable and Sustainable Energy Reviews, Elsevier, vol. 188(C).
- Dimitrios Vamvakas & Panagiotis Michailidis & Christos Korkas & Elias Kosmatopoulos, 2023. "Review and Evaluation of Reinforcement Learning Frameworks on Smart Grid Applications," Energies, MDPI, vol. 16(14), pages 1-38, July.
- Zhang, Tianren & Huang, Yuping & Liao, Hui & Liang, Yu, 2023. "A hybrid electric vehicle load classification and forecasting approach based on GBDT algorithm and temporal convolutional network," Applied Energy, Elsevier, vol. 351(C).
- Zhao, Zhonghao & Lee, Carman K.M. & Huo, Jiage, 2023. "EV charging station deployment on coupled transportation and power distribution networks via reinforcement learning," Energy, Elsevier, vol. 267(C).
- Paudel, Diwas & Das, Tapas K., 2023. "A deep reinforcement learning approach for power management of battery-assisted fast-charging EV hubs participating in day-ahead and real-time electricity markets," Energy, Elsevier, vol. 283(C).
- Truong, Van Binh & Le, Long Bao, 2024. "Electric vehicle charging design: The factored action based reinforcement learning approach," Applied Energy, Elsevier, vol. 359(C).
- Manzolli, Jônatas Augusto & Trovão, João Pedro & Antunes, Carlos Henggeler, 2022. "A review of electric bus vehicles research topics – Methods and trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 159(C).
- Abid, Md. Shadman & Apon, Hasan Jamil & Hossain, Salman & Ahmed, Ashik & Ahshan, Razzaqul & Lipu, M.S. Hossain, 2024. "A novel multi-objective optimization based multi-agent deep reinforcement learning approach for microgrid resources planning," Applied Energy, Elsevier, vol. 353(PA).
- Pegah Alaee & Julius Bems & Amjad Anvari-Moghaddam, 2023. "A Review of the Latest Trends in Technical and Economic Aspects of EV Charging Management," Energies, MDPI, vol. 16(9), pages 1-28, April.
- Liu, Lu & Zhou, Kaile, 2022. "Electric vehicle charging scheduling considering urgent demand under different charging modes," Energy, Elsevier, vol. 249(C).
- Ahmed M. Abed & Ali AlArjani, 2022. "The Neural Network Classifier Works Efficiently on Searching in DQN Using the Autonomous Internet of Things Hybridized by the Metaheuristic Techniques to Reduce the EVs’ Service Scheduling Time," Energies, MDPI, vol. 15(19), pages 1-25, September.
- Byungsung Lee & Haesung Lee & Hyun Ahn, 2020. "Improving Load Forecasting of Electric Vehicle Charging Stations Through Missing Data Imputation," Energies, MDPI, vol. 13(18), pages 1-15, September.
- Imen Azzouz & Wiem Fekih Hassen, 2023. "Optimization of Electric Vehicles Charging Scheduling Based on Deep Reinforcement Learning: A Decentralized Approach," Energies, MDPI, vol. 16(24), pages 1-18, December.
- Bhardwaj, Chandan & Axsen, Jonn & Kern, Florian & McCollum, David, 2020. "Why have multiple climate policies for light-duty vehicles? Policy mix rationales, interactions and research gaps," Transportation Research Part A: Policy and Practice, Elsevier, vol. 135(C), pages 309-326.
- Yang, Ting & Zhao, Liyuan & Li, Wei & Zomaya, Albert Y., 2021. "Dynamic energy dispatch strategy for integrated energy system based on improved deep reinforcement learning," Energy, Elsevier, vol. 235(C).
- Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
More about this item
Keywords
Electric vehicle; Charging scheduling; Reinforcement learning; Charging station; Literature review
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:transe:v:190:y:2024:i:c:s1366554524002898. See general information about how to correct material in RePEc.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/600244/description#description.