Reinforcement learning for demand response: A review of algorithms and modeling techniques
Author: Vázquez-Canteli, José R. & Nagy, Zoltán
Abstract
Suggested Citation: Vázquez-Canteli, José R. & Nagy, Zoltán, 2019. "Reinforcement learning for demand response: A review of algorithms and modeling techniques," Applied Energy, Elsevier, vol. 235(C), pages 1072-1089.
Handle: RePEc:eee:appene:v:235:y:2019:i:c:p:1072-1089
DOI: 10.1016/j.apenergy.2018.11.002
References listed on IDEAS
- Zeng, Bo & Wu, Geng & Wang, Jianhui & Zhang, Jianhua & Zeng, Ming, 2017. "Impact of behavior-driven demand response on supply adequacy in smart distribution systems," Applied Energy, Elsevier, vol. 202(C), pages 125-137.
- Zhang, Xiaoshun & Bao, Tao & Yu, Tao & Yang, Bo & Han, Chuanjia, 2017. "Deep transfer Q-learning with virtual leader-follower for supply-demand Stackelberg game of smart grid," Energy, Elsevier, vol. 133(C), pages 348-365.
- Venkatesan, Naveen & Solanki, Jignesh & Solanki, Sarika Khushalani, 2012. "Residential Demand Response model and impact on voltage profile and losses of an electric distribution network," Applied Energy, Elsevier, vol. 96(C), pages 84-91.
- Kazmi, Hussain & Mehmood, Fahad & Lodeweyckx, Stefan & Driesen, Johan, 2018. "Gigawatt-hour scale savings on a budget of zero: Deep reinforcement learning based optimal control of hot water systems," Energy, Elsevier, vol. 144(C), pages 159-168.
- Xiong, Rui & Duan, Yanzhou & Cao, Jiayi & Yu, Quanqing, 2018. "Battery and ultracapacitor in-the-loop approach to validate a real-time power management method for an all-climate electric vehicle," Applied Energy, Elsevier, vol. 217(C), pages 153-165.
- David P. Chassin & Jason C. Fuller & Ned Djilali, 2014. "GridLAB-D: An Agent-Based Simulation Framework for Smart Grids," Journal of Applied Mathematics, Hindawi, vol. 2014, pages 1-12, June.
- Park, June Young & Nagy, Zoltan, 2018. "Comprehensive analysis of the relationship between thermal comfort and building control research - A data-driven literature review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 82(P3), pages 2664-2679.
- Yang, Lei & Nagy, Zoltan & Goffin, Philippe & Schlueter, Arno, 2015. "Reinforcement learning for optimal control of low exergy buildings," Applied Energy, Elsevier, vol. 156(C), pages 577-586.
- Salehizadeh, Mohammad Reza & Soltaniyan, Salman, 2016. "Application of fuzzy Q-learning for electricity market modeling by considering renewable power penetration," Renewable and Sustainable Energy Reviews, Elsevier, vol. 56(C), pages 1172-1181.
- Shariatzadeh, Farshid & Mandal, Paras & Srivastava, Anurag K., 2015. "Demand response for sustainable energy systems: A review, application and implementation strategy," Renewable and Sustainable Energy Reviews, Elsevier, vol. 45(C), pages 343-350.
- Dupont, B. & Dietrich, K. & De Jonghe, C. & Ramos, A. & Belmans, R., 2014. "Impact of residential demand response on power system operation: A Belgian case study," Applied Energy, Elsevier, vol. 122(C), pages 1-10.
- Kazmi, H. & D’Oca, S. & Delmastro, C. & Lodeweyckx, S. & Corgnati, S.P., 2016. "Generalizable occupant-driven optimization model for domestic hot water production in NZEB," Applied Energy, Elsevier, vol. 175(C), pages 1-15.
- Siano, Pierluigi, 2014. "Demand response and smart grids—A survey," Renewable and Sustainable Energy Reviews, Elsevier, vol. 30(C), pages 461-478.
- Shuxian Li & Minghui Hu & Changchao Gong & Sen Zhan & Datong Qin, 2018. "Energy Management Strategy for Hybrid Electric Vehicle Based on Driving Condition Identification Using KGA-Means," Energies, MDPI, vol. 11(6), pages 1-16, June.
- Frederik Ruelens & Sandro Iacovella & Bert J. Claessens & Ronnie Belmans, 2015. "Learning Agent for a Heat-Pump Thermostat with a Set-Back Strategy Using Model-Free Reinforcement Learning," Energies, MDPI, vol. 8(8), pages 1-19, August.
- Dupont, B. & De Jonghe, C. & Olmos, L. & Belmans, R., 2014. "Demand response with locational dynamic pricing to support the integration of renewables," Energy Policy, Elsevier, vol. 67(C), pages 344-354.
- Liu, Teng & Wang, Bo & Yang, Chenglang, 2018. "Online Markov Chain-based energy management for a hybrid tracked vehicle with speedy Q-learning," Energy, Elsevier, vol. 160(C), pages 544-555.
- Kofinas, P. & Dounis, A.I. & Vouros, G.A., 2018. "Fuzzy Q-Learning for multi-agent decentralized energy management in microgrids," Applied Energy, Elsevier, vol. 219(C), pages 53-67.
- Jiang, C.X. & Jing, Z.X. & Cui, X.R. & Ji, T.Y. & Wu, Q.H., 2018. "Multiple agents and reinforcement learning for modelling charging loads of electric taxis," Applied Energy, Elsevier, vol. 222(C), pages 158-168.
- Zou, Yuan & Liu, Teng & Liu, Dexing & Sun, Fengchun, 2016. "Reinforcement learning-based real-time energy management for a hybrid tracked vehicle," Applied Energy, Elsevier, vol. 171(C), pages 372-382.
- Wu, Jingda & He, Hongwen & Peng, Jiankun & Li, Yuecheng & Li, Zhanjiang, 2018. "Continuous reinforcement learning of energy management with deep Q network for a power split hybrid electric bus," Applied Energy, Elsevier, vol. 222(C), pages 799-811.
- Shen, Peihong & Zhao, Zhiguo & Zhan, Xiaowen & Li, Jingwei & Guo, Qiuyi, 2018. "Optimal energy management strategy for a plug-in hybrid electric commercial vehicle based on velocity prediction," Energy, Elsevier, vol. 155(C), pages 838-852.
- Xiong, Rui & Cao, Jiayi & Yu, Quanqing, 2018. "Reinforcement learning-based real-time power management for hybrid energy storage system in the plug-in hybrid electric vehicle," Applied Energy, Elsevier, vol. 211(C), pages 538-548.
- Nejat, Payam & Jomehzadeh, Fatemeh & Taheri, Mohammad Mahdi & Gohari, Mohammad & Abd. Majid, Muhd Zaimi, 2015. "A global review of energy consumption, CO2 emissions and policy in the residential sector (with an overview of the top ten CO2 emitting countries)," Renewable and Sustainable Energy Reviews, Elsevier, vol. 43(C), pages 843-862.
- Herter, Karen & McAuliffe, Patrick & Rosenfeld, Arthur, 2007. "An exploratory analysis of California residential customer response to critical peak pricing of electricity," Energy, Elsevier, vol. 32(1), pages 25-34.
- Leibowicz, Benjamin D. & Lanham, Christopher M. & Brozynski, Max T. & Vázquez-Canteli, José R. & Castejón, Nicolás Castillo & Nagy, Zoltan, 2018. "Optimal decarbonization pathways for urban residential building energy services," Applied Energy, Elsevier, vol. 230(C), pages 1311-1325.
- Zehui Kong & Yuan Zou & Teng Liu, 2017. "Implementation of real-time energy management strategy based on reinforcement learning for hybrid electric vehicles and simulation validation," PLOS ONE, Public Library of Science, vol. 12(7), pages 1-16, July.
- Aghaei, Jamshid & Alizadeh, Mohammad-Iman, 2013. "Demand response in smart electricity grids equipped with renewable energy sources: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 18(C), pages 64-72.
- Dusparic, Ivana & Taylor, Adam & Marinescu, Andrei & Golpayegani, Fatemeh & Clarke, Siobhan, 2017. "Residential demand response: Experimental evaluation and comparison of self-organizing techniques," Renewable and Sustainable Energy Reviews, Elsevier, vol. 80(C), pages 1528-1536.
- Wang, Jianxiao & Zhong, Haiwang & Ma, Ziming & Xia, Qing & Kang, Chongqing, 2017. "Review and prospect of integrated demand response in the multi-energy system," Applied Energy, Elsevier, vol. 202(C), pages 772-782.
- Brida V. Mbuwir & Frederik Ruelens & Fred Spiessens & Geert Deconinck, 2017. "Battery Energy Management in a Microgrid Using Batch Reinforcement Learning," Energies, MDPI, vol. 10(11), pages 1-19, November.
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
- Wang, Zhe & Hong, Tianzhen, 2020. "Reinforcement learning for building controls: The opportunities and challenges," Applied Energy, Elsevier, vol. 269(C).
- Daniel Egan & Qilun Zhu & Robert Prucka, 2023. "A Review of Reinforcement Learning-Based Powertrain Controllers: Effects of Agent Selection for Mixed-Continuity Control and Reward Formulation," Energies, MDPI, vol. 16(8), pages 1-31, April.
- Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
- Dong, Peng & Zhao, Junwei & Liu, Xuewu & Wu, Jian & Xu, Xiangyang & Liu, Yanfang & Wang, Shuhan & Guo, Wei, 2022. "Practical application of energy management strategy for hybrid electric vehicles based on intelligent and connected technologies: Development stages, challenges, and future trends," Renewable and Sustainable Energy Reviews, Elsevier, vol. 170(C).
- Ramya Kuppusamy & Srete Nikolovski & Yuvaraja Teekaraman, 2023. "Review of Machine Learning Techniques for Power Quality Performance Evaluation in Grid-Connected Systems," Sustainability, MDPI, vol. 15(20), pages 1-29, October.
- McPherson, Madeleine & Stoll, Brady, 2020. "Demand response for variable renewable energy integration: A proposed approach and its impacts," Energy, Elsevier, vol. 197(C).
- Yang, Ningkang & Han, Lijin & Xiang, Changle & Liu, Hui & Li, Xunmin, 2021. "An indirect reinforcement learning based real-time energy management strategy via high-order Markov Chain model for a hybrid electric vehicle," Energy, Elsevier, vol. 236(C).
- Pinto, Giuseppe & Piscitelli, Marco Savino & Vázquez-Canteli, José Ramón & Nagy, Zoltán & Capozzoli, Alfonso, 2021. "Coordinated energy management for a cluster of buildings through deep reinforcement learning," Energy, Elsevier, vol. 229(C).
- Shi, Wenzhuo & Huangfu, Yigeng & Xu, Liangcai & Pang, Shengzhao, 2022. "Online energy management strategy considering fuel cell fault for multi-stack fuel cell hybrid vehicle based on multi-agent reinforcement learning," Applied Energy, Elsevier, vol. 328(C).
- Talari, Saber & Shafie-khah, Miadreza & Osório, Gerardo J. & Aghaei, Jamshid & Catalão, João P.S., 2018. "Stochastic modelling of renewable energy sources from operators' point-of-view: A survey," Renewable and Sustainable Energy Reviews, Elsevier, vol. 81(P2), pages 1953-1965.
- Yang, Changhui & Meng, Chen & Zhou, Kaile, 2018. "Residential electricity pricing in China: The context of price-based demand response," Renewable and Sustainable Energy Reviews, Elsevier, vol. 81(P2), pages 2870-2878.
- Guo, Peiyang & Li, Victor O.K. & Lam, Jacqueline C.K., 2017. "Smart demand response in China: Challenges and drivers," Energy Policy, Elsevier, vol. 107(C), pages 1-10.
- Shen, Rendong & Zhong, Shengyuan & Wen, Xin & An, Qingsong & Zheng, Ruifan & Li, Yang & Zhao, Jun, 2022. "Multi-agent deep reinforcement learning optimization framework for building energy system with renewable energy," Applied Energy, Elsevier, vol. 312(C).
- Wu, Peng & Partridge, Julius & Bucknall, Richard, 2020. "Cost-effective reinforcement learning energy management for plug-in hybrid fuel cell and battery ships," Applied Energy, Elsevier, vol. 275(C).
- Stadler, Michael & Cardoso, Gonçalo & Mashayekh, Salman & Forget, Thibault & DeForest, Nicholas & Agarwal, Ankit & Schönbein, Anna, 2016. "Value streams in microgrids: A literature review," Applied Energy, Elsevier, vol. 162(C), pages 980-989.
- Haji Hosseinloo, Ashkan & Ryzhov, Alexander & Bischi, Aldo & Ouerdane, Henni & Turitsyn, Konstantin & Dahleh, Munther A., 2020. "Data-driven control of micro-climate in buildings: An event-triggered reinforcement learning approach," Applied Energy, Elsevier, vol. 277(C).
- Liu, Teng & Tan, Wenhao & Tang, Xiaolin & Zhang, Jinwei & Xing, Yang & Cao, Dongpu, 2021. "Driving conditions-driven energy management strategies for hybrid electric vehicles: A review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 151(C).
- Li, Weihan & Cui, Han & Nemeth, Thomas & Jansen, Jonathan & Ünlübayir, Cem & Wei, Zhongbao & Feng, Xuning & Han, Xuebing & Ouyang, Minggao & Dai, Haifeng & Wei, Xuezhe & Sauer, Dirk Uwe, 2021. "Cloud-based health-conscious energy management of hybrid battery systems in electric vehicles with deep reinforcement learning," Applied Energy, Elsevier, vol. 293(C).
- Nyong-Bassey, Bassey Etim & Giaouris, Damian & Patsios, Charalampos & Papadopoulou, Simira & Papadopoulos, Athanasios I. & Walker, Sara & Voutetakis, Spyros & Seferlis, Panos & Gadoue, Shady, 2020. "Reinforcement learning based adaptive power pinch analysis for energy management of stand-alone hybrid energy storage systems considering uncertainty," Energy, Elsevier, vol. 193(C).
More about this item
Keywords
Machine learning; Deep learning; HVAC control; Building energy; Electric vehicles; Smart grid