Supervised-learning-based hour-ahead demand response for a behavior-based home energy management system approximating MILP optimization
DOI: 10.1016/j.apenergy.2022.119382
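The title describes an hour-ahead demand-response controller trained by supervised learning to approximate MILP-optimal decisions. As the full text is restricted, the following is only a generic illustrative sketch of that idea of imitating an offline optimizer, not the paper's actual model: the oracle rule, features, prices, and thresholds are invented stand-ins, and a trivial nearest-neighbour learner substitutes for the paper's trained model.

```python
# Hypothetical sketch: imitating an offline optimizer with supervised learning.
# The "MILP oracle" below is an invented threshold rule standing in for a real solver.

def milp_oracle(price, flexible_load):
    """Stand-in for an offline MILP solve: defer the flexible load when price is high."""
    return 0 if price > 0.20 else flexible_load

# Offline phase: build a training set of (state -> optimal decision) pairs
# by "solving" many scenarios with the oracle.
training = [((p, l), milp_oracle(p, l))
            for p in [0.10, 0.15, 0.22, 0.30]
            for l in [0.5, 1.0, 1.5]]

def predict(state, data):
    """Online phase: 1-nearest-neighbour imitation of the oracle's decisions."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(data, key=lambda pair: dist(pair[0], state))[1]

# Hour-ahead decision for an unseen state (price 0.25, flexible load 1.0).
decision = predict((0.25, 1.0), training)  # nearest neighbour has a high price, so load is deferred
```

The appeal of this pattern is that the expensive MILP solves happen offline, while the learned policy answers hour-ahead queries cheaply at run time.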
References listed on IDEAS
- Terlouw, Tom & AlSkaif, Tarek & Bauer, Christian & van Sark, Wilfried, 2019. "Optimal energy management in all-electric residential energy systems with heat and electricity storage," Applied Energy, Elsevier, vol. 254(C).
- Lu, Renzhi & Li, Yi-Chang & Li, Yuting & Jiang, Junhui & Ding, Yuemin, 2020. "Multi-agent deep reinforcement learning based demand response for discrete manufacturing systems energy management," Applied Energy, Elsevier, vol. 276(C).
- Fridgen, Gilbert & Kahlen, Micha & Ketter, Wolfgang & Rieger, Alexander & Thimmel, Markus, 2018. "One rate does not fit all: An empirical analysis of electricity tariffs for residential microgrids," Applied Energy, Elsevier, vol. 210(C), pages 800-814.
- Elkazaz, Mahmoud & Sumner, Mark & Naghiyev, Eldar & Pholboon, Seksak & Davies, Richard & Thomas, David, 2020. "A hierarchical two-stage energy management for a home microgrid using model predictive and real-time controllers," Applied Energy, Elsevier, vol. 269(C).
- Adnan Ahmad & Asif Khan & Nadeem Javaid & Hafiz Majid Hussain & Wadood Abdul & Ahmad Almogren & Atif Alamri & Iftikhar Azim Niaz, 2017. "An Optimized Home Energy Management System with Integrated Renewable Energy and Storage Resources," Energies, MDPI, vol. 10(4), pages 1-35, April.
- Lu, Renzhi & Hong, Seung Ho, 2019. "Incentive-based demand response for smart grid with reinforcement learning and deep neural network," Applied Energy, Elsevier, vol. 236(C), pages 937-949.
- Volodymyr Mnih & Koray Kavukcuoglu & David Silver & Andrei A. Rusu & Joel Veness & Marc G. Bellemare & Alex Graves & Martin Riedmiller & Andreas K. Fidjeland & Georg Ostrovski & Stig Petersen & Charles Beattie, et al., 2015. "Human-level control through deep reinforcement learning," Nature, Nature, vol. 518(7540), pages 529-533, February.
Citations
Cited by:
- Ren, Kezheng & Liu, Jun & Wu, Zeyang & Liu, Xinglei & Nie, Yongxin & Xu, Haitao, 2024. "A data-driven DRL-based home energy management system optimization framework considering uncertain household parameters," Applied Energy, Elsevier, vol. 355(C).
- Nedim Tutkun & Luigi Scarcello & Carlo Mastroianni, 2023. "Improved Low-Cost Home Energy Management Considering User Preferences with Photovoltaic and Energy-Storage Systems," Sustainability, MDPI, vol. 15(11), pages 1-25, May.
- Muhammad Irfan & Sara Deilami & Shujuan Huang & Binesh Puthen Veettil, 2023. "Rooftop Solar and Electric Vehicle Integration for Smart, Sustainable Homes: A Comprehensive Review," Energies, MDPI, vol. 16(21), pages 1-29, October.
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Xie, Jiahan & Ajagekar, Akshay & You, Fengqi, 2023. "Multi-Agent attention-based deep reinforcement learning for demand response in grid-responsive buildings," Applied Energy, Elsevier, vol. 342(C).
- Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
- Zhang, Yang & Yang, Qingyu & Li, Donghe & An, Dou, 2022. "A reinforcement and imitation learning method for pricing strategy of electricity retailer with customers’ flexibility," Applied Energy, Elsevier, vol. 323(C).
- Lu, Renzhi & Bai, Ruichang & Ding, Yuemin & Wei, Min & Jiang, Junhui & Sun, Mingyang & Xiao, Feng & Zhang, Hai-Tao, 2021. "A hybrid deep learning-based online energy management scheme for industrial microgrid," Applied Energy, Elsevier, vol. 304(C).
- Qi, Chunyang & Zhu, Yiwen & Song, Chuanxue & Yan, Guangfu & Xiao, Feng & Wang, Da & Zhang, Xu & Cao, Jingwei & Song, Shixin, 2022. "Hierarchical reinforcement learning based energy management strategy for hybrid electric vehicle," Energy, Elsevier, vol. 238(PA).
- Chao-Chung Hsu & Bi-Hai Jiang & Chun-Cheng Lin, 2023. "A Survey on Recent Applications of Artificial Intelligence and Optimization for Smart Grids in Smart Manufacturing," Energies, MDPI, vol. 16(22), pages 1-15, November.
- Park, Keonwoo & Moon, Ilkyeong, 2022. "Multi-agent deep reinforcement learning approach for EV charging scheduling in a smart grid," Applied Energy, Elsevier, vol. 328(C).
- Amit Shewale & Anil Mokhade & Nitesh Funde & Neeraj Dhanraj Bokde, 2022. "A Survey of Efficient Demand-Side Management Techniques for the Residential Appliance Scheduling Problem in Smart Homes," Energies, MDPI, vol. 15(8), pages 1-34, April.
- Antonopoulos, Ioannis & Robu, Valentin & Couraud, Benoit & Kirli, Desen & Norbu, Sonam & Kiprakis, Aristides & Flynn, David & Elizondo-Gonzalez, Sergio & Wattam, Steve, 2020. "Artificial intelligence and machine learning approaches to energy demand-side response: A systematic review," Renewable and Sustainable Energy Reviews, Elsevier, vol. 130(C).
- Ma, Siyu & Liu, Hui & Wang, Ni & Huang, Lidong & Goh, Hui Hwang, 2023. "Incentive-based demand response under incomplete information based on the deep deterministic policy gradient," Applied Energy, Elsevier, vol. 351(C).
- Qiu, Dawei & Ye, Yujian & Papadaskalopoulos, Dimitrios & Strbac, Goran, 2021. "Scalable coordinated management of peer-to-peer energy trading: A multi-cluster deep reinforcement learning approach," Applied Energy, Elsevier, vol. 292(C).
- Eduardo J. Salazar & Mauro Jurado & Mauricio E. Samper, 2023. "Reinforcement Learning-Based Pricing and Incentive Strategy for Demand Response in Smart Grids," Energies, MDPI, vol. 16(3), pages 1-33, February.
- Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
- Godiana Hagile Philipo & Josephine Nakato Kakande & Stefan Krauter, 2022. "Neural Network-Based Demand-Side Management in a Stand-Alone Solar PV-Battery Microgrid Using Load-Shifting and Peak-Clipping," Energies, MDPI, vol. 15(14), pages 1-18, July.
- Razzak, Abdur & Islam, Md. Tariqul & Roy, Palash & Razzaque, Md. Abdur & Hassan, Md. Rafiul & Hassan, Mohammad Mehedi, 2024. "Leveraging Deep Q-Learning to maximize consumer quality of experience in smart grid," Energy, Elsevier, vol. 290(C).
- Guo, Chenyu & Wang, Xin & Zheng, Yihui & Zhang, Feng, 2022. "Real-time optimal energy management of microgrid with uncertainties based on deep reinforcement learning," Energy, Elsevier, vol. 238(PC).
- Lu, Renzhi & Li, Yi-Chang & Li, Yuting & Jiang, Junhui & Ding, Yuemin, 2020. "Multi-agent deep reinforcement learning based demand response for discrete manufacturing systems energy management," Applied Energy, Elsevier, vol. 276(C).
- Ussama Assad & Muhammad Arshad Shehzad Hassan & Umar Farooq & Asif Kabir & Muhammad Zeeshan Khan & S. Sabahat H. Bukhari & Zain ul Abidin Jaffri & Judit Oláh & József Popp, 2022. "Smart Grid, Demand Response and Optimization: A Critical Review of Computational Methods," Energies, MDPI, vol. 15(6), pages 1-36, March.
- Wu, Yuankai & Tan, Huachun & Peng, Jiankun & Zhang, Hailong & He, Hongwen, 2019. "Deep reinforcement learning of energy management with continuous control strategy and traffic information for a series-parallel plug-in hybrid electric bus," Applied Energy, Elsevier, vol. 247(C), pages 454-466.
- Ajagekar, Akshay & Decardi-Nelson, Benjamin & You, Fengqi, 2024. "Energy management for demand response in networked greenhouses with multi-agent deep reinforcement learning," Applied Energy, Elsevier, vol. 355(C).
More about this item
Keywords
Behavior-based HEMS; MILP; Supervised learning; Deep reinforcement learning; Demand response.
Statistics
Access and download statistics
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:321:y:2022:i:c:s0306261922007231. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help by completing this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.
Please note that corrections may take a couple of weeks to filter through the various RePEc services.