Printed from https://ideas.repec.org/a/eee/appene/v324y2022ics0306261922006924.html

Deep reinforcement learning-based joint load scheduling for household multi-energy system

Author

Listed:
  • Zhao, Liyuan
  • Yang, Ting
  • Li, Wei
  • Zomaya, Albert Y.

Abstract

Against the backdrop of the growing adoption of renewable energy sources and gas-fired domestic devices in households, this paper proposes a joint load scheduling strategy for a household multi-energy system (HMES) that minimizes residents' energy cost while maintaining thermal comfort. Specifically, the studied HMES contains photovoltaics, a gas-electric hybrid heating system, a gas-electric kitchen stove, and various types of conventional loads. Developing an efficient energy scheduling strategy is challenging, however, due to uncertainties in energy price, photovoltaic generation, outdoor temperature, and residents' hot water demand. To tackle this problem, we formulate the HMES scheduling problem as a Markov decision process with both continuous and discrete actions and propose a deep reinforcement learning-based HMES scheduling approach. A mixed distribution is used to approximate the scheduling strategies of the different types of household devices, and proximal policy optimization is used to optimize these strategies without requiring any prediction information or distributional knowledge of the system uncertainties. The proposed approach can simultaneously handle the continuous actions of power-shiftable devices and the discrete actions of time-shiftable devices, as well as the optimal management of both electrical and gas-fired devices, so as to jointly optimize the operation of all household loads. The proposed approach is compared with a deep Q network (DQN)-based approach and a model predictive control (MPC)-based approach. Comparison results show that the average energy cost of the proposed approach is 12.17% lower than that of the DQN-based approach and 4.59% lower than that of the MPC-based approach.
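The mixed-distribution idea in the abstract — one policy head producing continuous actions for power-shiftable devices and another producing discrete actions for time-shiftable devices, trained with PPO's clipped objective — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function names, the choice of a diagonal Gaussian for the continuous head, and a categorical for the discrete head are assumptions based only on the abstract's description.

```python
import numpy as np

def gaussian_log_prob(x, mean, std):
    # Log-density of a diagonal Gaussian, one entry per continuous
    # device action (e.g., a heating power level).
    return -0.5 * np.log(2 * np.pi * std ** 2) - (x - mean) ** 2 / (2 * std ** 2)

def mixed_log_prob(cont_action, disc_action, mean, std, disc_probs):
    # Joint log-probability of a mixed action: the continuous and
    # discrete parts are assumed independent given the state, so the
    # joint log-prob is the sum of the two heads' log-probs.
    lp_cont = gaussian_log_prob(cont_action, mean, std).sum()
    lp_disc = np.log(disc_probs[disc_action])
    return lp_cont + lp_disc

def ppo_clipped_objective(logp_new, logp_old, advantage, eps=0.2):
    # PPO's clipped surrogate: the probability ratio between the new
    # and old policies is clipped to [1 - eps, 1 + eps], limiting how
    # far a single update can move the policy.
    ratio = np.exp(logp_new - logp_old)
    return min(ratio * advantage, np.clip(ratio, 1 - eps, 1 + eps) * advantage)
```

Because the joint log-probability is just the sum of the two heads' terms, a single PPO objective can update both the continuous and discrete scheduling decisions at once — which is what lets one agent jointly manage power-shiftable and time-shiftable loads.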

Suggested Citation

  • Zhao, Liyuan & Yang, Ting & Li, Wei & Zomaya, Albert Y., 2022. "Deep reinforcement learning-based joint load scheduling for household multi-energy system," Applied Energy, Elsevier, vol. 324(C).
  • Handle: RePEc:eee:appene:v:324:y:2022:i:c:s0306261922006924
    DOI: 10.1016/j.apenergy.2022.119346

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0306261922006924
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.apenergy.2022.119346?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Wang, Jidong & Liu, Jianxin & Li, Chenghao & Zhou, Yue & Wu, Jianzhong, 2020. "Optimal scheduling of gas and electricity consumption in a smart home with a hybrid gas boiler and electric heating system," Energy, Elsevier, vol. 204(C).
    2. Lu, Renzhi & Hong, Seung Ho & Zhang, Xiongfeng, 2018. "A Dynamic pricing demand response algorithm for smart grid: Reinforcement learning approach," Applied Energy, Elsevier, vol. 220(C), pages 220-230.
    3. Sun, Mingyang & Djapic, Predrag & Aunedi, Marko & Pudjianto, Danny & Strbac, Goran, 2019. "Benefits of smart control of hybrid heat pumps: An analysis of field trial data," Applied Energy, Elsevier, vol. 247(C), pages 525-536.
    4. Hua, Haochen & Qin, Yuchao & Hao, Chuantong & Cao, Junwei, 2019. "Optimal energy management strategies for energy Internet via deep reinforcement learning approach," Applied Energy, Elsevier, vol. 239(C), pages 598-609.
    5. McKenna, Eoghan & Thomson, Murray, 2016. "High-resolution stochastic integrated thermal–electrical domestic demand model," Applied Energy, Elsevier, vol. 165(C), pages 445-461.
    6. Zhu, Jiawei & Lin, Yishuai & Lei, Weidong & Liu, Youquan & Tao, Mengling, 2019. "Optimal household appliances scheduling of multiple smart homes using an improved cooperative algorithm," Energy, Elsevier, vol. 171(C), pages 944-955.
    7. Du, Yan & Zandi, Helia & Kotevska, Olivera & Kurte, Kuldeep & Munk, Jeffery & Amasyali, Kadir & Mckee, Evan & Li, Fangxing, 2021. "Intelligent multi-zone residential HVAC control strategy based on deep reinforcement learning," Applied Energy, Elsevier, vol. 281(C).
    8. Li, Gang & Du, Yuqing, 2018. "Performance investigation and economic benefits of new control strategies for heat pump-gas fired water heater hybrid system," Applied Energy, Elsevier, vol. 232(C), pages 101-118.
    9. Ampimah, Benjamin Chris & Sun, Mei & Han, Dun & Wang, Xueyin, 2018. "Optimizing sheddable and shiftable residential electricity consumption by incentivized peak and off-peak credit function approach," Applied Energy, Elsevier, vol. 210(C), pages 1299-1309.
    10. Jin, Xin & Baker, Kyri & Christensen, Dane & Isley, Steven, 2017. "Foresee: A user-centric home energy management system for energy efficiency and demand response," Applied Energy, Elsevier, vol. 205(C), pages 1583-1595.
    11. Jiang, Yibo & Xu, Jian & Sun, Yuanzhang & Wei, Congying & Wang, Jing & Ke, Deping & Li, Xiong & Yang, Jun & Peng, Xiaotao & Tang, Bowen, 2017. "Day-ahead stochastic economic dispatch of wind integrated power system considering demand response of residential hybrid energy system," Applied Energy, Elsevier, vol. 190(C), pages 1126-1137.
    12. Kong, Xiangyu & Kong, Deqian & Yao, Jingtao & Bai, Linquan & Xiao, Jie, 2020. "Online pricing of demand response based on long short-term memory and reinforcement learning," Applied Energy, Elsevier, vol. 271(C).
    13. Jing, Rui & Xie, Mei Na & Wang, Feng Xiang & Chen, Long Xiang, 2020. "Fair P2P energy trading between residential and commercial multi-energy systems enabling integrated demand-side management," Applied Energy, Elsevier, vol. 262(C).
    14. Zhang, Wei & Wang, Jixin & Liu, Yong & Gao, Guangzong & Liang, Siwen & Ma, Hongfeng, 2020. "Reinforcement learning-based intelligent energy management architecture for hybrid construction machinery," Applied Energy, Elsevier, vol. 275(C).
    15. Zhou, Zhihua & Zhang, Zhiming & Zuo, Jian & Huang, Ke & Zhang, Liying, 2015. "Phase change materials for solar thermal energy storage in residential buildings in cold climate," Renewable and Sustainable Energy Reviews, Elsevier, vol. 48(C), pages 692-703.
    16. Yang, Ting & Zhao, Liyuan & Li, Wei & Wu, Jianzhong & Zomaya, Albert Y., 2021. "Towards healthy and cost-effective indoor environment management in smart homes: A deep reinforcement learning approach," Applied Energy, Elsevier, vol. 300(C).
    17. Killian, M. & Zauner, M. & Kozek, M., 2018. "Comprehensive smart home energy management system using mixed-integer quadratic-programming," Applied Energy, Elsevier, vol. 222(C), pages 662-672.
    18. Lu, Renzhi & Hong, Seung Ho, 2019. "Incentive-based demand response for smart grid with reinforcement learning and deep neural network," Applied Energy, Elsevier, vol. 236(C), pages 937-949.
    19. Su, Yongxin & Zhou, Yao & Tan, Mao, 2020. "An interval optimization strategy of household multi-energy system considering tolerance degree and integrated demand response," Applied Energy, Elsevier, vol. 260(C).
    20. Wang, Zhikun & Crawley, Jenny & Li, Francis G.N. & Lowe, Robert, 2020. "Sizing of district heating systems based on smart meter data: Quantifying the aggregated domestic energy demand and demand diversity in the UK," Energy, Elsevier, vol. 193(C).
    21. Thomas, Dimitrios & D’Hoop, Gaspard & Deblecker, Olivier & Genikomsakis, Konstantinos N. & Ioakimidis, Christos S., 2020. "An integrated tool for optimal energy scheduling and power quality improvement of a microgrid under multiple demand response schemes," Applied Energy, Elsevier, vol. 260(C).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Sabarathinam Srinivasan & Suresh Kumarasamy & Zacharias E. Andreadakis & Pedro G. Lind, 2023. "Artificial Intelligence and Mathematical Models of Power Grids Driven by Renewable Energy Sources: A Survey," Energies, MDPI, vol. 16(14), pages 1-56, July.
    2. Du, Yu & Li, Jun-qing, 2024. "A deep reinforcement learning based algorithm for a distributed precast concrete production scheduling," International Journal of Production Economics, Elsevier, vol. 268(C).
    3. Wang, Yi & Qiu, Dawei & Sun, Mingyang & Strbac, Goran & Gao, Zhiwei, 2023. "Secure energy management of multi-energy microgrid: A physical-informed safe reinforcement learning approach," Applied Energy, Elsevier, vol. 335(C).
    4. Liu, Di & Qin, Zhaoming & Hua, Haochen & Ding, Yi & Cao, Junwei, 2023. "Incremental incentive mechanism design for diversified consumers in demand response," Applied Energy, Elsevier, vol. 329(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Omar Al-Ani & Sanjoy Das, 2022. "Reinforcement Learning: Theory and Applications in HEMS," Energies, MDPI, vol. 15(17), pages 1-37, September.
    2. Kong, Xiangyu & Kong, Deqian & Yao, Jingtao & Bai, Linquan & Xiao, Jie, 2020. "Online pricing of demand response based on long short-term memory and reinforcement learning," Applied Energy, Elsevier, vol. 271(C).
    3. Zhang, Xiongfeng & Lu, Renzhi & Jiang, Junhui & Hong, Seung Ho & Song, Won Seok, 2021. "Testbed implementation of reinforcement learning-based demand response energy management system," Applied Energy, Elsevier, vol. 297(C).
    4. Zhu, Ziqing & Hu, Ze & Chan, Ka Wing & Bu, Siqi & Zhou, Bin & Xia, Shiwei, 2023. "Reinforcement learning in deregulated energy market: A comprehensive review," Applied Energy, Elsevier, vol. 329(C).
    5. Ma, Siyu & Liu, Hui & Wang, Ni & Huang, Lidong & Goh, Hui Hwang, 2023. "Incentive-based demand response under incomplete information based on the deep deterministic policy gradient," Applied Energy, Elsevier, vol. 351(C).
    6. Xie, Jiahan & Ajagekar, Akshay & You, Fengqi, 2023. "Multi-Agent attention-based deep reinforcement learning for demand response in grid-responsive buildings," Applied Energy, Elsevier, vol. 342(C).
    7. Eduardo J. Salazar & Mauro Jurado & Mauricio E. Samper, 2023. "Reinforcement Learning-Based Pricing and Incentive Strategy for Demand Response in Smart Grids," Energies, MDPI, vol. 16(3), pages 1-33, February.
    8. Perera, A.T.D. & Kamalaruban, Parameswaran, 2021. "Applications of reinforcement learning in energy systems," Renewable and Sustainable Energy Reviews, Elsevier, vol. 137(C).
    9. Lu, Renzhi & Li, Yi-Chang & Li, Yuting & Jiang, Junhui & Ding, Yuemin, 2020. "Multi-agent deep reinforcement learning based demand response for discrete manufacturing systems energy management," Applied Energy, Elsevier, vol. 276(C).
    10. Chen, J.J. & Qi, B.X. & Rong, Z.K. & Peng, K. & Zhao, Y.L. & Zhang, X.H., 2021. "Multi-energy coordinated microgrid scheduling with integrated demand response for flexibility improvement," Energy, Elsevier, vol. 217(C).
    11. Gao, Jianwei & Ma, Zeyang & Guo, Fengjia, 2019. "The influence of demand response on wind-integrated power system considering participation of the demand side," Energy, Elsevier, vol. 178(C), pages 723-738.
    12. Yi Kuang & Xiuli Wang & Hongyang Zhao & Yijun Huang & Xianlong Chen & Xifan Wang, 2020. "Agent-Based Energy Sharing Mechanism Using Deep Deterministic Policy Gradient Algorithm," Energies, MDPI, vol. 13(19), pages 1-20, September.
    13. Huakun Huang & Dingrong Dai & Longtao Guo & Sihui Xue & Huijun Wu, 2023. "AI and Big Data-Empowered Low-Carbon Buildings: Challenges and Prospects," Sustainability, MDPI, vol. 15(16), pages 1-21, August.
    14. Zeng, Huibin & Shao, Bilin & Dai, Hongbin & Tian, Ning & Zhao, Wei, 2023. "Incentive-based demand response strategies for natural gas considering carbon emissions and load volatility," Applied Energy, Elsevier, vol. 348(C).
    15. Huang, Zhijia & Wang, Fang & Lu, Yuehong & Chen, Xiaofeng & Wu, Qiqi, 2023. "Optimization model for home energy management system of rural dwellings," Energy, Elsevier, vol. 283(C).
    16. Yang, Ting & Zhao, Liyuan & Li, Wei & Zomaya, Albert Y., 2021. "Dynamic energy dispatch strategy for integrated energy system based on improved deep reinforcement learning," Energy, Elsevier, vol. 235(C).
    17. Zeyue Sun & Mohsen Eskandari & Chaoran Zheng & Ming Li, 2022. "Handling Computation Hardness and Time Complexity Issue of Battery Energy Storage Scheduling in Microgrids by Deep Reinforcement Learning," Energies, MDPI, vol. 16(1), pages 1-20, December.
    18. Ibrahim, Muhammad Sohail & Dong, Wei & Yang, Qiang, 2020. "Machine learning driven smart electric power systems: Current trends and new perspectives," Applied Energy, Elsevier, vol. 272(C).
    19. Xu, Fangyuan & Zhu, Weidong & Wang, Yi Fei & Lai, Chun Sing & Yuan, Haoliang & Zhao, Yujia & Guo, Siming & Fu, Zhengxin, 2022. "A new deregulated demand response scheme for load over-shifting city in regulated power market," Applied Energy, Elsevier, vol. 311(C).
    20. Lu, Renzhi & Bai, Ruichang & Ding, Yuemin & Wei, Min & Jiang, Junhui & Sun, Mingyang & Xiao, Feng & Zhang, Hai-Tao, 2021. "A hybrid deep learning-based online energy management scheme for industrial microgrid," Applied Energy, Elsevier, vol. 304(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:appene:v:324:y:2022:i:c:s0306261922006924. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.