
Deep Reinforcement Learning-Based Real-Time Energy Management for an Integrated Electric–Thermal Energy System

Authors

Listed:
  • Qiang Shuai

    (College of Electrical Engineering, Sichuan University, Chengdu 610065, China)

  • Yue Yin

    (College of Electrical Engineering, Sichuan University, Chengdu 610065, China)

  • Shan Huang

    (College of Electrical Engineering, Sichuan University, Chengdu 610065, China)

  • Chao Chen

    (College of Electrical Engineering, Sichuan University, Chengdu 610065, China)

Abstract

Renewable energy plays a crucial role in achieving sustainable development and has the potential to meet humanity’s long-term energy requirements. Integrated electric–thermal energy systems are an important means of accommodating a high share of renewable energy. However, the intermittency and volatility of renewable generation in such systems make their energy management optimization problems difficult to solve. This paper therefore proposes an energy management optimization method for an integrated electric–thermal energy system based on an improved proximal policy optimization algorithm, which mitigates the low accuracy and low solving efficiency of traditional heuristic algorithms and mathematical programming methods. The proposed algorithm also converges faster and performs better overall than the standard proximal policy optimization algorithm. The paper first establishes a mathematical model for the energy management of an integrated electric–thermal energy system. The model is then formulated as a Markov decision process, and a reward mechanism is designed to guide the agent to learn, from historical data, the uncertainty characteristics of renewable energy output and load consumption in the system. Finally, in a case study, the proposed algorithm reduces the average running cost by 2.32% compared with the other algorithms discussed in the paper, demonstrating its effectiveness and cost-efficiency.
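To make the approach concrete, the sketch below (Python with PyTorch) shows the clipped-surrogate update at the core of proximal policy optimization, applied to a toy dispatch rollout. The state/action layout, network sizes, and random rollout data are illustrative assumptions only; the paper's specific PPO improvement, system model, and reward design are not reproduced here.

    # Minimal PPO sketch for a toy electric–thermal dispatch problem.
    # All dimensions and data below are placeholders, not the paper's model.
    import torch
    import torch.nn as nn

    class PolicyValueNet(nn.Module):
        """Shared-trunk actor-critic with a Gaussian dispatch policy."""
        def __init__(self, obs_dim, act_dim, hidden=64):
            super().__init__()
            self.trunk = nn.Sequential(nn.Linear(obs_dim, hidden), nn.Tanh())
            self.mu = nn.Linear(hidden, act_dim)         # action mean
            self.log_std = nn.Parameter(torch.zeros(act_dim))
            self.v = nn.Linear(hidden, 1)                # state value

        def forward(self, obs):
            h = self.trunk(obs)
            dist = torch.distributions.Normal(self.mu(h), self.log_std.exp())
            return dist, self.v(h).squeeze(-1)

    def ppo_update(net, opt, obs, act, old_logp, adv, ret,
                   clip_eps=0.2, vf_coef=0.5, ent_coef=0.01):
        """One clipped-surrogate PPO step (Schulman et al., 2017)."""
        dist, value = net(obs)
        logp = dist.log_prob(act).sum(-1)
        ratio = (logp - old_logp).exp()                  # pi_new / pi_old
        # Clipping keeps the updated policy close to the behavior policy.
        surrogate = torch.min(ratio * adv,
                              ratio.clamp(1 - clip_eps, 1 + clip_eps) * adv)
        loss = (-surrogate.mean()
                + vf_coef * (ret - value).pow(2).mean()
                - ent_coef * dist.entropy().sum(-1).mean())
        opt.zero_grad()
        loss.backward()
        opt.step()
        return loss.item()

    # Hypothetical rollout: obs = [renewable output, electric load,
    # thermal load, storage level]; act = [CHP setpoint, storage power].
    net = PolicyValueNet(obs_dim=4, act_dim=2)
    opt = torch.optim.Adam(net.parameters(), lr=3e-4)
    obs = torch.randn(256, 4)
    with torch.no_grad():
        dist, _ = net(obs)
        act = dist.sample()
        old_logp = dist.log_prob(act).sum(-1)
    adv = torch.randn(256)   # in practice: GAE advantages from real rewards
    ret = torch.randn(256)   # in practice: discounted returns
    ppo_update(net, opt, obs, act, old_logp, adv, ret)

In the paper's setting, the advantages and returns would come from simulated rollouts of the electric–thermal system model under the designed reward mechanism, rather than the random tensors used here.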

Suggested Citation

  • Qiang Shuai & Yue Yin & Shan Huang & Chao Chen, 2025. "Deep Reinforcement Learning-Based Real-Time Energy Management for an Integrated Electric–Thermal Energy System," Sustainability, MDPI, vol. 17(2), pages 1-17, January.
  • Handle: RePEc:gam:jsusta:v:17:y:2025:i:2:p:407-:d:1562054

    Download full text from publisher

    File URL: https://www.mdpi.com/2071-1050/17/2/407/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2071-1050/17/2/407/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jsusta:v:17:y:2025:i:2:p:407-:d:1562054. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.