
Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features

Author

Listed:
  • Chaoyue Ma
  • Ying Wang
  • Feng Li
  • Huiyan Zhang
  • Yong Zhang
  • Haiyan Zhang
  • S. A. Edalatpanah

Abstract

The complexity of modern power systems and their susceptibility to external weather influences make it challenging to build an accurate load model. This paper proposes a variational autoencoder (VAE) long short-term memory (LSTM) load model based on the attention mechanism (Attention). First, the Prophet data decomposition method is used to decompose long sequences of load data at multiple time scales. Second, the correlation-based feature selection with maximum information coefficient (CFS-MIC) method is employed to select weather features by relevance: a subset of features with high correlation to the load and low mutual redundancy is chosen as the model input. Finally, the Attention-LSTM-VAE model is constructed to capture the temporal variation patterns of the load. The dataset comprises two years of load values and weather data collected in Caojiaping, Hunan Province, China. In comparisons with other deep learning methods, the Attention-LSTM-VAE model achieves the lowest mean absolute error of 0.0374 and the highest R-squared value of 0.9714, verifying its accuracy. The Attention-LSTM-VAE model thus offers better robustness, stability, and accuracy in load modeling than general deep learning load models, making it an important reference for research on power load models.
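To make the pipeline concrete, the decomposition step can be sketched with the Prophet library. The following is a minimal illustration, not the paper's exact procedure: it fits Prophet to a load series and reads back the trend, daily, weekly, and yearly components. The file name "load.csv" and its column layout are assumptions.

    from prophet import Prophet
    import pandas as pd

    # Prophet expects two columns: 'ds' (timestamps) and 'y' (values).
    # "load.csv" is a hypothetical stand-in for the Caojiaping load series.
    df = pd.read_csv("load.csv", parse_dates=["ds"])

    m = Prophet(daily_seasonality=True, weekly_seasonality=True,
                yearly_seasonality=True)
    m.fit(df)

    # Predicting on the training frame returns one column per component,
    # i.e. the load decomposed at multiple time scales.
    components = m.predict(df)
    trend, daily = components["trend"], components["daily"]
    weekly, yearly = components["weekly"], components["yearly"]
    residual = df["y"].values - components["yhat"].values  # unexplained remainder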
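The CFS-MIC selection step can likewise be sketched, assuming the minepy library for the maximum information coefficient and the standard CFS merit function with MIC substituted for the usual correlation; the authors' exact formulation may differ.

    import numpy as np
    from minepy import MINE

    def mic(x, y):
        """Maximum information coefficient between two 1-D arrays."""
        m = MINE()
        m.compute_score(x, y)
        return m.mic()

    def cfs_mic(X, y, names):
        """Greedy forward CFS using MIC as the correlation measure."""
        n = X.shape[1]
        # Feature-target and feature-feature MIC values.
        r_cf = np.array([mic(X[:, i], y) for i in range(n)])
        r_ff = np.ones((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                r_ff[i, j] = r_ff[j, i] = mic(X[:, i], X[:, j])

        selected, remaining, best = [], list(range(n)), -np.inf
        while remaining:
            merits = []
            for f in remaining:
                s = selected + [f]
                k = len(s)
                avg_cf = r_cf[s].mean()
                avg_ff = ((r_ff[np.ix_(s, s)].sum() - k) / (k * (k - 1))
                          if k > 1 else 0.0)
                # CFS merit: high relevance to the target, low redundancy.
                merits.append(k * avg_cf / np.sqrt(k + k * (k - 1) * avg_ff))
            if max(merits) <= best:
                break  # adding any further feature no longer improves merit
            best = max(merits)
            pick = remaining[int(np.argmax(merits))]
            selected.append(pick)
            remaining.remove(pick)
        return [names[i] for i in selected]

    # e.g. weather_subset = cfs_mic(X_weather, load, weather_names)  (hypothetical names)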
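Finally, an Attention-LSTM-VAE along the lines the abstract describes can be assembled in Keras. This is a minimal sketch with illustrative layer sizes and window length, not the authors' reported architecture: an LSTM encoder with dot-product self-attention over its hidden states, a Gaussian latent layer trained with the reparameterization trick, and an LSTM decoder that reconstructs the load window.

    import tensorflow as tf
    from tensorflow.keras import layers, Model

    T, F, LATENT = 24, 8, 16  # window length, feature count, latent size (illustrative)

    class Sampling(layers.Layer):
        """Reparameterization trick; also registers the KL term as a model loss."""
        def call(self, inputs):
            z_mean, z_log_var = inputs
            kl = -0.5 * tf.reduce_mean(
                1.0 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var))
            self.add_loss(kl)
            eps = tf.random.normal(tf.shape(z_mean))
            return z_mean + tf.exp(0.5 * z_log_var) * eps

    # Encoder: LSTM over the input window, self-attention across its hidden states.
    inputs = layers.Input(shape=(T, F))
    h = layers.LSTM(64, return_sequences=True)(inputs)
    context = layers.Attention()([h, h])  # dot-product attention, query = value = h
    pooled = layers.GlobalAveragePooling1D()(context)
    z_mean = layers.Dense(LATENT)(pooled)
    z_log_var = layers.Dense(LATENT)(pooled)
    z = Sampling()([z_mean, z_log_var])

    # Decoder: expand the latent code back into a sequence and reconstruct the load.
    d = layers.RepeatVector(T)(z)
    d = layers.LSTM(64, return_sequences=True)(d)
    outputs = layers.TimeDistributed(layers.Dense(1))(d)

    vae = Model(inputs, outputs)
    vae.compile(optimizer="adam", loss="mse")  # MSE reconstruction + KL from Sampling
    # vae.fit(X_windows, y_windows, ...) would train on sliding windows of the
    # decomposed load plus the selected weather features (hypothetical names).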

Suggested Citation

  • Chaoyue Ma & Ying Wang & Feng Li & Huiyan Zhang & Yong Zhang & Haiyan Zhang & S. A. Edalatpanah, 2024. "Constructing Attention-LSTM-VAE Power Load Model Based on Multiple Features," Advances in Mathematical Physics, Hindawi, vol. 2024, pages 1-15, June.
  • Handle: RePEc:hin:jnlamp:1041791
    DOI: 10.1155/2024/1041791

    Download full text from publisher

    File URL: http://downloads.hindawi.com/journals/amp/2024/1041791.pdf
    Download Restriction: no

    File URL: http://downloads.hindawi.com/journals/amp/2024/1041791.xml
    Download Restriction: no

    File URL: https://libkey.io/10.1155/2024/1041791?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    More about this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:hin:jnlamp:1041791. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Mohamed Abdelhakeem (email available below). General contact details of provider: https://www.hindawi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.