Author
Listed:
- Yin, Linfei
- Wang, Nannan
- Li, Jishen
Abstract
With the integration of distributed renewable energy into smart grids, the uncertainty of distributed systems and of new energy generation seriously affects the stable operation of power grids. Demand-side management is a method for addressing distributed electricity usage issues, so monitoring the types of loads connected to the system has become a hot research topic. Load monitoring includes intrusive load monitoring (ILM) and non-intrusive load monitoring (NILM). Currently, NILM lacks incremental capability and has low recognition accuracy. This study proposes an electricity terminal multi-label recognition algorithm with “One-Versus-All” rejection recognition, based on adaptive distillation incremental learning and an attention MobileNetV2 network (ET-MR “OVA” RR-ADIL-AMN). The algorithm combines multi-label recognition with a “One-Versus-All” rejection recognition algorithm (MR “OVA” RR), a support vector machine (SVM), adaptive distillation incremental learning (ADIL), and an attention MobileNetV2 network (AMN) for electricity terminal recognition. The AMN consists of a channel attention mechanism (CAM), co-attention, and MobileNetV2. The CAM significantly enhances the ability to characterize different channels, while the co-attention mechanism focuses on the spatial dimension to extract information about the feature map at different locations. The MobileNetV2 network optimizes computational efficiency and model size through an inverted residual structure, depthwise separable convolution (DSC), and a linear bottleneck layer. Integrating attention mechanisms (AMs) into MobileNetV2 allows the model not only to maintain the original computational efficiency but also to process input data more efficiently than the plain MobileNetV2 network, by dynamically re-weighting the feature channels and spatial information. ADIL includes an adaptive distillation selector (ADS) and a variable learning rate controller (VLRC).
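The abstract does not give the CAM's internal layout; a common realization of channel attention is a squeeze-and-excitation style block, sketched below in NumPy as one plausible reading. The function name, layer sizes, and reduction ratio are illustrative assumptions, not the paper's design.

```python
import numpy as np

def channel_attention(feature_map, w1, w2):
    """Reweight channels of a (C, H, W) feature map.

    feature_map: (C, H, W) activations, e.g. from a MobileNetV2 block.
    w1: (C, C // r) squeeze projection; w2: (C // r, C) excitation projection.
    """
    # Squeeze: global average pooling over the spatial dimensions -> (C,)
    descriptor = feature_map.mean(axis=(1, 2))
    # Excitation: bottleneck MLP, ReLU then sigmoid gating -> gates in (0, 1)
    hidden = np.maximum(descriptor @ w1, 0.0)
    gates = 1.0 / (1.0 + np.exp(-(hidden @ w2)))
    # Reweight each channel of the original feature map
    return feature_map * gates[:, None, None]

rng = np.random.default_rng(0)
c, r = 8, 4
fmap = rng.standard_normal((c, 6, 6))
out = channel_attention(fmap,
                        rng.standard_normal((c, c // r)),
                        rng.standard_normal((c // r, c)))
print(out.shape)  # (8, 6, 6): same shape, channels rescaled by learned gates
```

Because the gates lie in (0, 1), the block can only attenuate channels relative to the input; in a trained network the projections w1 and w2 learn which channels to preserve.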
The ADS selects suitable distillation samples by introducing distillation strategies. The distillation strategy allows the algorithm to absorb information about new categories while retaining old knowledge, balancing the transfer of old and new knowledge and avoiding catastrophic forgetting. The VLRC reduces the learning rate in the specific direction of the co-attention mechanism, preventing oscillations during the incremental learning process. The MR “OVA” RR algorithm adopts a “One-Versus-All” strategy to convert the multi-label problem into multiple binary classification problems (BCPs). The rejection recognition operation compares the confidence probability values output by the SVM against a set threshold. By incorporating a rejection recognition strategy into multi-label recognition, unreliable decisions of the classifier are rejected for further examination, consequently enhancing the accuracy of load type recognition. Experimental results show that ADIL saves 49 min compared with retraining after completing the incremental learning task for all 10 load types. After integrating the MR “OVA” RR algorithm, the average accuracy of the first-round model reaches 97.13 %. When the rejected samples are fed into a plain MobileNetV2 model for identification, the accuracy of the overall algorithm reaches 99.84 %. In contrast to other neural networks, the AMN demonstrates optimal performance with accuracy, precision, recall, and F1 scores of 98.38 %, 98.27 %, 98.31 %, and 98.38 % respectively, showing improvements of 2.78 %, 2.21 %, 2.53 %, and 2.78 %.
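The “One-Versus-All” rejection step described above can be sketched as follows: one binary scorer per load type produces a confidence in (0, 1), labels whose confidence clears a threshold are accepted, and a sample with no confident label is rejected for re-examination by the fallback model. The confidence values and the threshold here are illustrative stand-ins for the paper's per-class SVM probabilities and tuned threshold.

```python
def ova_reject(confidences, threshold=0.8):
    """One-Versus-All decision with rejection.

    confidences: per-class confidence probabilities (one binary scorer per
    load type). Returns the accepted label indices, or None to signal that
    the sample is rejected and should be passed to a fallback classifier.
    """
    accepted = [i for i, p in enumerate(confidences) if p >= threshold]
    return accepted if accepted else None

# Per-class confidences for two samples over 4 hypothetical load types
sample_a = [0.95, 0.10, 0.87, 0.30]   # two confident labels -> multi-label
sample_b = [0.55, 0.60, 0.40, 0.20]   # nothing confident -> rejected

print(ova_reject(sample_a))  # [0, 2]
print(ova_reject(sample_b))  # None
```

Returning multiple accepted indices is what makes the scheme multi-label: several appliances can be active simultaneously, so the per-class decisions are made independently rather than via a single argmax.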
Handle: RePEc:eee:appene:v:382:y:2025:i:c:s0306261925000376