JOURNAL ARTICLE

Sequence to Sequence Load Recognition Model Based On Sparse Self-attention Transformer

Abstract

Load recognition is a technique for monitoring the status of individual appliances from an aggregate measurement signal. Traditional load recognition methods either rely on high-frequency sampled data or degrade significantly as the number of devices increases. Recent research has shown that deep learning-based load recognition solutions offer better performance; however, recognition accuracy still needs improvement. In this paper, we propose a load recognition model based on the transformer, and we introduce a sparse self-attention mechanism to reduce its computational complexity. Experiments on the low-frequency sampled UK-DALE dataset show that the proposed method outperforms previous methods, demonstrating its effectiveness.
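The abstract does not specify which sparsity pattern the paper uses, but the general idea behind sparse self-attention can be sketched with a simple local-window variant: each time step attends only to neighbours within a fixed window, so the attention cost drops from O(T²) to O(T·w) for a sequence of length T. The function below is an illustrative NumPy sketch under that assumption, not the paper's actual model; all names (`sparse_self_attention`, `window`) are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def sparse_self_attention(X, Wq, Wk, Wv, window=2):
    """Local-window sparse self-attention over a (T, d) sequence X.

    Each position attends only to positions within `window` steps of
    itself; entries outside the window are masked to -inf before the
    softmax, so they receive zero attention weight.
    """
    T, d = X.shape
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = (Q @ K.T) / np.sqrt(d)          # (T, T) similarity scores
    idx = np.arange(T)
    mask = np.abs(idx[:, None] - idx[None, :]) > window
    scores[mask] = -np.inf                   # drop out-of-window pairs
    return softmax(scores, axis=-1) @ V      # weighted sum of values
```

With `window` at least T - 1 the mask is empty and the result coincides with dense self-attention, which gives a convenient correctness check; in a real model the mask would be precomputed once and the masked score blocks skipped entirely to realise the complexity saving.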

Keywords:
Computer science, Transformer, Artificial intelligence, Pattern recognition, Sequence-to-sequence models, Computational complexity theory, Machine learning, Speech recognition, Algorithm, Voltage, Engineering


Topics

Smart Grid Energy Management
IoT-based Smart Home Systems
Context-Aware Activity Recognition Systems