JOURNAL ARTICLE

Sequence-to-Sequence Model with Transformer-based Attention Mechanism and Temporal Pooling for Non-Intrusive Load Monitoring

Abstract

This paper presents a novel Sequence-to-Sequence (Seq2Seq) model that combines a transformer-based attention mechanism with temporal pooling for Non-Intrusive Load Monitoring (NILM) in smart buildings. The goal is to improve NILM accuracy through a deep learning approach: the transformer attention mechanism captures long-term dependencies in the aggregate power signal, while temporal pooling summarizes both the steady-state and transient behavior of individual appliances. The proposed method is evaluated on a publicly available dataset and compared against other state-of-the-art NILM techniques. The results show that it outperforms existing methods in both accuracy and computational efficiency.
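To make the temporal-pooling idea concrete, the following is a minimal, self-contained sketch (not the paper's actual implementation): each window of the power sequence is reduced to an average-pool feature, which reflects steady-state draw, and a max-pool feature, which reflects transient spikes such as switch-on events. The window size and the avg+max feature pair are illustrative assumptions.

```python
def temporal_pool(power, window):
    """Pool a power-consumption sequence into per-window features.

    Average pooling summarizes steady-state consumption; max pooling
    captures transient peaks (e.g. an appliance switching on).
    Returns a list of (avg, max) tuples, one per full window.
    """
    features = []
    for start in range(0, len(power) - window + 1, window):
        chunk = power[start:start + window]
        features.append((sum(chunk) / window, max(chunk)))
    return features

# Hypothetical aggregate readings: a kettle turning on mid-sequence.
readings = [100, 100, 100, 2000, 2100, 2000, 100, 100]
print(temporal_pool(readings, 4))  # → [(575.0, 2000), (1075.0, 2100)]
```

In a full model, these pooled features would feed the Seq2Seq encoder in place of raw samples, shortening the sequence the attention mechanism must attend over while preserving both slow and fast appliance signatures.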

Keywords:
Pooling, Transformer, Computer science, Artificial intelligence, Sequence, Data mining, Machine learning, Voltage, Engineering


Topics

Smart Grid Energy Management
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
Building Energy and Comfort Optimization
Physical Sciences →  Engineering →  Building and Construction
Smart Parking Systems Research
Physical Sciences →  Engineering →  Building and Construction