JOURNAL ARTICLE

Deep Attention-based Neural Network for Electricity Theft Detection

Abstract

Electricity theft causes significant harm to social and economic development. In recent years, deep learning, a powerful data-mining technique, has attracted much attention and become popular for analyzing electricity consumption sequences. Nevertheless, existing methods mainly focus on short-term numerical data modeling, while records in real-world scenarios (1) usually consist of multiple temporal features and (2) are often of large scale. In this paper, to overcome these two fundamental challenges, we propose a novel method called Deep Attention-based Neural Network for Electricity Theft Detection (DANN-ETD). Specifically, we first decompose the electricity sequences into trend, seasonal, and residual views to fully exploit the temporal features. To model the large-scale time series effectively and efficiently, we then split each series into several snapshots and design deep attention-based recurrent neural networks that can detect the fine-grained evolution of electricity consumption. Experimental results on real-world datasets demonstrate that our method outperforms the state of the art.
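The pipeline in the abstract can be illustrated with a minimal sketch: an additive decomposition of a consumption series into trend, seasonal, and residual views, a split of the long series into fixed-length snapshots, and attention-based pooling over snapshot representations. The paper's exact decomposition and network architecture are not given here, so the moving-average decomposition, the snapshot truncation, and the dot-product attention below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def decompose(series, period=7):
    """Simplified additive decomposition into trend, seasonal, and residual views.
    Assumption: a centered moving average for the trend and per-phase means for
    the seasonal component; the paper's actual method may differ."""
    n = len(series)
    kernel = np.ones(period) / period
    trend = np.convolve(series, kernel, mode="same")     # moving-average trend
    detrended = series - trend
    # Seasonal view: mean of detrended values at each phase of the period
    phase_means = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(phase_means, n // period + 1)[:n]
    residual = series - trend - seasonal                 # what the two views miss
    return trend, seasonal, residual

def snapshots(series, size):
    """Split a long series into fixed-length snapshots, truncating the tail."""
    n = len(series) // size
    return series[: n * size].reshape(n, size)

def attention_pool(H, q):
    """Dot-product attention over snapshot representations H (num_snapshots, d)
    with query vector q (d,); returns a weighted summary vector (d,).
    In DANN-ETD the representations would come from a recurrent encoder."""
    scores = H @ q
    weights = np.exp(scores - scores.max())              # stable softmax
    weights /= weights.sum()
    return weights @ H
```

In a full model, each snapshot of each view would be encoded by a recurrent network and the attention weights would highlight the snapshots whose consumption evolution is most indicative of theft; this sketch only shows the data flow.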

Keywords:
Electricity theft detection; Deep learning; Machine learning; Artificial neural network; Recurrent neural network; Data mining; Computer security

Metrics

Cited by: 9
FWCI (Field-Weighted Citation Impact): 0.79
References: 19
Citation Normalized Percentile: 0.73


Topics

Electricity Theft Detection Techniques
Physical Sciences → Engineering → Electrical and Electronic Engineering
Water Systems and Optimization
Physical Sciences → Engineering → Civil and Structural Engineering
Non-Destructive Testing Techniques
Physical Sciences → Engineering → Mechanical Engineering