DISSERTATION

Transformers in Time-Series Forecasting

Abstract

Transformer architectures have emerged as powerful tools for time series forecasting, excelling at capturing complex temporal dependencies across multivariate inputs. However, these models are highly susceptible to adversarial attacks such as the Fast Gradient Sign Method (FGSM) and the Basic Iterative Method (BIM), which can significantly degrade predictive performance through small, targeted input perturbations. This work integrates dynamic attention mechanisms (adaptive masking modules that introduce controlled variability into attention pathways) into a transformer forecasting model to enhance robustness against such attacks. Using two distinct datasets, we compare a standard transformer and a dynamic attention-enhanced transformer under both clean and adversarial conditions. Results show that while both models perform similarly on non-attacked data, the dynamic attention model consistently maintains lower error rates as adversarial intensity increases, demonstrating improved resilience without adversarial training or additional defense layers. These findings highlight dynamic architectural defenses as lightweight, model-level strategies for improving the robustness and reliability of deep learning systems in time series forecasting applications.
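To make the attack setting concrete, the following is a minimal sketch of an FGSM perturbation. It uses a toy linear forecaster rather than the dissertation's transformer (the model, data shapes, and epsilon value here are illustrative assumptions, not the study's setup): the input is nudged by `eps * sign(dLoss/dX)`, the first-order direction that increases the forecasting loss.

```python
import numpy as np

# Hedged FGSM sketch on a toy linear forecaster (NOT the paper's model):
# loss = mean((X @ w - y)^2); FGSM perturbs X by eps * sign(dLoss/dX).
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 8))            # 32 windows of 8 lagged features (assumed)
w = rng.normal(size=8)                  # fixed toy model weights (assumed)
y = X @ w + 0.1 * rng.normal(size=32)   # noisy forecasting targets

def mse(X, y, w):
    r = X @ w - y
    return float(np.mean(r ** 2))

# Analytic input gradient of the MSE: dL/dX = (2/n) * outer(residual, w)
residual = X @ w - y
grad_X = (2.0 / len(y)) * np.outer(residual, w)

eps = 0.1                               # illustrative perturbation budget
X_adv = X + eps * np.sign(grad_X)       # Fast Gradient Sign Method step

# The perturbed inputs raise the loss; BIM would repeat this step
# several times with a smaller eps, clipping after each iteration.
print(mse(X, y, w) < mse(X_adv, y, w))  # → True
```

BIM, the iterative variant mentioned in the abstract, is this same step applied repeatedly with projection back into an epsilon-ball around the original input.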

Keywords: Time series, Econometrics, Computer science, Mathematics, Machine learning


Topics: Neural Networks and Applications; Time Series Analysis and Forecasting