JOURNAL ARTICLE

Scalable Spatiotemporal Graph Neural Networks

Andrea Cini, Ivan Marisca, Filippo Maria Bianchi, Cesare Alippi

Year: 2023 | Journal: Proceedings of the AAAI Conference on Artificial Intelligence | Vol: 37 (6) | Pages: 7218-7226 | Publisher: Association for the Advancement of Artificial Intelligence

Abstract

Neural forecasting of spatiotemporal time series drives both research and industrial innovation in several relevant application domains. Graph neural networks (GNNs) are often the core component of the forecasting architecture. However, in most spatiotemporal GNNs, the computational complexity scales up to a quadratic factor with the length of the sequence times the number of links in the graph, hence hindering the application of these models to large graphs and long temporal sequences. While methods to improve scalability have been proposed in the context of static graphs, few research efforts have been devoted to the spatiotemporal case. To fill this gap, we propose a scalable architecture that exploits an efficient encoding of both temporal and spatial dynamics. In particular, we use a randomized recurrent neural network to embed the history of the input time series into high-dimensional state representations encompassing multi-scale temporal dynamics. Such representations are then propagated along the spatial dimension using different powers of the graph adjacency matrix to generate node embeddings characterized by a rich pool of spatiotemporal features. The resulting node embeddings can be efficiently pre-computed in an unsupervised manner, before being fed to a feed-forward decoder that learns to map the multi-scale spatiotemporal representations to predictions. The training procedure can then be parallelized node-wise by sampling the node embeddings without breaking any dependency, thus enabling scalability to large networks. Empirical results on relevant datasets show that our approach achieves results competitive with the state of the art, while dramatically reducing the computational burden.
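The pipeline described in the abstract — a fixed randomized recurrent encoder, spatial propagation with powers of the adjacency matrix, and a trainable readout on precomputed embeddings — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; all dimensions, the echo-state-style encoder, and the ridge-regression readout are simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical, for illustration only).
num_nodes, seq_len, in_dim, state_dim, K = 20, 50, 1, 32, 3

# Random sparse adjacency, row-normalized, and a random input series.
A = (rng.random((num_nodes, num_nodes)) < 0.2).astype(float)
A = A / np.maximum(A.sum(1, keepdims=True), 1.0)
x = rng.standard_normal((seq_len, num_nodes, in_dim))

# 1) Randomized recurrent encoder (echo-state style): fixed random
#    weights, recurrent matrix rescaled to spectral radius < 1 so the
#    state dynamics are stable and need no training.
W_in = rng.uniform(-0.5, 0.5, (in_dim, state_dim))
W = rng.standard_normal((state_dim, state_dim))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

h = np.zeros((num_nodes, state_dim))
for t in range(seq_len):
    h = np.tanh(x[t] @ W_in + h @ W)  # final state encodes the history

# 2) Spatial propagation: concatenate h, A h, A^2 h, ..., A^K h so each
#    node embedding pools features from increasingly large neighborhoods.
feats = [h]
for _ in range(K):
    feats.append(A @ feats[-1])
emb = np.concatenate(feats, axis=1)  # shape: (num_nodes, (K+1)*state_dim)

# 3) The embeddings are precomputed once, unsupervised; only a simple
#    readout is fit (here a closed-form ridge regression on a dummy target).
y = x[-1, :, 0]
ridge = np.linalg.solve(emb.T @ emb + 1e-3 * np.eye(emb.shape[1]), emb.T @ y)
pred = emb @ ridge

print(emb.shape)  # → (20, 128)
```

Because `emb` is fixed after precomputation, training the readout can sample nodes independently without breaking any spatiotemporal dependency — the property the abstract credits for node-wise parallel training on large graphs.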

Keywords:
Computer science, Scalability, Adjacency matrix, Theoretical computer science, Graph, Recurrent neural network, Artificial intelligence, Artificial neural network

Metrics

Cited By: 53
FWCI (Field Weighted Citation Impact): 35.92
Refs: 40
Citation Normalized Percentile: 1.00 (in top 1% and top 10%)

Topics

Energy Load and Power Forecasting
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence
Forecasting Techniques and Applications
Social Sciences →  Decision Sciences →  Management Science and Operations Research

Related Documents

BOOK-CHAPTER

Scalable Graph Neural Networks

Yao Ma, Jiliang Tang

Cambridge University Press eBooks | Year: 2021 | Pages: 162-175
JOURNAL ARTICLE

GRAND+: Scalable Graph Random Neural Networks

Wenzheng Feng, Yuxiao Dong, Tinglin Huang, Ziqi Yin, Xu Cheng, Evgeny Kharlamov, Jie Tang

Journal: Proceedings of the ACM Web Conference 2022 | Year: 2022 | Pages: 3248-3258
JOURNAL ARTICLE

Inductive Graph Neural Networks for Spatiotemporal Kriging

Yuankai Wu, Dingyi Zhuang, Aurélie Labbe, Lijun Sun

Journal: Proceedings of the AAAI Conference on Artificial Intelligence | Year: 2021 | Vol: 35 (5) | Pages: 4478-4485