JOURNAL ARTICLE

Strategies for Pre-training Graph Neural Networks

Abstract

Many applications of machine learning require a model to make accurate predictions on test examples that are distributionally different from training ones, while task-specific labels are scarce during training. An effective approach to this challenge is to pre-train a model on related tasks where data is abundant, and then fine-tune it on a downstream task of interest. While pre-training has been effective in many language and vision domains, it remains an open question how to effectively use pre-training on graph datasets. In this paper, we develop a new strategy and self-supervised methods for pre-training Graph Neural Networks (GNNs). The key to the success of our strategy is to pre-train an expressive GNN at the level of individual nodes as well as entire graphs so that the GNN can learn useful local and global representations simultaneously. We systematically study pre-training on multiple graph classification datasets. We find that naive strategies, which pre-train GNNs at the level of either entire graphs or individual nodes, give limited improvement and can even lead to negative transfer on many downstream tasks. In contrast, our strategy avoids negative transfer and improves generalization significantly across downstream tasks, leading up to 9.4% absolute improvements in ROC-AUC over non-pre-trained models and achieving state-of-the-art performance for molecular property prediction and protein function prediction.
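To make the combined strategy concrete, here is a minimal sketch (ours, not the authors' released code) of pre-training a GNN at both the node level and the graph level at once. It assumes PyTorch; the names GNNEncoder, mask_head, and graph_head are illustrative, and attribute masking stands in for the paper's node-level self-supervised tasks.

    # Minimal sketch of node-level + graph-level pre-training, assuming PyTorch.
    # Attribute masking is used here as a stand-in node-level task; names are illustrative.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GNNEncoder(nn.Module):
        """Toy message-passing encoder over a dense adjacency matrix."""
        def __init__(self, in_dim, hid_dim, num_layers=2):
            super().__init__()
            dims = [in_dim] + [hid_dim] * num_layers
            self.layers = nn.ModuleList(
                nn.Linear(dims[i], dims[i + 1]) for i in range(num_layers)
            )

        def forward(self, x, adj):
            for layer in self.layers:
                x = F.relu(layer(adj @ x))    # aggregate neighbors, then transform
            return x                          # (n, hid_dim) node embeddings

    def node_level_loss(encoder, mask_head, x, adj, mask_rate=0.15):
        """Node-level task: mask some node attributes and reconstruct them."""
        mask = torch.rand(x.size(0)) < mask_rate
        x_masked = x.clone()
        x_masked[mask] = 0.0
        h = encoder(x_masked, adj)
        return F.mse_loss(mask_head(h[mask]), x[mask]) if mask.any() else h.sum() * 0.0

    def graph_level_loss(encoder, graph_head, x, adj, y):
        """Graph-level task: pool node embeddings and predict coarse graph labels."""
        g = encoder(x, adj).mean(dim=0, keepdim=True)   # mean-pool into a graph embedding
        return F.binary_cross_entropy_with_logits(graph_head(g), y)

    # Toy usage on one random graph: both levels are trained together, so the
    # encoder learns local (node) and global (graph) structure simultaneously.
    n, in_dim, hid_dim, num_tasks = 8, 16, 32, 4
    x = torch.randn(n, in_dim)
    adj = (torch.rand(n, n) < 0.3).float()
    adj = (((adj + adj.T) > 0).float() + torch.eye(n)).clamp(max=1.0)  # symmetric + self-loops
    y = torch.randint(0, 2, (1, num_tasks)).float()

    encoder = GNNEncoder(in_dim, hid_dim)
    mask_head = nn.Linear(hid_dim, in_dim)
    graph_head = nn.Linear(hid_dim, num_tasks)
    params = [*encoder.parameters(), *mask_head.parameters(), *graph_head.parameters()]
    opt = torch.optim.Adam(params, lr=1e-3)

    loss = (node_level_loss(encoder, mask_head, x, adj)
            + graph_level_loss(encoder, graph_head, x, adj, y))
    opt.zero_grad()
    loss.backward()
    opt.step()

After pre-training in this fashion, the auxiliary heads would be discarded and only the encoder carried over and fine-tuned, with a fresh prediction head, on the downstream task of interest.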

Keywords:
Computer science, Machine learning, Artificial intelligence, Task, Graph, Contrast, Artificial neural network, Training set, Attention network, Training, Deep neural networks, Transfer of learning, Theoretical computer science

Metrics

Cited By: 186
FWCI (Field Weighted Citation Impact): 0.00
Refs: 62

Topics

Advanced Graph Neural Networks (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Machine Learning in Materials Science (Physical Sciences → Materials Science → Materials Chemistry)

Related Documents

BOOK-CHAPTER

Neural Graph Matching for Pre-training Graph Neural Networks

Yupeng Hou, Binbin Hu, Wayne Xin Zhao, Zhiqiang Zhang, Jun Zhou, Ji-Rong Wen

Society for Industrial and Applied Mathematics eBooks, Year: 2022, Pages: 172-180
JOURNAL ARTICLE

PHGNN: Pre-Training Heterogeneous Graph Neural Networks

Xin Li, Hao Wei, Yu Ding

Journal: IEEE Access, Year: 2024, Vol: 12, Pages: 135411-135418
JOURNAL ARTICLE

Pre-training on dynamic graph neural networks

Kejia Chen, Jiajun Zhang, Linpu Jiang, Yunyun Wang, Yuxuan Dai

Journal: Neurocomputing, Year: 2022, Vol: 500, Pages: 679-687