JOURNAL ARTICLE

Fine-Tuning Graph Neural Networks by Preserving Graph Generative Patterns

Yifei Sun, Qi Zhu, Yang Yang, Chunping Wang, Tianyu Fan, Jiajun Zhu, Lei Chen

Year: 2024  Journal: Proceedings of the AAAI Conference on Artificial Intelligence  Vol: 38 (8)  Pages: 9053-9061  Publisher: Association for the Advancement of Artificial Intelligence

Abstract

Recently, the paradigm of pre-training and fine-tuning graph neural networks has been intensively studied and applied in a wide range of graph mining tasks. Its success is generally attributed to the structural consistency between pre-training and downstream datasets, which, however, does not hold in many real-world scenarios. Existing works have shown that the structural divergence between pre-training and downstream graphs significantly limits the transferability when using the vanilla fine-tuning strategy. This divergence leads to model overfitting on pre-training graphs and causes difficulties in capturing the structural properties of the downstream graphs. In this paper, we identify the fundamental cause of structural divergence as the discrepancy of generative patterns between the pre-training and downstream graphs. Furthermore, we propose G-Tuning to preserve the generative patterns of downstream graphs. Given a downstream graph G, the core idea is to tune the pre-trained GNN so that it can reconstruct the generative patterns of G, the graphon W. However, the exact reconstruction of a graphon is known to be computationally expensive. To overcome this challenge, we provide a theoretical analysis that establishes the existence of a set of alternative graphons called graphon bases for any given graphon. By utilizing a linear combination of these graphon bases, we can efficiently approximate W. This theoretical finding forms the basis of our model, as it enables effective learning of the graphon bases and their associated coefficients. Compared with existing algorithms, G-Tuning demonstrates consistent performance improvement in 7 in-domain and 7 out-of-domain transfer learning experiments.
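The abstract's core idea, approximating a target graphon W by a linear combination of learned graphon bases, can be illustrated with a minimal sketch. This is not the paper's implementation: graphons are discretized here as symmetric step-function matrices on a fixed grid, the basis dictionary is random, and the coefficients are fit by ordinary least squares rather than learned jointly with the GNN; all names (`bases`, `coeffs`, `W_hat`) are hypothetical.

```python
# Toy sketch of "graphon basis" approximation: represent each graphon as a
# symmetric matrix (a step-function graphon on an n x n grid) and fit
# W ≈ sum_k coeffs[k] * bases[k] by least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 16  # grid resolution of the step-function discretization

def random_graphon(n, rng):
    """Random symmetric matrix with entries in [0, 1] as a toy graphon."""
    A = rng.random((n, n))
    return (A + A.T) / 2

# A small dictionary of K = 5 basis graphons.
bases = [random_graphon(n, rng) for _ in range(5)]

# Target graphon: here an exact (noiseless) combination of the bases,
# so the least-squares fit should recover the coefficients closely.
true_coeffs = np.array([0.5, 0.2, 0.1, 0.15, 0.05])
W = sum(c * B for c, B in zip(true_coeffs, bases))

# Solve for the coefficients over the flattened matrices.
X = np.stack([B.ravel() for B in bases], axis=1)   # shape (n*n, K)
coeffs, *_ = np.linalg.lstsq(X, W.ravel(), rcond=None)

# Reconstruct and measure the approximation error (near zero here).
W_hat = sum(c * B for c, B in zip(coeffs, bases))
err = np.linalg.norm(W - W_hat)
```

In G-Tuning itself the bases and coefficients are learned so that the pre-trained GNN can reconstruct the downstream graph's generative pattern; the sketch only shows why a linear combination of bases makes the approximation tractable compared with reconstructing the graphon exactly.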

Keywords:
Computer science, Graph, Generative grammar, Artificial neural network, Artificial intelligence, Theoretical computer science

Metrics

Cited By: 9
FWCI (Field Weighted Citation Impact): 2.18
Refs: 63
Citation Normalized Percentile: 0.83

Topics

Advanced Graph Neural Networks (Physical Sciences → Computer Science → Artificial Intelligence)
Graph Theory and Algorithms (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Neural Networks and Applications (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Fine-Tuning Graph Neural Networks via Graph Topology Induced Optimal Transport

Jiying Zhang, Xi Xiao, Long-Kai Huang, Yu Rong, Yatao Bian

Journal: Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence  Year: 2022  Pages: 3730-3736
JOURNAL ARTICLE

GraphTeacher: Transductive Fine-Tuning of Encoders through Graph Neural Networks

Emirhan Koç, Arda Can Aras, Tuna Alikaşifoğlu, Aykut Koç

Journal: IEEE Transactions on Artificial Intelligence  Year: 2025  Pages: 1-15
JOURNAL ARTICLE

Deep Generative Probabilistic Graph Neural Networks for Scene Graph Generation

Mahmoud Khademi, Oliver Schulte

Journal: Proceedings of the AAAI Conference on Artificial Intelligence  Year: 2020  Vol: 34 (07)  Pages: 11237-11245
JOURNAL ARTICLE

Streaming Graph Neural Networks with Generative Replay

Junshan Wang, Wenhao Zhu, Guojie Song, Liang Wang

Journal: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining  Year: 2022  Pages: 1878-1888
JOURNAL ARTICLE

Measuring Task Similarity and Its Implication in Fine-Tuning Graph Neural Networks

Renhong Huang, Jiarong Xu, Xin Jiang, Chenglu Pan, Zhiming Yang, Chunping Wang, Yang Yang

Journal: Proceedings of the AAAI Conference on Artificial Intelligence  Year: 2024  Vol: 38 (11)  Pages: 12617-12625