JOURNAL ARTICLE

HGNAS++: Efficient Architecture Search for Heterogeneous Graph Neural Networks

Yang Gao, Peng Zhang, Chuan Zhou, Hong Yang, Zhao Li, Yue Hu, Philip S. Yu

Year: 2023 | Journal: IEEE Transactions on Knowledge and Data Engineering | Vol: 35 (9) | Pages: 9448-9461 | Publisher: IEEE Computer Society

Abstract

Heterogeneous graphs are commonly used to describe networked data with multiple types of nodes and edges. Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for analyzing such graphs, but designing their neural architectures requires extensive domain knowledge and time-consuming manual work. Recently, neural architecture search algorithms have become popular for automatically designing architectures for homogeneous graph neural networks. In this paper, we present a Heterogeneous Graph Neural Architecture Search algorithm (HGNAS for short) that automates the design of heterogeneous graph neural architectures. Specifically, HGNAS first designs a new search space based on existing popular HGNNs. Then, HGNAS uses a policy network as the controller to sample and find the best neural architecture from the designed search space by maximizing the expected accuracy of the selected architectures on a given validation dataset. Moreover, we design a new method, HGNAS++, which improves the efficiency of HGNAS by training the RNN controller within a generative adversarial learning framework. The basic idea of HGNAS++ is to embed a pairwise ranker into the reinforcement-learning-based architecture search. The pairwise ranker acts as a discriminator that selects the more accurate architecture from each pair of candidates, so the RNN controller can be updated efficiently using only the relatively small number of candidate architectures the ranker selects. Experiments on real-world heterogeneous graph datasets show that HGNAS designs novel HGNNs that outperform the best human-invented HGNNs. On the benchmark datasets, HGNAS++ improves on HGNAS in evaluation cost, reducing the number of evaluated candidate architectures by 50% and the search time by 24% on average. As a byproduct, HGNAS++ can find sparse yet powerful neural architectures for HGNNs.
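The filtering idea the abstract describes — a pairwise ranker that keeps only the better architecture from each pair of controller samples, halving the number of costly full evaluations — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the search-space slots, the proxy scoring function, and the ranker here are all hypothetical stand-ins (in HGNAS++ the ranker is a learned discriminator trained adversarially alongside the RNN controller).

```python
import random

# Hypothetical search space: each architecture is one choice per slot.
# These slots are illustrative, not the paper's actual HGNN search space.
SEARCH_SPACE = {
    "aggregator": ["mean", "max", "attention"],
    "activation": ["relu", "elu", "tanh"],
    "hidden_dim": [32, 64, 128],
}

def sample_architecture(rng):
    """Stand-in for the RNN controller: sample one option per slot."""
    return {slot: rng.choice(opts) for slot, opts in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Hypothetical proxy for validation accuracy (NOT a real evaluator)."""
    score = {"mean": 0.10, "max": 0.20, "attention": 0.30}[arch["aggregator"]]
    score += {"relu": 0.20, "elu": 0.25, "tanh": 0.15}[arch["activation"]]
    score += {32: 0.10, 64: 0.20, 128: 0.15}[arch["hidden_dim"]]
    return score

def pairwise_rank_filter(candidates, ranker):
    """Keep the winner of each adjacent pair, so only half of the sampled
    architectures need full training/evaluation -- the HGNAS++ idea."""
    winners = []
    for a, b in zip(candidates[::2], candidates[1::2]):
        winners.append(a if ranker(a, b) else b)
    return winners

rng = random.Random(0)
candidates = [sample_architecture(rng) for _ in range(8)]
# Here the ranker cheats by consulting the proxy score; in HGNAS++ it is
# a learned pairwise discriminator.
selected = pairwise_rank_filter(
    candidates, lambda a, b: proxy_score(a) >= proxy_score(b)
)
print(len(candidates), len(selected))  # 8 4
```

Only the `selected` half would then be evaluated to produce rewards for the controller update, which is where the reported ~50% reduction in evaluated candidates comes from.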

Keywords:
Computer science, Pairwise comparison, Artificial neural network, Artificial intelligence, Graph, Machine learning, Theoretical computer science

Metrics

Cited By: 37
FWCI (Field Weighted Citation Impact): 9.45
References: 76
Citation Normalized Percentile: 0.98 (in top 1% and top 10%)

Topics

Advanced Graph Neural Networks (Physical Sciences → Computer Science → Artificial Intelligence)
Graph Theory and Algorithms (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Machine Learning in Materials Science (Physical Sciences → Materials Science → Materials Chemistry)

Related Documents

JOURNAL ARTICLE

Heterogeneous Graph Neural Architecture Search

Yang Gao, Peng Zhang, Zhao Li, Chuan Zhou, Yongchao Liu, Yue Hu

Journal: 2021 IEEE International Conference on Data Mining (ICDM) | Year: 2021 | Pages: 1066-1071

BOOK-CHAPTER

Neural Architecture Search in Graph Neural Networks

Matheus Nunes, Gisele L. Pappa

Journal: Lecture Notes in Computer Science | Year: 2020 | Pages: 302-317

JOURNAL ARTICLE

Dynamic Heterogeneous Graph Attention Neural Architecture Search

Zeyang Zhang, Ziwei Zhang, Xin Wang, Yijian Qin, Zhou Qin, Wenwu Zhu

Journal: Proceedings of the AAAI Conference on Artificial Intelligence | Year: 2023 | Vol: 37 (9) | Pages: 11307-11315