JOURNAL ARTICLE

NEURAL NETWORK ARCHITECTURE BASED ON GRAPH CODES

Vasily Usatyuk, S. I. Egorov, A. P. Loktionov, E. A. Titenko, I. E. Chernetskaya

Year: 2023  Journal: Izvestiâ ÛFU. Tehničeskie nauki  Pages: 81-92  Publisher: Southern Federal University

Abstract

One of the important achievements of the theory of error-correcting coding is the discovery of graph codes and their important subset, low-density parity-check (LDPC) codes. Using the parity-check matrix of a code on a graph, one can obtain a Markov random field. An LDPC code can be embedded in an Ising model (a type of Markov random field) by using a torus topology with negative curvature. In this case, codewords correspond to saddle points (extrema) of the model, and trapping sets correspond to local minima. Using LDPC codes with an increased code distance allows maximum separation of the saddle points, and thus increases the noise resistance and representational power of the neural network. At the same time, the block and sparse structure characteristic of a torus of negative curvature simplifies multiplexing and reduces the number of trainable parameters of the neural network. The aim of the research is to reduce the computational complexity and increase the accuracy of neural networks through the use of a priori structured (quasi-cyclic) sparse graphs for a wide class of machine learning problems on Markov random fields. The paper presents a new approach that allows the synthesis of neural network architectures based on graph codes. The proposed approach provides an effective representation of Markov random fields through the use of QC-LDPC matrices and tensors, reduces the number of trainable parameters, and logarithmically reduces the complexity of tensor multiplexing. The proposed approach achieved an accuracy of 94.95% (within 1.72% of first place) on the binary classification problem "Pathfinder" of the "Long Range Arena" benchmark, with more than 5 times fewer parameters (multiplications).

Application of the proposed approach to factorization problems on dense graphs, network problems, surface meshes, and covariance matrices made it possible to increase the reconstruction accuracy in the Frobenius metric in individual problems by more than 8 orders of magnitude, while also simplifying the structure of the multiplexer.
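The quasi-cyclic (QC-LDPC) structure mentioned in the abstract can be sketched as follows. A QC-LDPC parity-check matrix is defined by a small base (exponent) matrix whose entries are expanded into circulant permutation blocks; the resulting sparse binary matrix can then serve as a mask that zeroes out most weights of a dense layer, leaving only block-circulant positions trainable. The base matrix values below are illustrative placeholders, not taken from the paper, and the masking step is a generic sketch of sparse-layer construction rather than the authors' exact architecture.

```python
import numpy as np

def circulant(z, shift):
    """z x z circulant permutation matrix: identity with columns rolled by `shift`."""
    return np.roll(np.eye(z, dtype=np.int8), shift, axis=1)

def expand_qc(base, z):
    """Expand a QC-LDPC base (exponent) matrix into a full parity-check mask.

    Entries >= 0 become z x z circulant permutation blocks with that shift;
    entries of -1 become all-zero blocks.
    """
    m, n = base.shape
    H = np.zeros((m * z, n * z), dtype=np.int8)
    for i in range(m):
        for j in range(n):
            if base[i, j] >= 0:
                H[i*z:(i+1)*z, j*z:(j+1)*z] = circulant(z, base[i, j])
    return H

# Illustrative 2x4 base matrix (-1 marks an all-zero block) and lifting size z=4.
base = np.array([[0, 1, -1, 2],
                 [2, -1, 1, 0]])
z = 4
H = expand_qc(base, z)            # 8 x 16 sparse binary matrix

# Use H as a binary mask on a dense weight matrix: only the sparse,
# block-circulant positions carry trainable parameters.
rng = np.random.default_rng(0)
W = rng.standard_normal(H.shape) * H

print(H.shape)       # (8, 16)
print(int(H.sum()))  # 24: six nonzero blocks, each contributing z = 4 ones
```

Because each block is a shifted identity, the mask can be stored as the tiny base matrix alone, which is one way the block structure reduces parameter count and simplifies multiplexing.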

Keywords:
Low-density parity-check code, Computer science, Markov chain, Maxima and minima, Artificial neural network, Algorithm, Markov random field, Theoretical computer science, Mathematics, Decoding methods, Artificial intelligence, Segmentation, Image segmentation

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 0.44
Refs: 16
Citation Normalized Percentile: 0.68

Topics

Information Systems and Technology Applications (Social Sciences → Business, Management and Accounting → Management Information Systems)
Artificial Intelligence in Education (Physical Sciences → Computer Science → Information Systems)
Advanced Data Processing Techniques (Physical Sciences → Engineering → Control and Systems Engineering)

Related Documents

JOURNAL ARTICLE

Graph Neural Network-Based Surrogate Model for Evolutionary Neural Architecture Search

Yu Xue, Xiaolei Zhang, Ferrante Neri, Bing Xue, Mengjie Zhang

Journal: IEEE Transactions on Systems Man and Cybernetics Systems  Year: 2025  Vol: 55 (12)  Pages: 9631-9644
BOOK-CHAPTER

Multi-objective Evolutionary Algorithm Based Graph Neural Network Architecture Search

Lianyi He, Xiaobo Liu, Hongbo Xiang, Guangjun Wang

Book series: Communications in Computer and Information Science  Year: 2025  Pages: 328-342
JOURNAL ARTICLE

Lightweight graph neural network architecture search based on heuristic algorithms

Zihao Zhao, Xianghong Tang, Jianguang Lu, Yong Huang

Journal: International Journal of Machine Learning and Cybernetics  Year: 2024  Vol: 16 (3)  Pages: 1625-1641