JOURNAL ARTICLE

Interleaved Structured Sparse Convolutional Neural Networks

Abstract

In this paper, we study the problem of designing efficient convolutional neural network architectures with the aim of eliminating the redundancy in convolution kernels. In addition to structured sparse kernels, low-rank kernels, and the product of low-rank kernels, the product of structured sparse kernels, which is a framework for interpreting the recently developed interleaved group convolutions (IGC) and its variants (e.g., Xception), has been attracting increasing interest. Motivated by the observation that the convolutions contained in a group convolution in IGC can be further decomposed in the same manner, we present a modularized building block, IGC-V2: interleaved structured sparse convolutions. It generalizes interleaved group convolutions, which are composed of two structured sparse kernels, to the product of more structured sparse kernels, further eliminating the redundancy. We present the complementary condition and the balance condition to guide the design of structured sparse kernels, obtaining a balance among three aspects: model size, computational complexity, and classification accuracy. Experimental results demonstrate an advantage in balancing these three aspects compared to interleaved group convolutions and Xception, and competitive performance compared to other state-of-the-art architecture design methods.
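The core idea described above — composing block-diagonal (group-convolution) kernels with a channel permutation so that their product connects every output channel to every input channel (the complementary condition) while using far fewer parameters than a dense kernel — can be sketched for the 1×1-convolution case, where a kernel is just a channel-mixing matrix. This is a minimal illustrative sketch, not the paper's implementation; the helper names (`block_diag_groupconv`, `interleave`) and the specific channel/group sizes are assumptions chosen for the example:

```python
import numpy as np

def block_diag_groupconv(channels, groups, rng):
    # A 1x1 group convolution mixes channels only within each group,
    # so its channel-mixing matrix is block-diagonal (structured sparse).
    size = channels // groups
    W = np.zeros((channels, channels))
    for g in range(groups):
        s = g * size
        W[s:s + size, s:s + size] = rng.standard_normal((size, size))
    return W

def interleave(channels, groups):
    # Permutation that redistributes channels so that each new group
    # contains exactly one channel from every old group.
    return np.arange(channels).reshape(groups, channels // groups).T.ravel()

rng = np.random.default_rng(0)
C = 16
W1 = block_diag_groupconv(C, groups=4, rng=rng)   # first sparse kernel
P = np.eye(C)[interleave(C, groups=4)]            # channel permutation
W2 = block_diag_groupconv(C, groups=4, rng=rng)   # second sparse kernel

# Product of structured sparse kernels (an IGC-style composed kernel).
W = W2 @ P @ W1

# Each sparse factor stores 4 blocks of 4x4 = 64 parameters, so the
# composition uses 128 parameters, yet the composed kernel is dense:
# every output channel depends on every input channel.
print(np.count_nonzero(W1), np.count_nonzero(W2), np.count_nonzero(W))
```

Because each second-layer group draws one channel from every first-layer group, the composed matrix has no zero entries even though each factor is mostly zeros — this is precisely the complementary condition the abstract refers to, and stacking more such sparse factors (IGC-V2) pushes the parameter count down further.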

Keywords:
Redundancy (engineering), Convolution (computer science), Kernel (algebra), Computer science, Convolutional neural network, Computation, Block (permutation group theory), Rank (graph theory), Artificial intelligence, Pattern recognition (psychology), Algorithm, Artificial neural network, Mathematics

Metrics

Cited By: 136
FWCI (Field Weighted Citation Impact): 13.86
Refs: 69
Citation Normalized Percentile: 0.99 (in top 1%)

Topics

Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Machine Learning and ELM
Physical Sciences →  Computer Science →  Artificial Intelligence
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

A novel structured sparse fully connected layer in convolutional neural networks

Naoki Matsumura, Yasuaki Ito, Koji Nakano, Akihiko Kasagi, Tsuguchika Tabaru

Journal: Concurrency and Computation: Practice and Experience, Year: 2021, Vol: 35 (11)
JOURNAL ARTICLE

Super Sparse Convolutional Neural Networks

Yao Lu, Guangming Lu, Bob Zhang, Yuanrong Xu, Jinxing Li

Journal: Proceedings of the AAAI Conference on Artificial Intelligence, Year: 2019, Vol: 33 (01), Pages: 4440-4447
JOURNAL ARTICLE

Symmetry-structured convolutional neural networks

Kehelwala Dewage Gayan Maduranga, Vasily Zadorozhnyy, Qiang Ye

Journal: Neural Computing and Applications, Year: 2022, Vol: 35 (6), Pages: 4421-4434