JOURNAL ARTICLE

Efficient Weighted Kernel Sharing Convolutional Neural Networks

Abstract

To reduce the redundancy of convolutional kernels, this paper proposes a new convolutional structure, weighted kernel sharing convolution (WKSC), which gathers input channels into groups so that all inputs in a group share the same convolutional kernel. In addition, each input channel is weighted before the sharing step to preserve its diversity. As a consequence, the number of kernels can be greatly reduced, cutting model parameters and speeding up inference. Moreover, WKSC can be combined with other existing compression models, such as depthwise separable convolutions, resulting in an even more compact architecture. Extensive experiments on CIFAR-100 and ImageNet classification demonstrate the effectiveness of the new approach in both computation cost and required parameters compared with state-of-the-art works.
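The grouping-and-sharing idea in the abstract can be sketched in NumPy. This is a minimal illustration of one plausible reading of WKSC, not the paper's exact formulation: the group size, the per-channel scalar weights, the within-group summation, and all function names here (`conv2d_single`, `wksc_layer`) are assumptions made for the example.

```python
import numpy as np

def conv2d_single(img, kernel):
    """Valid (no-padding) 2-D cross-correlation of one channel with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def wksc_layer(x, shared_kernels, channel_weights, group_size):
    """Weighted kernel sharing convolution (illustrative).

    x:               (C_in, H, W) input feature map
    shared_kernels:  (G, k, k)    one kernel shared by each group of channels
    channel_weights: (C_in,)      scalar weight applied to each input channel
    Returns (G, H', W'): one output channel per group.
    """
    c_in = x.shape[0]
    assert c_in == shared_kernels.shape[0] * group_size == channel_weights.shape[0]
    outputs = []
    for g, kernel in enumerate(shared_kernels):
        acc = None
        for c in range(g * group_size, (g + 1) * group_size):
            # Weight the channel first, then convolve with the group's shared kernel.
            resp = conv2d_single(channel_weights[c] * x[c], kernel)
            acc = resp if acc is None else acc + resp
        outputs.append(acc)
    return np.stack(outputs)

# Example: 4 input channels, 2 groups of 2 sharing one 3x3 kernel each.
x = np.ones((4, 5, 5))
out = wksc_layer(x, np.ones((2, 3, 3)), np.ones(4), group_size=2)
```

Under this reading, the parameter saving is direct: a standard convolution mapping 4 input channels to 2 output channels stores 4 × 2 = 8 kernels, whereas this layer stores only 2 shared kernels plus 4 scalar channel weights.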

Keywords:
Convolutional neural networks; Convolution; Kernel sharing; Redundancy; Weighting; Inference; Speedup; Model parameters; Pattern recognition; Artificial intelligence

Metrics

Cited by: 1
FWCI (Field-Weighted Citation Impact): 0.00
References: 24
Citation Normalized Percentile: 0.21

Topics

Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Image and Video Retrieval Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

Vector-kernel convolutional neural networks

Jun Ou, Yujian Li

Journal:   Neurocomputing Year: 2018 Vol: 330 Pages: 253-258
CONFERENCE PAPER

Accelerating Convolutional Neural Networks in Frequency Domain via Kernel-Sharing Approach

Bosheng Liu, Hongyi Liang, Jigang Wu, Xiaoming Chen, Peng Liu, Yinhe Han

Venue:   Proceedings of the 28th Asia and South Pacific Design Automation Conference Year: 2023 Pages: 733-738
CONFERENCE PAPER

Kernel Modulation: A Parameter-Efficient Method for Training Convolutional Neural Networks

Yuhuang Hu, Shih-Chii Liu

Venue:   2022 26th International Conference on Pattern Recognition (ICPR) Year: 2022 Pages: 2192-2198
CONFERENCE PAPER

DK-CNNs: Dynamic kernel convolutional neural networks

Jialin Liu, Fei Chao, Chih-Min Lin, Changle Zhou, Changjing Shang

Journal:   Neurocomputing Year: 2020 Vol: 422 Pages: 95-108