JOURNAL ARTICLE

Bilinear Adversarial Network for Fine-Grained Domain Adaptation

Abstract

Fine-grained visual categorization (FGVC) is a challenging task due to large intra-class variation and high inter-class similarity. Moreover, labelling fine-grained datasets is harder than labelling conventional datasets because it requires expert-level domain knowledge, so progress in FGVC has been limited by the availability of well-labelled data. In this paper, we propose a domain adaptation approach that transfers knowledge learned from existing large-scale fine-grained datasets to unlabelled real-life data, enabling FGVC applications in everyday domains. Our approach uses the high-dimensional features extracted by bilinear convolutional networks to bridge the gap between the source and target domains. We explore and compare several variants of bilinear convolutional networks and identify the best-performing one. Experimental results on two benchmarks verify the effectiveness of the proposed method, and detailed ablation experiments confirm the contribution of each component. Code is available at https://github.com/PRIS-CV/Bilinear-Adversarial-Network.
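The "high-dimensional features extracted by bilinear convolutional networks" mentioned above are usually obtained by taking an outer product of two CNN feature maps at each spatial location, sum-pooling over locations, and applying signed-square-root and L2 normalisation. The sketch below illustrates that pooling step only, in plain Python; the function and variable names are illustrative and not taken from the paper's code.

```python
import math

def bilinear_pool(feat_a, feat_b):
    """Bilinear pooling sketch: sum of per-location outer products,
    followed by signed-sqrt and L2 normalisation.

    feat_a, feat_b: lists of per-spatial-location feature vectors
    (same number of locations; dimensions may differ).
    """
    da, db = len(feat_a[0]), len(feat_b[0])
    pooled = [[0.0] * db for _ in range(da)]
    for fa, fb in zip(feat_a, feat_b):          # iterate spatial locations
        for i in range(da):
            for j in range(db):
                pooled[i][j] += fa[i] * fb[j]   # outer product, accumulated
    flat = [v for row in pooled for v in row]   # flatten da*db matrix
    flat = [math.copysign(math.sqrt(abs(v)), v) for v in flat]  # signed sqrt
    norm = math.sqrt(sum(v * v for v in flat)) or 1.0
    return [v / norm for v in flat]             # L2-normalised descriptor

# Toy example: 2 spatial locations, 3-dim and 2-dim features -> 6-dim descriptor
a = [[1.0, 0.0, 2.0], [0.5, 1.0, 0.0]]
b = [[1.0, 1.0], [2.0, 0.0]]
desc = bilinear_pool(a, b)
print(len(desc))  # -> 6
```

With realistic feature maps (e.g. two 512-channel maps) the descriptor has 512 × 512 dimensions, which is why the abstract describes these features as high-dimensional.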

Keywords:
Fine-grained visual categorization; Domain adaptation; Bilinear convolutional networks; Adversarial learning; Deep learning


Topics

Domain Adaptation and Few-Shot Learning
Multimodal Machine Learning Applications
Machine Learning and ELM