JOURNAL ARTICLE

Learning Domain-Independent Deep Representations by Mutual Information Minimization

Ke Wang, Jiayong Liu, Jingyan Wang

Year: 2019 Journal: Computational Intelligence and Neuroscience Vol: 2019 Pages: 1-14 Publisher: Hindawi Publishing Corporation

Abstract

Domain transfer learning aims to learn common data representations from a source domain and a target domain so that the source-domain data can help classify the target-domain data. Conventional transfer representation learning forces the distributions of the source- and target-domain representations to be similar, which relies heavily on how the domain distributions are characterized and on the distribution-matching criteria. In this paper, we propose a novel framework for domain transfer representation learning. Our motivation is to make the learned representation of each data point independent of the domain it belongs to: from an optimal cross-domain representation, it should be difficult to tell which domain a data point came from, so the learned representations generalize across domains. To measure the dependency between the representations and the domains of the data points, we use the mutual information between the representations and the domain-membership indicators; by minimizing this mutual information, we learn representations that are independent of the domains. We build a classwise deep convolutional network as the representation model and maximize the margin of each data point for its class, defined over its intraclass and interclass neighborhoods. To learn the model parameters, we construct a unified minimization problem in which the margins are maximized while the representation-domain mutual information is minimized; in this way, we learn representations that are not only discriminative but also independent of the domains. An iterative algorithm based on the Adam optimization method is proposed to solve the minimization, learning the classwise deep model parameters and the cross-domain representations simultaneously.
Extensive experiments on benchmark datasets show the method's effectiveness and its advantage over existing domain transfer learning methods.
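The core quantity in the abstract's objective is the mutual information between the learned representations and the domain-membership indicators. The following minimal NumPy sketch (not the authors' implementation; the function name and the discretization are illustrative assumptions) estimates this quantity empirically for a discretized one-dimensional feature, showing that a domain-correlated feature has high mutual information with the domain indicator while a domain-independent feature has mutual information near zero:

```python
import numpy as np

def empirical_mutual_information(z_bins, d):
    """Empirical mutual information I(Z; D) between a discretized
    representation feature z_bins and domain indicators d, computed
    from the joint and marginal empirical frequencies (in nats)."""
    mi = 0.0
    for zv in np.unique(z_bins):
        for dv in np.unique(d):
            p_joint = np.mean((z_bins == zv) & (d == dv))
            if p_joint == 0:
                continue
            p_z = np.mean(z_bins == zv)
            p_d = np.mean(d == dv)
            mi += p_joint * np.log(p_joint / (p_z * p_d))
    return mi

# Toy example: a domain-dependent vs. a domain-independent feature.
rng = np.random.default_rng(0)
d = rng.integers(0, 2, size=1000)            # domain indicator (source=0, target=1)
z_dep = d + 0.1 * rng.standard_normal(1000)  # feature correlated with the domain
z_ind = rng.standard_normal(1000)            # feature independent of the domain

bins = np.linspace(-3, 3, 8)
mi_dep = empirical_mutual_information(np.digitize(z_dep, bins), d)
mi_ind = empirical_mutual_information(np.digitize(z_ind, bins), d)
# mi_dep is close to log 2 (domain fully recoverable from z),
# while mi_ind is close to zero.
```

In the paper's framework this term enters the loss to be minimized, so that gradient steps (via Adam) push the representations toward domain independence while a separate margin term keeps them discriminative.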

Keywords:
Computer science, Mutual information, Representation, Domain, Artificial intelligence, Margin (machine learning), Transfer learning, Feature learning, Matching (statistics), Pattern recognition, Convolutional neural network, Machine learning, Mathematics, Statistics

Metrics

Cited By: 6
FWCI (Field Weighted Citation Impact): 0.77
Refs: 25
Citation Normalized Percentile: 0.78
Is in top 1%
Is in top 10%

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Cancer-related molecular mechanisms research
Life Sciences →  Biochemistry, Genetics and Molecular Biology →  Cancer Research
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition

Related Documents

JOURNAL ARTICLE

Learning Disentangled Representations for Counterfactual Regression via Mutual Information Minimization

Ming‐Yuan Cheng, Xinru Liao, Quan Liu, Bin Ma, Jian Xu, Bo Zheng

Year: 2022 Journal: Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval Pages: 1802-1806
JOURNAL ARTICLE

Avoiding shortcut-learning by mutual information minimization in deep learning-based MR image processing

Louisa Fay, Bin Yang, Sergios Gatidis, Thomas Kuestner

Year: 2024 Journal: Proceedings of the International Society for Magnetic Resonance in Medicine, Scientific Meeting and Exhibition