JOURNAL ARTICLE

Inductive Semi-supervised Multi-Label Learning with Co-Training

Abstract

In multi-label learning, each training example is associated with multiple class labels and the task is to learn a mapping from the feature space to the power set of the label space. Obtaining labels for training examples is generally demanding and time-consuming, especially in multi-label learning, where a number of class labels need to be annotated for each instance. To circumvent this difficulty, semi-supervised multi-label learning aims to exploit readily-available unlabeled data to help build the multi-label predictive model. Nonetheless, most semi-supervised solutions to multi-label learning work under the transductive setting: they focus only on making predictions on the existing unlabeled data and cannot generalize to unseen instances. In this paper, a novel approach named COINS is proposed for learning from labeled and unlabeled data by adapting the well-known co-training strategy, which naturally works under the inductive setting. In each co-training round, a dichotomy over the feature space is learned by maximizing the diversity between the two classifiers induced on the dichotomized feature subsets. After that, pairwise ranking predictions on unlabeled data are communicated between the two classifiers for model refinement. Extensive experiments on a number of benchmark data sets show that COINS performs favorably against state-of-the-art multi-label learning approaches.
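The abstract's co-training loop can be illustrated with a minimal sketch. Note this is not the authors' COINS algorithm: COINS learns the feature-space dichotomy each round by maximizing classifier diversity and exchanges pairwise ranking predictions, whereas the sketch below uses a fixed random feature split and exchanges confident pseudo-labels, the classic co-training scheme the paper adapts. All data and helper names (`fit_view`, `v1`, `v2`) are hypothetical.

```python
# Hedged sketch of co-training for semi-supervised multi-label learning.
# NOT the COINS implementation: fixed random feature dichotomy and
# pseudo-label exchange instead of learned splits and pairwise rankings.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 8-dimensional features, 3 labels.
n_labeled, n_unlabeled, d, q = 40, 100, 8, 3
W_true = rng.normal(size=(d, q))
X = rng.normal(size=(n_labeled + n_unlabeled, d))
Y = (X @ W_true > 0).astype(int)            # ground-truth label matrix
X_l, Y_l = X[:n_labeled], Y[:n_labeled]     # small labeled pool
X_u = X[n_labeled:]                         # unlabeled pool

# Fixed random dichotomy of the feature space into two views
# (COINS instead learns the split by maximizing classifier diversity).
perm = rng.permutation(d)
v1, v2 = perm[: d // 2], perm[d // 2:]

def fit_view(view, Xtr, Ytr):
    """Per-label ridge regression on {-1,+1} targets; sign gives the label."""
    A = Xtr[:, view]
    return np.linalg.solve(A.T @ A + 0.1 * np.eye(len(view)),
                           A.T @ (2 * Ytr - 1))

for _ in range(3):                          # co-training rounds
    W1, W2 = fit_view(v1, X_l, Y_l), fit_view(v2, X_l, Y_l)
    S1, S2 = X_u[:, v1] @ W1, X_u[:, v2] @ W2
    # Confidence = smallest per-label margin under either view; the 10 most
    # confident unlabeled examples are pseudo-labeled and moved into the
    # labeled pool, so each view's classifier benefits from the other.
    conf = np.minimum(np.abs(S1), np.abs(S2)).min(axis=1)
    top = np.argsort(conf)[-10:]
    pseudo = ((S1[top] + S2[top]) > 0).astype(int)
    X_l = np.vstack([X_l, X_u[top]])
    Y_l = np.vstack([Y_l, pseudo])
    X_u = np.delete(X_u, top, axis=0)

# Final inductive predictor: average the two views' scores.
W1, W2 = fit_view(v1, X_l, Y_l), fit_view(v2, X_l, Y_l)
predict = lambda Xq: ((Xq[:, v1] @ W1 + Xq[:, v2] @ W2) > 0).astype(int)
```

Because the final predictor is a pair of fitted models rather than a lookup over the unlabeled pool, it can score unseen instances, which is the inductive property the paper emphasizes over transductive alternatives.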

Keywords:
Computer science, Machine learning, Artificial intelligence, Semi-supervised learning, Multi-label classification, Pairwise comparison, Co-training, Classifier, Labeled data, Training set, Multi-task learning, Feature vector, Feature learning, Supervised learning, Benchmark

Metrics

Cited By: 74
FWCI (Field Weighted Citation Impact): 7.33
Refs: 39
Citation Normalized Percentile: 0.97 (in top 1%, in top 10%)

Topics

Text and Document Classification Technologies (Physical Sciences → Computer Science → Artificial Intelligence)
Spam and Phishing Detection (Physical Sciences → Computer Science → Information Systems)
Web Data Mining and Analysis (Physical Sciences → Computer Science → Information Systems)

Related Documents

JOURNAL ARTICLE

Stacked co-training for semi-supervised multi-label learning

Jiaxuan Li, Xiaoyan Zhu, Hongrui Wang, Yu Zhang, Jiayin Wang

Journal: Information Sciences, Year: 2024, Vol: 677, Pages: 120906
JOURNAL ARTICLE

Semi-supervised Learning with Multi-Head Co-Training

Mingcai Chen, Yuntao Du, Yi Zhang, Shuwei Qian, Chongjun Wang

Journal: Proceedings of the AAAI Conference on Artificial Intelligence, Year: 2022, Vol: 36 (6), Pages: 6278-6286
JOURNAL ARTICLE

SMLE: Semi-Supervised Multi-Label Learning with Label Enhancement

Qianzhi Ye, Jia Zhang, Hanrui Wu, Tianlong Gu, C. L. Philip Chen, Jinyi Long

Journal: IEEE Transactions on Knowledge and Data Engineering, Year: 2025, Vol: 37 (9), Pages: 5613-5626