Wang Zhang, Xin Wang, Yuhong Wu, Xingpeng Zhang, Huayi Zhan
Multi-label text classification (MLTC) is an important task in natural language processing (NLP). Constrained by limited input length, most existing models learn text representations and label representations separately, overlooking the correlations between texts and labels. To address this, we introduce a comprehensive model for the MLTC task. Equipped with a Graph Convolutional Network (GCN) layer, an attention mechanism, and a contrastive learning objective, our model learns representations of texts and labels jointly in a shared representation space. To mitigate the input length limitation, we develop a two-stage label reduction method based on label merging and label association. Extensive experiments on various MLTC datasets validate the effectiveness of our method and reveal the intricate correlations between texts and labels.
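The contrastive objective described above pulls each text embedding toward its matching label embedding within the shared representation space. A minimal sketch of such an objective, assuming an InfoNCE-style loss over cosine similarities (the function name `info_nce`, the temperature value, and the use of NumPy are illustrative assumptions, not the paper's actual implementation):

```python
import numpy as np

def info_nce(text_emb, label_emb, temperature=0.1):
    """Hypothetical contrastive loss: each text i is pulled toward label i
    (the positive pair) and pushed away from all other labels (negatives)."""
    # L2-normalize so dot products become cosine similarities
    t = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    l = label_emb / np.linalg.norm(label_emb, axis=1, keepdims=True)
    logits = t @ l.T / temperature                 # (n_texts, n_labels)
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    # positive pairs sit on the diagonal: text i matches label i
    idx = np.arange(len(t))
    return -np.log(probs[idx, idx]).mean()

rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
aligned = info_nce(emb, emb)                    # texts identical to labels
misaligned = info_nce(emb, np.roll(emb, 1, axis=0))  # pairs deliberately shifted
assert aligned < misaligned                     # aligned pairs yield lower loss
```

Minimizing such a loss jointly over text and label encoders is one standard way to place both modalities in the same space, which is the property the abstract emphasizes.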