JOURNAL ARTICLE

Trainable Weighted Pooling Method for Text Classification with BERT

Abstract

Text classification is one of the central challenges in natural language processing, encompassing techniques for categorizing large amounts of text data into meaningful categories. It plays an important role in many applications, such as information retrieval, sentiment analysis, and recommendation systems. In recent years, the remarkable development of deep learning has led to large language models that achieve high performance across a wide range of tasks. BERT is one such model, widely recognized for its potential in text classification. Although BERT effectively learns context-dependent word representations, an appropriate pooling strategy is needed to obtain a representation of an entire document. In this study, we propose a pooling method called CLS-average pooling (CAP) that combines the commonly used [CLS] embedding with average pooling over BERT's token embeddings for text classification. We obtain sentence representations by taking a weighted sum of the [CLS] embedding and the average-pooled embedding, and we treat the weights used in CAP as trainable parameters so that appropriate weights for text classification are acquired automatically. By applying the proposed method to a text classification dataset, we demonstrate that it is more effective than conventional pooling methods.
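The weighted combination described in the abstract can be sketched in PyTorch as follows. This is a minimal illustration, not the paper's exact formulation: the parameterization (a single trainable scalar squashed through a sigmoid), the module name `CLSAveragePooling`, and the masked-mean implementation are our assumptions.

```python
import torch
import torch.nn as nn

class CLSAveragePooling(nn.Module):
    """Sketch of CAP: a trainable weighted sum of the [CLS] embedding
    and the average-pooled token embeddings from BERT's last layer."""

    def __init__(self):
        super().__init__()
        # Trainable mixing weight; sigmoid keeps it in (0, 1).
        self.alpha = nn.Parameter(torch.zeros(1))

    def forward(self, hidden_states, attention_mask):
        # hidden_states: (batch, seq_len, hidden); attention_mask: (batch, seq_len)
        cls = hidden_states[:, 0]  # embedding of the [CLS] token
        mask = attention_mask.unsqueeze(-1).float()
        # Average pooling over non-padding tokens only.
        avg = (hidden_states * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
        w = torch.sigmoid(self.alpha)
        return w * cls + (1.0 - w) * avg

# Usage with stand-in tensors in place of real BERT outputs:
pool = CLSAveragePooling()
h = torch.randn(2, 8, 16)               # (batch=2, seq_len=8, hidden=16)
m = torch.ones(2, 8, dtype=torch.long)  # no padding
sent = pool(h, m)                       # sentence representations, shape (2, 16)
```

Because `alpha` is a learnable `nn.Parameter`, ordinary backpropagation through the downstream classifier adjusts the mixing weight jointly with the rest of the model.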

Keywords:
Pooling, Computer science, Artificial intelligence, Natural language processing, Pattern recognition

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 0.26
References: 29
Citation Normalized Percentile: 0.61
Topics

Text and Document Classification Technologies
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

BOOK-CHAPTER

Extractive Text Summarization Using BERT with Weighted Pooling

Seema Yadav, Sujeet Kumar Singh, Subedar Chaurasiya, Jay Prakash

Communications in computer and information science Year: 2026 Pages: 62-72
BOOK-CHAPTER

Weighted Hierarchy Mechanism over BERT for Long Text Classification

Yong Jin, Qisi Zhu, Xuan Deng, Linli Hu

Lecture notes in computer science Year: 2021 Pages: 566-574
JOURNAL ARTICLE

A Text Document Clustering Method Based on Weighted BERT Model

Yutong Li, Juanjuan Cai, Jingling Wang

2020 IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference (ITNEC) Year: 2020
BOOK-CHAPTER

VGCN-BERT: Augmenting BERT with Graph Embedding for Text Classification

Zhibin Lu, Pan Du, Jian‐Yun Nie

Lecture notes in computer science Year: 2020 Pages: 369-382