JOURNAL ARTICLE

WeLT: Improving Biomedical Fine-tuned Pre-trained Language Models with Cost-sensitive Learning

Abstract

Fine-tuning biomedical pre-trained language models (BioPLMs) such as BioBERT has become a common practice that dominates leaderboards across various natural language processing tasks. Despite their success and wide adoption, prevailing fine-tuning approaches for named entity recognition (NER) naively train BioPLMs on target datasets without considering class distributions. This is especially problematic for imbalanced biomedical gold-standard NER datasets, in which most biomedical entities are underrepresented. In this paper, we address the class imbalance problem and propose WeLT, a cost-sensitive fine-tuning approach based on new re-scaled class weights for biomedical NER. We evaluate WeLT's fine-tuning performance on mixed-domain and domain-specific BioPLMs using eight biomedical gold-standard datasets, and we compare our approach against vanilla fine-tuning and three existing re-weighting schemes. Our results show the positive impact of handling class imbalance: WeLT outperforms all the vanilla fine-tuned models and demonstrates advantages over the other weighting schemes in most experiments.
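
To make the cost-sensitive idea concrete, here is a minimal PyTorch sketch of fine-tuning with a class-weighted token-classification loss. The inverse-frequency re-scaling shown is illustrative only and is not WeLT's exact weighting formula; the toy BIO label ids and tensor shapes are assumptions for the example, not from the paper.

```python
# Minimal sketch: class-weighted cross-entropy for NER fine-tuning,
# assuming a PyTorch / Hugging Face-style token-classification setup.
import torch
import torch.nn as nn
from collections import Counter

IGNORE_INDEX = -100  # conventional label for padding / sub-word tokens

def compute_class_weights(label_seqs, num_classes):
    """Per-class weights from training-label frequencies.

    Rare entity classes (e.g., B-Chemical) receive larger weights than
    the dominant O class, so misclassifying them costs more. This is a
    plain inverse-frequency scheme, not WeLT's re-scaled weights.
    """
    counts = Counter(l for seq in label_seqs for l in seq if l != IGNORE_INDEX)
    total = sum(counts.values())
    return torch.tensor(
        [total / (num_classes * counts.get(c, 1)) for c in range(num_classes)],
        dtype=torch.float,
    )

# Toy BIO label ids: 0 = O (majority), 1 = B-ENT, 2 = I-ENT (minorities).
train_labels = [[0, 0, 0, 1, 2], [0, 0, 0, 0, 0], [0, 1, 0, 0, IGNORE_INDEX]]
weights = compute_class_weights(train_labels, num_classes=3)

# The weighted loss replaces the vanilla (unweighted) token-level loss.
loss_fn = nn.CrossEntropyLoss(weight=weights, ignore_index=IGNORE_INDEX)

logits = torch.randn(3, 5, 3)        # (batch, seq_len, num_classes) from a BioPLM head
labels = torch.tensor(train_labels)  # (batch, seq_len)
loss = loss_fn(logits.view(-1, 3), labels.view(-1))
print(loss.item())
```

In a full fine-tuning run, this loss would simply replace the model's default unweighted cross-entropy while everything else (optimizer, scheduler, epochs) stays as in vanilla fine-tuning.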

Keywords:
Cost-sensitive learning, Class imbalance, Weighting, Fine-tuning, Language models, Named entity recognition, Natural language processing, Machine learning

