JOURNAL ARTICLE

Biomedical Lay Summarization Using Pre-Trained Adapters

Abstract

There is growing interest among the general public in accessing biomedical literature, whether to find treatments and causes of common health problems or to read about significant global topics such as disease outbreaks. However, the technical language and complex concepts can be difficult for readers without a background in the field to understand. Despite advancements in general English lay summarization driven by large language models (LLMs), progress in the biomedical domain has been limited. Challenges include knowledge grounding, establishing correct relationships between entities, and disambiguating the abbreviations, synonyms, homographs, and hyponyms specific to the biomedical domain. To address these challenges, we developed an efficient model that simplifies complex biomedical text by introducing custom adapter blocks into pre-trained language models (PLMs) and pre-training these adapters on distinct biomedical knowledge sources. We evaluated our models on two publicly available datasets, PLABA and PLOS. Our findings indicate that incorporating external knowledge significantly improves lay summarization, particularly in generating readable text and clarifying technical concepts.
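
The core idea described in the abstract, inserting small trainable adapter blocks into a frozen PLM and pre-training only those blocks on biomedical knowledge sources, can be illustrated with a short sketch. The bottleneck design below (down-projection, non-linearity, up-projection, residual connection) follows the common adapter pattern; the layer sizes, names, and freezing logic are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal bottleneck-adapter sketch in PyTorch. All sizes and names are
# assumptions for illustration; the paper's actual architecture may differ.
import torch
import torch.nn as nn

class AdapterBlock(nn.Module):
    """Down-project, apply a non-linearity, up-project, add a residual."""
    def __init__(self, hidden_size: int = 768, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_size, hidden_size)
        # Zero-initialise the up-projection so the adapter starts as an
        # identity map and does not perturb the pre-trained representations.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))

def freeze_plm_train_adapters(plm: nn.Module, adapters: nn.ModuleList) -> None:
    """Adapter pre-training regime: PLM weights frozen, adapters trainable."""
    for p in plm.parameters():
        p.requires_grad = False
    for p in adapters.parameters():
        p.requires_grad = True
```

Under this regime only the adapter parameters are updated on the biomedical knowledge sources, which keeps training efficient and leaves the general-language PLM weights untouched.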

Keywords:
Automatic summarization, Domain, Language model, Domain knowledge, Adapter (computing)

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
References: 0
Citation Normalized Percentile: 0.40

Topics

Biomedical Text Mining and Ontologies
    Life Sciences → Biochemistry, Genetics and Molecular Biology → Molecular Biology
Topic Modeling
    Physical Sciences → Computer Science → Artificial Intelligence
Text Readability and Simplification
    Physical Sciences → Computer Science → Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Biomedical Lay Summarization Using Pre-Trained Adapters

Dwarampudi, Veera Surya Sandeep Reddy

Journal: OPAL (Open@LaTrobe), La Trobe University. Year: 2025
JOURNAL ARTICLE

Biomedical-domain pre-trained language model for extractive summarization

Yongping Du, Qingxiao Li, Lulin Wang, Yanqing He

Journal: Knowledge-Based Systems. Year: 2020. Vol: 199. Pages: 105964
JOURNAL ARTICLE

Arabic Extractive Summarization Using Pre-Trained Models

Yasmin Einieh, Amal Almansour, Amani Jamal

Journal: Journal of King Abdulaziz University - Computing and Information Technology Sciences. Year: 2023. Vol: 12 (1)