
Biomedical Lay Summarization Using Pre-Trained Adapters

Thesis posted on 2024-12-01, authored by Veera Surya Sandeep Reddy Dwarampudi
There is growing interest among the general public in accessing biomedical literature, whether to find treatments and causes of common health problems or to read about significant global topics such as disease outbreaks. However, the technical language and complex concepts can be difficult to understand for readers without a background in the field. Despite advancements in general-English lay summarization driven by large language models (LLMs), progress in the biomedical domain has been limited. Challenges include grounding generated text in domain knowledge, establishing correct relationships between entities, and disambiguating the abbreviations, synonyms, homographs, and hyponyms specific to the biomedical domain. To address these challenges, we developed an efficient model that simplifies complex biomedical text by introducing custom adapter blocks into pre-trained language models (PLMs) and pre-training these adapters on distinct biomedical knowledge sources. We evaluated our models on two publicly available datasets, PLABA and PLOS. Our findings indicate that incorporating external knowledge significantly improves lay summarization, particularly in generating readable text and clarifying technical concepts.
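
The record does not specify the adapter architecture, so as a rough illustration only, the following is a minimal PyTorch sketch of a standard bottleneck adapter (in the style of Houlsby et al., 2019), the common pattern behind inserting small trainable blocks into a frozen PLM. The class name, bottleneck width, zero initialization, and placement are assumptions for the sketch, not the thesis's actual design.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Illustrative bottleneck adapter (the thesis's custom adapter design
    is not given here). Down-projects the hidden state, applies a
    non-linearity, up-projects, and adds a residual connection, so only
    these small layers are trained while the PLM's weights stay frozen."""

    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_size, hidden_size)
        # Zero-init the up-projection so the adapter starts as an identity
        # mapping and initially leaves the pre-trained model's behavior intact.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))

# Usage: such a block is typically inserted after a transformer sublayer.
adapter = BottleneckAdapter(hidden_size=768)
x = torch.randn(2, 16, 768)   # (batch, sequence length, hidden size)
assert adapter(x).shape == x.shape
```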

History

Advisor

Shweta Yadav

Department

Computer Science

Degree Grantor

University of Illinois Chicago

Degree Level

Masters

Degree Name

Master of Science

Committee Members

Sourav Medya, Cornelia Caragea

Format

application/pdf
