SwePub

Pre-trained Language Models in Biomedical Domain : A Systematic Survey

Wang, Benyou (author)
The Chinese University of Hong Kong, Shenzhen, China
Chen, Zhihong (author)
The Chinese University of Hong Kong, Shenzhen, China
Xie, Qianqian (author)
University of Manchester, Manchester, United Kingdom
Pei, Jiahuan (author)
University of Amsterdam, Amsterdam, Netherlands
Tiwari, Prayag, 1991- (author)
Högskolan i Halmstad, Akademin för informationsteknologi
Li, Zhao (author)
The University of Texas Health Science Center, Houston, TX, USA
Fu, Jie (author)
University of Montreal, Montreal, PQ, Canada
New York, NY : Association for Computing Machinery (ACM), 2024
English.
In: ACM Computing Surveys, vol. 56, no. 3. New York, NY: Association for Computing Machinery (ACM). ISSN 0360-0300; E-ISSN 1557-7341.
  • Research review (peer-reviewed)
Abstract
Pre-trained language models (PLMs) have become the de facto paradigm for most natural language processing tasks. This also benefits the biomedical domain: researchers from the informatics, medicine, and computer science communities have proposed various PLMs trained on biomedical datasets (e.g., biomedical text, electronic health records, and protein and DNA sequences) for various biomedical tasks. However, the cross-disciplinary nature of biomedical PLMs hinders their spread across communities; some existing works remain isolated from each other, lacking comprehensive comparison and discussion. It is nontrivial to produce a survey that not only systematically reviews recent advances in biomedical PLMs and their applications but also standardizes terminology and benchmarks. This article summarizes recent progress on pre-trained language models in the biomedical domain and their applications in downstream biomedical tasks. In particular, we discuss the motivations for PLMs in the biomedical domain and introduce the key concepts of pre-trained language models. We then propose a taxonomy of existing biomedical PLMs that categorizes them systematically from various perspectives. Their applications in downstream biomedical tasks are then discussed in detail. Finally, we outline various limitations and future trends, aiming to provide inspiration for future research. © 2023 Copyright held by the owner/author(s). Publication rights licensed to ACM.

Subject headings

NATURVETENSKAP  -- Data- och informationsvetenskap -- Språkteknologi (hsv//swe)
NATURAL SCIENCES  -- Computer and Information Sciences -- Language Technology (hsv//eng)

Keyword

Biomedical domain
pre-trained language models
natural language processing

Publication and Content Type

ref (peer-reviewed)
for (research review)




