Journal article
Lightweight transformers for clinical natural language processing
Abstract
Specialised pre-trained language models are becoming increasingly common in Natural Language Processing (NLP) since they can potentially outperform models trained on generic texts. BioBERT (Sanh et al., DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108, 2019) and BioClinicalBERT (Alsentzer et al., Publicly available clinical BERT embeddings. In Proceedings of the 2nd Clinical Natural …
Authors
Rohanian O; Nouriborji M; Jauncey H; Kouchaki S; Nooralahzadeh F; ISARIC Clinical Characterisation Group; Clifton L; Merson L; Clifton DA
Journal
Natural Language Engineering, Vol. 30, No. 5, pp. 887–914
Publisher
Cambridge University Press (CUP)
Publication Date
September 2024
DOI
10.1017/s1351324923000542
ISSN
1351-3249