---
language: en
tags:
- biomedical
- lexical-semantics
datasets:
- UMLS
---

### SapBERT-PubMedBERT

SapBERT by [Liu et al. (2020)](https://arxiv.org/pdf/2010.11784.pdf), trained with [UMLS](https://www.nlm.nih.gov/research/umls/licensedcontent/umlsknowledgesources.html) 2020AA (English only), using [microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext) as the base model. Use the [CLS] token embedding as the representation of the input.

### Citation

```bibtex
@article{liu2020self,
  title={Self-alignment Pre-training for Biomedical Entity Representations},
  author={Liu, Fangyu and Shareghi, Ehsan and Meng, Zaiqiao and Basaldella, Marco and Collier, Nigel},
  journal={arXiv preprint arXiv:2010.11784},
  year={2020}
}
```
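
### Usage

A minimal sketch of how the model might be loaded with the `transformers` library; the model ID below (`cambridgeltl/SapBERT-from-PubMedBERT-fulltext`) is an assumption standing in for this card's repository and should be replaced if it differs. The snippet encodes a few entity names and takes the [CLS] embedding of each as its representation, as described above.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed model ID for illustration; substitute the actual repository name if different.
MODEL_ID = "cambridgeltl/SapBERT-from-PubMedBERT-fulltext"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

# Example biomedical entity names to embed.
names = ["covid-19", "myocardial infarction"]
inputs = tokenizer(names, padding=True, truncation=True, max_length=25, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Take the [CLS] token (position 0) as the representation of each input, per the card.
cls_embeddings = outputs.last_hidden_state[:, 0, :]
print(cls_embeddings.shape)  # e.g. torch.Size([2, 768])
```

The resulting vectors can then be compared with cosine similarity or a nearest-neighbour search for the kind of lexical-semantic matching this model targets (cf. the `lexical-semantics` tag).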