---
license: cc-by-nc-3.0
language:
- da
pipeline_tag: fill-mask
tags:
- bert
- danish
widget:
- text: Hvide blodlegemer beskytter kroppen mod [MASK]
---
# Danish medical BERT
MeDa-BERT was initialized with weights from a pretrained Danish BERT model ([danish-bert-botxo](https://huggingface.co/Maltehb/danish-bert-botxo)) and then fine-tuned for 48 epochs with the masked language modeling (MLM) objective on a Danish medical corpus of 123M tokens. The development of the corpus and model is described further in the accompanying paper.
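For readers who want to reproduce this kind of continued pretraining, the sketch below shows one way to do it with the 🤗 Transformers `Trainer`. It is a minimal illustration, not the authors' training script: the corpus file name, batch size, and sequence length are assumptions, while the starting checkpoint and the 48 epochs come from the description above.

```python
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

# Start from the pretrained Danish BERT checkpoint, as MeDa-BERT did.
tokenizer = AutoTokenizer.from_pretrained("Maltehb/danish-bert-botxo")
model = AutoModelForMaskedLM.from_pretrained("Maltehb/danish-bert-botxo")

# "medical_corpus.txt" is a hypothetical stand-in for the 123M-token
# Danish medical corpus, one document per line.
dataset = load_dataset("text", data_files={"train": "medical_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# The collator applies dynamic token masking for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="meda-bert",
    num_train_epochs=48,  # epoch count reported above
    per_device_train_batch_size=16,  # assumption; not reported here
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```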
Here is an example of how to load the model in PyTorch using the 🤗 Transformers library:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("jannikskytt/MeDa-Bert")
model = AutoModelForMaskedLM.from_pretrained("jannikskytt/MeDa-Bert")
```
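Once loaded, the model can also be queried through the fill-mask pipeline. The sketch below is a minimal usage example based on the widget sentence from the metadata above ("White blood cells protect the body against [MASK]" in Danish):

```python
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="jannikskytt/MeDa-Bert")

# Widget example from the model card metadata:
# "Hvide blodlegemer beskytter kroppen mod [MASK]"
for prediction in fill_mask("Hvide blodlegemer beskytter kroppen mod [MASK]"):
    print(prediction["token_str"], prediction["score"])
```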
## Citing
If you find our model helpful, please consider citing it :)

```
cite
```