BERT fine-tuned for Named Entity Recognition in Danish

The model tags tokens in Danish sentences with named-entity tags in BIO format, using four entity types: PER, ORG, LOC, and MISC. The pretrained language model used for fine-tuning is the Danish BERT by BotXO.
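To illustrate what BIO-format output looks like, here is a minimal sketch of decoding BIO tags into entity spans. The sentence and tags below are hand-written examples, not actual output from this model:

```python
def decode_bio(tokens, tags):
    """Group BIO-tagged tokens into (entity_type, text) spans."""
    spans = []
    current_type, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A "B-" tag begins a new entity, closing any open one
            if current_type:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # An "I-" tag continues the current entity of the same type
            current_tokens.append(token)
        else:
            # "O" (or a mismatched "I-") closes any open entity
            if current_type:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_type:
        spans.append((current_type, " ".join(current_tokens)))
    return spans

# Illustrative example, not model output
tokens = ["Jens", "Hansen", "bor", "i", "København"]
tags = ["B-PER", "I-PER", "O", "O", "B-LOC"]
print(decode_bio(tokens, tags))  # → [('PER', 'Jens Hansen'), ('LOC', 'København')]
```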

See the DaNLP documentation for more details.

Here is how to use the model:

```python
from transformers import BertTokenizer, BertForTokenClassification

# Load the fine-tuned NER model and its tokenizer from the Hugging Face Hub
model = BertForTokenClassification.from_pretrained("alexandrainst/da-ner-base")
tokenizer = BertTokenizer.from_pretrained("alexandrainst/da-ner-base")
```
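Note that BERT tokenizers split words into wordpieces, so the model predicts one tag per piece rather than per word. A common convention is to keep the tag of each word's first piece when mapping predictions back to words. The sketch below uses hand-written wordpieces and tags to show the idea, not actual model output:

```python
def merge_wordpieces(pieces, tags):
    """Collapse wordpiece-level tags back to word-level tags,
    keeping the tag of each word's first piece (a common convention)."""
    words, word_tags = [], []
    for piece, tag in zip(pieces, tags):
        if piece.startswith("##"):
            # "##" marks a continuation piece; glue it onto the previous word
            words[-1] += piece[2:]
        else:
            words.append(piece)
            word_tags.append(tag)
    return list(zip(words, word_tags))

# Illustrative wordpieces and tags, not actual model output
pieces = ["Køben", "##havn", "er", "stor"]
tags = ["B-LOC", "I-LOC", "O", "O"]
print(merge_wordpieces(pieces, tags))  # → [('København', 'B-LOC'), ('er', 'O'), ('stor', 'O')]
```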

Training Data

The model has been trained on the DaNE dataset.
