Release 1.0-beta (April 29, 2021)

NB-BERT-large (beta)

Description

NB-BERT-large is a general BERT-large model built on the large digital collection at the National Library of Norway.

This model is trained from scratch on a wide variety of Norwegian text (both bokmål and nynorsk) from the last 200 years using a monolingual Norwegian vocabulary.

Intended use & limitations

The 1.0 version of the model is general and should be fine-tuned for any particular downstream use. Some fine-tuning datasets can be found in the NoTraM repository on GitHub: https://github.com/NBAiLab/notram
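As a rough sketch of what such fine-tuning looks like, the model can be loaded with the Hugging Face `transformers` library and wrapped with a task head. The label count and the example sentence below are hypothetical placeholders, not part of the model card, and running this requires downloading the model weights.

```python
# Hypothetical sketch: preparing NB-BERT-large for sequence classification
# with Hugging Face transformers. num_labels=2 is an assumed binary task.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("NbAiLab/nb-bert-large")
model = AutoModelForSequenceClassification.from_pretrained(
    "NbAiLab/nb-bert-large",
    num_labels=2,  # placeholder: set to the number of classes in your task
)

# A quick forward pass on a Norwegian sentence (placeholder text)
inputs = tokenizer("Nasjonalbiblioteket ligger i Oslo.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # one logit per class, here shape (1, 2)
```

The classification head is randomly initialized, so the logits are meaningless until the model is trained on labeled data, for example with the `Trainer` API.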

Training data

The model is trained on a wide variety of text. The training set is described in the NoTraM repository: https://github.com/NBAiLab/notram

More information

For more information on the model, see https://github.com/NBAiLab/notram

Model size: 356M parameters (safetensors; I64 and F32 tensor types).