---
license: apache-2.0
language:
  - 'no'
  - nn
widget:
  text: >-
    <extra_id_0> kvar veke samlar medlemene av Regjeringa seg til Statsråd på
    <extra_id_1>. Dette organet er den høgste <extra_id_2> i Noreg. For at møtet
    skal vere <extra_id_3>, må meir enn halvparten av <extra_id_4> i regjeringa
    vere til stades.
---

This is a pruned version of the google/mt5-large model: the input and output embeddings have been pruned to support a greatly reduced vocabulary. The new vocabulary contains 30K Norwegian, English, and special tokens, roughly 12% of the original size, which shrinks the model by about 37%. The model still performs reasonably on related languages such as German and Danish, but distant languages such as Arabic are no longer a good fit. It is intended as a starting point for fine-tuning mT5 for Norwegian applications.
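The mechanics of this kind of vocabulary pruning can be sketched as follows. This is a hypothetical illustration, not the script used for this model: it uses tiny made-up dimensions in place of mt5-large's actual 250K-token, 1024-dimensional embeddings, and an arbitrary subset in place of the real 30K kept tokens. The idea is simply to keep the embedding rows for the retained token ids and build an old-id-to-new-id map for remapping the tokenizer.

```python
import numpy as np

# Stand-ins for mt5-large's real sizes (~250K vocab, d_model = 1024).
old_vocab_size, d_model = 1000, 8
embeddings = np.random.rand(old_vocab_size, d_model)

# Ids of the tokens we want to keep (an arbitrary subset here;
# the real model keeps ~30K Norwegian, English, and special tokens).
kept_ids = [0, 1, 2, 5, 10, 42, 999]

# New, smaller embedding matrix: just the kept rows, in their new order.
pruned = embeddings[kept_ids]

# Old id -> new id mapping, needed to remap the tokenizer's vocabulary.
id_map = {old: new for new, old in enumerate(kept_ids)}

assert pruned.shape == (len(kept_ids), d_model)
assert np.array_equal(pruned[id_map[42]], embeddings[42])
```

In mT5 the input and output embeddings are tied and dominate the parameter count, which is why cutting the vocabulary to ~12% of its original size removes roughly 37% of the whole model.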

In addition, it has been trained on Nynorsk with masked language modeling, following the pretraining objective of the T5 paper.
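The T5 pretraining objective replaces random spans of the input with sentinel tokens (`<extra_id_0>`, `<extra_id_1>`, ...) and asks the model to emit the removed spans after the matching sentinels, which is also the format the widget example above uses. A minimal sketch of the input/target construction, with hand-picked spans and whole-word "tokens" for readability (the real objective samples spans over subword tokens):

```python
# Whitespace split as a stand-in for real subword tokenization.
tokens = "Kvar veke samlar medlemene av Regjeringa seg til statsråd".split()

# Spans to mask, as (start, end) token indices -- chosen arbitrarily here;
# the T5 paper samples them randomly (~15% corruption rate).
spans = [(0, 1), (5, 6)]

inp, tgt, cursor = [], [], 0
for i, (start, end) in enumerate(spans):
    sentinel = f"<extra_id_{i}>"
    inp.extend(tokens[cursor:start])  # keep the text before the span
    inp.append(sentinel)              # replace the span with a sentinel
    tgt.append(sentinel)              # target: sentinel, then the span
    tgt.extend(tokens[start:end])
    cursor = end
inp.extend(tokens[cursor:])           # keep the tail after the last span

print(" ".join(inp))
# <extra_id_0> veke samlar medlemene av <extra_id_1> seg til statsråd
print(" ".join(tgt))
# <extra_id_0> Kvar <extra_id_1> Regjeringa
```

The widget text at the top of this card is exactly such a corrupted input; the fine-tuned model is expected to fill in the sentinel spans in its output.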