Update README.md
README.md CHANGED
@@ -6,10 +6,7 @@ language:
 license: apache-2.0
 ---
 # XLM-RoBERTa large model whole word masking finetuned on SQuAD
-Pretrained model on English and Russian languages using a masked language modeling (MLM) objective.
-[this paper](https://arxiv.org/abs/1810.04805) and first released in
-[this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference
-
+Pretrained model on English and Russian languages using a masked language modeling (MLM) objective.

 ## Used Datasets
 SQuAD + SberQuAD
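The card above describes an extractive question-answering model finetuned on SQuAD and SberQuAD. As a sketch of how such models produce an answer: the network scores every token as a possible answer start and end, and the best valid span is decoded from those scores. The checkpoint id in the commented `transformers` snippet is a placeholder (this diff does not name the repository id), and the decoding function below is an illustrative minimal version, not code from this model card.

```python
# Sketch: SQuAD-style span decoding. An extractive QA model emits a
# start score and an end score per context token; the answer is the
# span (s, e) maximizing start[s] + end[e] with s <= e and a length cap.

def best_span(start_scores, end_scores, max_len=15):
    """Return the (start, end) token indices of the highest-scoring
    answer span, subject to start <= end < start + max_len."""
    best = (0, 0)
    best_score = float("-inf")
    for s, s_score in enumerate(start_scores):
        for e in range(s, min(s + max_len, len(end_scores))):
            score = s_score + end_scores[e]
            if score > best_score:
                best_score = score
                best = (s, e)
    return best

# Toy scores over 5 context tokens: tokens 2..3 form the best span.
start = [0.1, 0.2, 3.0, 0.1, 0.0]
end = [0.0, 0.1, 0.5, 2.5, 0.2]
print(best_span(start, end))  # (2, 3)

# Querying the real checkpoint would look like this (placeholder id):
# from transformers import pipeline
# qa = pipeline("question-answering", model="<model-id>")
# qa(question="Where do kangaroos live?", context="Kangaroos live in Australia.")
```

The length cap (`max_len`) mirrors the `max_answer_len` constraint used in standard SQuAD decoding, which prevents degenerate spans covering most of the context.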