---
language:
- es
license: "cc-by-4.0"
tags:
- "national library of spain"
- "spanish"
- "bne"
- "qa"
- "question answering"
datasets:
- "BSC-TeMU/SQAC"
metrics:
- "f1"
---
# Spanish RoBERTa-large trained on BNE, fine-tuned on the Spanish Question Answering Corpus (SQAC)

RoBERTa-large-bne is a transformer-based masked language model for the Spanish language. It is based on the [RoBERTa](https://arxiv.org/abs/1907.11692) large model and has been pre-trained on the largest Spanish corpus known to date, a total of 570 GB of clean, deduplicated text processed for this work, compiled from the web crawls performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.

The original pre-trained model can be found here: https://huggingface.co/BSC-TeMU/roberta-large-bne

## Dataset

The dataset used is the one from the [SQAC corpus](https://huggingface.co/datasets/BSC-TeMU/SQAC).
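As a minimal usage sketch, the fine-tuned model can be loaded with the `transformers` question-answering pipeline. The model identifier below is an assumption (this README does not name its own repository id); substitute the actual id of this model.

```python
# Minimal usage sketch with the Hugging Face question-answering pipeline.
# NOTE: the model id below is an assumption, not confirmed by this README.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="BSC-TeMU/roberta-large-bne-sqac",  # hypothetical repository id
)

# The pipeline extracts the answer span from the given context.
result = qa(
    question="¿Dónde vivo?",
    context="Me llamo Sarah y vivo en Londres.",
)
print(result["answer"])
```

The pipeline returns a dict with the extracted `answer` string, a confidence `score`, and the `start`/`end` character offsets of the span in the context.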

## Evaluation and results

F1 score: 0.7993 (average of 5 runs).

For evaluation details, visit our [GitHub repository](https://github.com/PlanTL-SANIDAD/lm-spanish).
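For intuition, extractive QA is commonly scored with the SQuAD-style token-overlap F1, the harmonic mean of token precision and recall between the predicted and reference answer spans. The sketch below is a generic illustration of that metric under simple whitespace tokenization, not the project's exact evaluation script (see the GitHub repository for that).

```python
# Generic SQuAD-style token-overlap F1 between a predicted and a
# reference answer span (whitespace tokenization, for illustration only).
from collections import Counter

def f1_score(prediction: str, reference: str) -> float:
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    # Multiset intersection counts each shared token at most as often
    # as it appears in both spans.
    common = Counter(pred_tokens) & Counter(ref_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

# Partial overlap: precision 1.0, recall 2/3 -> F1 = 0.8
print(f1_score("en Londres", "vivo en Londres"))  # → 0.8
```

In full SQuAD evaluation the score is additionally normalized (punctuation and article stripping) and taken as the maximum over all gold answers per question.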
|
31 |
+
|
32 |
+
## Citing
|
33 |
+
Check out our paper for all the details: https://arxiv.org/abs/2107.07253
|
34 |
+
```
|
35 |
+
@misc{gutierrezfandino2021spanish,
|
36 |
+
title={Spanish Language Models},
|
37 |
+
author={Asier Gutiérrez-Fandiño and Jordi Armengol-Estapé and Marc Pàmies and Joan Llop-Palao and Joaquín Silveira-Ocampo and Casimiro Pio Carrino and Aitor Gonzalez-Agirre and Carme Armentano-Oller and Carlos Rodriguez-Penagos and Marta Villegas},
|
38 |
+
year={2021},
|
39 |
+
eprint={2107.07253},
|
40 |
+
archivePrefix={arXiv},
|
41 |
+
primaryClass={cs.CL}
|
42 |
+
}
|
43 |
+
```
|