Update README.md
README.md
CHANGED
@@ -18,7 +18,7 @@ This model is fine-tuned on [multilingual BERT](https://huggingface.co/bert-base
 
 ### Model Description
 
-GalBERT for Semantic Role Labeling (SRL) is a transformers model, leveraging mBERT's extensive pretraining on 104 languages to achieve better SRL predictions for low-resource Galician. This model is cased: it makes a difference between english and English. It was fine-tuned with the following objectives:
+GalBERT for Semantic Role Labeling (SRL) is a transformers model, leveraging mBERT's extensive pretraining on 104 languages to achieve better SRL predictions for low-resource Galician. This model is cased: it makes a difference between english and English. It was fine-tuned on Galician with the following objectives:
 
 - Identify up to 13 verbal roots within a sentence.
 - Identify available arguments for each verbal root. Due to scarcity of data, this model focused solely on the identification of arguments 0, 1, and 2.
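
For context, here is a minimal sketch of how a token-classification SRL model like the one described in this README could be loaded and queried with the transformers library. The Hub repo id, the Galician example sentence, and the aggregation setting are assumptions for illustration only; they do not come from this commit.

```python
# Minimal usage sketch. Assumptions: "your-username/galbert-srl" is a placeholder
# repo id, and the model exposes its SRL tags (verbal roots, arguments 0-2)
# through a standard token-classification head.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "your-username/galbert-srl"  # placeholder, replace with the actual Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Group subword predictions into whole-word spans for readability.
srl = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Example Galician sentence (hypothetical input).
print(srl("O can comeu a mazá no xardín."))
```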