TFG Collection
Datasets and models leveraged and developed during my final degree work (TFG). Info and code can be found at https://github.com/enriquesaou/tfg-lm-qa
This model is a fine-tuned version of FacebookAI/roberta-base on an unknown dataset; its per-epoch validation losses on the evaluation set are reported in the training results table below.

Model description: more information needed.
Intended uses & limitations: more information needed.
Training and evaluation data: more information needed.
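Assuming this checkpoint is an extractive question-answering fine-tune (as the collection's tfg-lm-qa repository suggests), a minimal usage sketch with the transformers question-answering pipeline is shown below. The model id is a placeholder, not the actual repository name.

```python
from transformers import pipeline

# Placeholder model id: substitute the actual checkpoint from this collection.
qa = pipeline("question-answering", model="enriquesaou/roberta-base-qa-tfg")

result = qa(
    question="Where can the TFG code be found?",
    context="Info and code can be found at https://github.com/enriquesaou/tfg-lm-qa",
)
print(result["answer"], result["score"])
```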
Training hyperparameters: more information needed.

Training results:
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.7637        | 1.0   | 1399 | 1.5842          |
| 1.3627        | 2.0   | 2798 | 1.6059          |
| 1.1236        | 3.0   | 4197 | 1.6199          |
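For reference, the snippet below is a minimal sketch of how a fine-tune like this one is typically produced with the Hugging Face Trainer, assuming an extractive QA objective. The dataset (SQuAD as a stand-in), the preprocessing, and the learning rate, batch size, and weight decay are assumptions not recorded in this card; only the three epochs and the per-epoch evaluation are taken from the table above.

```python
# Sketch only: dataset, preprocessing and most hyperparameters are assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
    default_data_collator,
)

base_model = "FacebookAI/roberta-base"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForQuestionAnswering.from_pretrained(base_model)

# Assumption: a SQuAD-style extractive QA dataset; the actual training data is not named.
raw = load_dataset("squad")
max_length = 384

def preprocess(examples):
    # Tokenize (question, context) pairs and map answer character spans to token indices.
    enc = tokenizer(
        examples["question"],
        examples["context"],
        truncation="only_second",
        max_length=max_length,
        padding="max_length",
        return_offsets_mapping=True,
    )
    starts, ends = [], []
    for i, offsets in enumerate(enc["offset_mapping"]):
        answer = examples["answers"][i]
        start_char = answer["answer_start"][0]
        end_char = start_char + len(answer["text"][0])
        seq_ids = enc.sequence_ids(i)
        ctx_start = seq_ids.index(1)
        ctx_end = len(seq_ids) - 1 - seq_ids[::-1].index(1)
        if offsets[ctx_start][0] > start_char or offsets[ctx_end][1] < end_char:
            # Answer was truncated away: label the CLS token.
            starts.append(0)
            ends.append(0)
        else:
            idx = ctx_start
            while idx <= ctx_end and offsets[idx][0] <= start_char:
                idx += 1
            starts.append(idx - 1)
            idx = ctx_end
            while idx >= ctx_start and offsets[idx][1] >= end_char:
                idx -= 1
            ends.append(idx + 1)
    enc["start_positions"] = starts
    enc["end_positions"] = ends
    enc.pop("offset_mapping")
    return enc

train_ds = raw["train"].map(preprocess, batched=True, remove_columns=raw["train"].column_names)
eval_ds = raw["validation"].map(preprocess, batched=True, remove_columns=raw["validation"].column_names)

args = TrainingArguments(
    output_dir="roberta-base-qa-tfg",  # hypothetical output directory
    num_train_epochs=3,                # implied by the results table
    eval_strategy="epoch",             # per-epoch validation, as in the table
                                       # (older transformers: evaluation_strategy)
    learning_rate=2e-5,                # assumption, not recorded in the card
    per_device_train_batch_size=16,    # assumption, not recorded in the card
    weight_decay=0.01,                 # assumption, not recorded in the card
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    data_collator=default_data_collator,
)
trainer.train()
```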
Base model: FacebookAI/roberta-base