# roberta-large-bne-nubes

This model is a fine-tuned version of roberta-large-bne on the NUBes dataset, used as a benchmark in the paper TODO. The model achieves an F1 score of 0.911.

Please refer to the original publication for more information: TODO LINK
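A minimal usage sketch with the `transformers` library is shown below. The checkpoint name is taken from this card's collection entry; NUBes is a Spanish clinical corpus for negation and uncertainty detection, so a token-classification head is assumed here, and the example sentence is illustrative only.

```python
from transformers import pipeline

# Assumption: the fine-tuned checkpoint exposes a token-classification head,
# as is typical for NUBes-style negation/uncertainty span detection.
nlp = pipeline(
    "token-classification",
    model="IIC/roberta-large-bne-nubes",
    aggregation_strategy="simple",  # merge subword tokens into word-level spans
)

# Hypothetical Spanish clinical sentence ("The patient does not present fever.")
print(nlp("El paciente no presenta fiebre."))
```

`aggregation_strategy="simple"` groups wordpiece predictions into readable spans; drop it to inspect raw per-token labels.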

## Parameters used

| Parameter                | Value |
|--------------------------|-------|
| batch size               | 16    |
| learning rate            | 1e-05 |
| classifier dropout       | 0     |
| warmup ratio             | 0     |
| warmup steps             | 0     |
| weight decay             | 0     |
| optimizer                | AdamW |
| epochs                   | 10    |
| early stopping patience  | 3     |
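The table above maps naturally onto Hugging Face `TrainingArguments`. The sketch below is an assumption about how the run was configured, not the authors' actual training script; `output_dir` and the per-epoch evaluation/saving strategy are illustrative choices.

```python
from transformers import TrainingArguments, EarlyStoppingCallback

# Sketch of the hyperparameters listed in the table above.
args = TrainingArguments(
    output_dir="roberta-large-bne-nubes",  # illustrative path
    per_device_train_batch_size=16,
    learning_rate=1e-5,
    warmup_ratio=0.0,
    warmup_steps=0,
    weight_decay=0.0,
    num_train_epochs=10,
    optim="adamw_torch",                   # AdamW optimizer
    evaluation_strategy="epoch",           # assumed: evaluate once per epoch
    save_strategy="epoch",
    load_best_model_at_end=True,           # required for early stopping
)

# Early stopping with patience 3, passed to the Trainer via callbacks=[...].
early_stopping = EarlyStoppingCallback(early_stopping_patience=3)
```

The classifier dropout of 0 is set on the model config (e.g. `classifier_dropout=0.0` when loading the model) rather than in `TrainingArguments`.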

## BibTeX entry and citation info

TODO