This is a RoBERTa-base model trained from scratch in Spanish.
The training dataset is mC4, subsampled at random to a total of about 50 million examples. This model continued training from a sequence length of 128, using 20,000 additional steps at a sequence length of 512.
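The random subsampling described above could, in principle, be done by streaming mC4 and keeping each document with a fixed probability. The sketch below is only an illustration under that assumption; the acceptance rate and seed are placeholders, not the values used for this model.

```python
import random
from datasets import load_dataset

# Stream the Spanish split of mC4 so the full corpus never has to fit in memory.
mc4_es = load_dataset("mc4", "es", split="train", streaming=True)

# Keep each document with a fixed probability to approximate a random subsample.
# SAMPLE_PROB is a placeholder; the real rate depends on the target of ~50M examples.
SAMPLE_PROB = 0.1
random.seed(0)

sampled = (doc for doc in mc4_es if random.random() < SAMPLE_PROB)

# Peek at a few sampled documents for this sketch.
for i, doc in enumerate(sampled):
    if i >= 5:
        break
    print(doc["text"][:100])
```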
Please see our main card for more information.
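As a minimal sketch of how a checkpoint like this could be loaded for masked-token prediction with the transformers library (the repository ID below is a placeholder, not the actual ID of this model on the Hub):

```python
from transformers import pipeline

# Placeholder repository ID; substitute the actual ID of this model on the Hub.
model_id = "your-org/roberta-base-spanish"

# RoBERTa checkpoints use "<mask>" as the mask token.
fill_mask = pipeline("fill-mask", model=model_id)
print(fill_mask("La capital de España es <mask>."))
```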
This model is part of the Flax/JAX Community Week, organised by Hugging Face, with TPU usage sponsored by Google.