
Silvina Ocampo - Exploration Model

Model description

This model is a fine-tuned version of DeepESP/gpt2-spanish, trained on a dataset of Latin American authors curated by Karen Palacio (https://github.com/karen-pal/borges), from which Silvina Ocampo's work was selected. It was created during a workshop on LLM exploration by members of LAIA (laia.ar), following as a group the LLM Adaptation Workshop by Fundación Via Libre (https://github.com/nanom/llm_adaptation_workshop).

It achieves the following results on the evaluation set:

  • Loss: 2.2787
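
For quick experimentation, a minimal generation sketch using the transformers pipeline API; the Hub repo id squareoctopus/ocampo is taken from this page, and the prompt and sampling settings are purely illustrative:

```python
from transformers import pipeline

# Load the fine-tuned model from the Hugging Face Hub.
generator = pipeline("text-generation", model="squareoctopus/ocampo")

# Illustrative prompt and sampling settings; adjust freely.
prompt = "En el jardín de la casa"
outputs = generator(prompt, max_new_tokens=60, do_sample=True, top_p=0.95)
print(outputs[0]["generated_text"])
```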

Intended uses & limitations

This model was fine-tuned as an educational exercise.

Training and evaluation data

  • See https://github.com/karen-pal/borges for the datasets. For this exercise, one short story not available in the original data was also added; a hypothetical loading sketch follows.
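
Purely illustrative: the file name and the column names below (author, text) are assumptions for the sketch, not the actual schema of the borges repository.

```python
import pandas as pd

# Hypothetical: assumes a CSV export of the dataset with
# "author" and "text" columns.
df = pd.read_csv("borges_dataset.csv")

# Keep only texts attributed to Silvina Ocampo for fine-tuning.
ocampo_texts = df[df["author"] == "Silvina Ocampo"]["text"].tolist()
print(f"{len(ocampo_texts)} texts selected")
```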

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent Trainer setup follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 4
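
A minimal sketch of how these hyperparameters map onto the Hugging Face Trainer API. The Trainer setup itself is an assumption inferred from the auto-generated card format; the output directory and the tiny placeholder corpus are hypothetical, standing in for the tokenized Ocampo texts.

```python
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("DeepESP/gpt2-spanish")
model = AutoModelForCausalLM.from_pretrained("DeepESP/gpt2-spanish")
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 defines no pad token

# Placeholder corpus; the real data is the Silvina Ocampo subset of the
# borges dataset linked above, split into train/eval sets.
train_dataset = [tokenizer(t) for t in ["Texto de entrenamiento de ejemplo."]]
eval_dataset = [tokenizer(t) for t in ["Texto de evaluación de ejemplo."]]

# Hyperparameters exactly as listed above; the Adam betas/epsilon in this
# card are the transformers defaults, so they need no explicit setting.
args = TrainingArguments(
    output_dir="ocampo",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=4,
    evaluation_strategy="epoch",  # matches the per-epoch results table below
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```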

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.6172        | 1.0   | 58   | 2.4691          |
| 2.1861        | 2.0   | 116  | 2.3365          |
| 1.9253        | 3.0   | 174  | 2.2929          |
| 2.0581        | 4.0   | 232  | 2.2787          |
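
The reported losses are mean cross-entropy in nats per token, so validation perplexity is exp(loss); the final loss of 2.2787 corresponds to a perplexity of roughly 9.8:

```python
import math

# Convert each epoch's validation loss above to perplexity.
for epoch, loss in [(1, 2.4691), (2, 2.3365), (3, 2.2929), (4, 2.2787)]:
    print(f"epoch {epoch}: perplexity = {math.exp(loss):.2f}")
# epoch 4: perplexity = 9.76
```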

Framework versions

  • Transformers 4.36.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.15.0
  • Tokenizers 0.15.0