---
license: apache-2.0
datasets:
- mrm8488/CHISTES_spanish_jokes
language:
- es
pipeline_tag: text-generation
---
# Adapter for BERTIN-GPT-J-6B-ES fine-tuned on Spanish jokes for joke generation
## Adapter Description
This adapter was created with the PEFT library. It fine-tunes the base model BERTIN-GPT-J-6B-ES on the dataset mrm8488/CHISTES_spanish_jokes for Spanish joke generation, using the LoRA method.
## Model Description
BERTIN-GPT-J-6B is a Spanish fine-tuned version of GPT-J 6B, a transformer model trained using Ben Wang's Mesh Transformer JAX. "GPT-J" refers to the class of model, while "6B" represents the number of trainable parameters.
## Training data
Dataset from a workshop on introductory NLP with Spanish jokes: mrm8488/CHISTES_spanish_jokes. More information needed.
## Training procedure
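The exact training hyperparameters are not documented here. As an illustration only, a typical PEFT LoRA configuration for a GPT-J-style causal language model might look like the following; the rank, alpha, dropout, and target modules are assumptions, not the values actually used:

```python
# Illustrative LoRA setup — all hyperparameters below are assumed, not the
# documented training configuration for this adapter.
from peft import LoraConfig, TaskType

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,   # causal language modeling task
    r=8,                            # low-rank dimension (assumed)
    lora_alpha=32,                  # scaling factor (assumed)
    lora_dropout=0.05,              # dropout on LoRA layers (assumed)
    target_modules=["q_proj", "v_proj"],  # typical GPT-J attention projections (assumed)
)
```

This config would then be applied to the loaded base model with `peft.get_peft_model(model, lora_config)` before training.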
## How to use
More information needed.