---
license: mit
datasets:
- roneneldan/TinyStories
language:
- en
tags:
- gpt2
- tinystories
---
# GPT2-124M-TinyStories
This is a prototype / proof-of-concept model exploring what pretraining GPT2 exclusively on narrative text looks like.
That's right: this isn't a finetune; the model is pretrained entirely on TinyStories.
The GPT2 config and tokenizer are, however, unmodified from the originals.
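Since the config is the stock GPT2 small config, the "124M" in the name follows directly from the standard hyperparameters (vocab_size=50257, n_positions=1024, n_embd=768, n_layer=12). A quick sketch of that arithmetic, assuming the usual weight tying between the LM head and the token embedding:

```python
# Parameter count for the unmodified GPT2 small config.
V, P, D, L = 50257, 1024, 768, 12   # vocab, positions, hidden size, layers

embeddings = V * D + P * D          # token + position embeddings
per_layer = (
    2 * (2 * D)                     # two LayerNorms (weight + bias each)
    + (D * 3 * D + 3 * D)           # fused q/k/v projection
    + (D * D + D)                   # attention output projection
    + (D * 4 * D + 4 * D)           # MLP up-projection
    + (4 * D * D + D)               # MLP down-projection
)
final_ln = 2 * D
total = embeddings + L * per_layer + final_ln
print(f"{total:,}")                 # 124,439,808 -- the lm_head adds nothing when tied
```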