GPT2-124M-TinyStories
This is a prototype / proof-of-concept model exploring what GPT2 looks like when pretrained exclusively on narrative text. This is not a finetune: the model was pretrained from scratch, entirely on TinyStories.
The GPT2 config and tokenizer are, however, unmodified from the original.
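Because the config is the stock one, the architecture matches the GPT-2 small (124M-parameter) defaults. A quick sketch of how to inspect those defaults with `transformers` (assuming the library is installed; the values shown are the library's standard GPT2 defaults, not anything specific to this checkpoint):

```python
from transformers import GPT2Config

# Stock GPT-2 small hyperparameters, which this model keeps unchanged
config = GPT2Config()
print(config.n_layer)      # 12 transformer blocks
print(config.n_head)       # 12 attention heads
print(config.n_embd)       # 768 hidden size
print(config.vocab_size)   # 50257 BPE vocabulary (standard GPT2 tokenizer)
print(config.n_positions)  # 1024 context length
```

The same defaults apply to the tokenizer, so any standard GPT2 tokenizer should round-trip text for this model.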