---
license: mit
datasets:
- roneneldan/TinyStories
language:
- en
tags:
- gpt2
- tinystories
---

# GPT2-124M-TinyStories

This is a prototype / proof-of-concept model exploring what happens when GPT2 is pretrained exclusively on narrative text.
That's right: this isn't a finetune. It was pretrained from scratch, entirely on TinyStories.

The GPT2 config and tokenizer are, however, unmodified from the original.
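
Because the config and tokenizer are stock GPT2, the model should load with the standard `transformers` API. Here is a minimal generation sketch; the repo ID is a placeholder for this model's actual Hub path:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo ID -- replace with this model's actual Hub path.
model_id = "your-username/GPT2-124M-TinyStories"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# A TinyStories-style prompt: simple, child-level narrative English.
prompt = "Once upon a time, there was a little dog named"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # GPT2 has no pad token by default
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the model only ever saw TinyStories during pretraining, expect it to continue prompts in that simple storybook register rather than as a general-purpose language model.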