---
license: mit
datasets:
  - roneneldan/TinyStories
language:
  - en
tags:
  - gpt2
  - tinystories
---

# GPT2-124M-TinyStories

This is a prototype / proof-of-concept model exploring what happens when GPT-2 is pretrained exclusively on narrative text. To be clear: this is not a finetune; the model was pretrained from scratch on the TinyStories dataset.

The GPT-2 config and tokenizer, however, are unmodified from the originals.
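A minimal sketch of what "unmodified config" means in practice, assuming the standard `transformers` library: the default `GPT2Config` already matches the original GPT-2 small (12 layers, 12 heads, 768-dim embeddings, 50257-token vocabulary), so instantiating a model from it yields the ~124M-parameter architecture this model was pretrained with.

```python
from transformers import GPT2Config, GPT2LMHeadModel, GPT2TokenizerFast

# Default GPT2Config matches the original GPT-2 small architecture.
config = GPT2Config()

# Randomly initialized model of the same shape, as one would have
# before pretraining from scratch on TinyStories.
model = GPT2LMHeadModel(config)

# Confirm the parameter count lands at roughly 124M.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")
```

The unmodified GPT-2 tokenizer can be loaded the same way, e.g. `GPT2TokenizerFast.from_pretrained("gpt2")`.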