A GPT-2 model (tokenizer included) trained from scratch on some of my favorite books, about 31M words in total.
It was trained on a single RTX 3090 for only three hours, so don't take it too seriously; just have fun!
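The tokenizer is likewise trained from scratch on the same corpus. The actual script is not published, so the following is only a minimal sketch of how a byte-level BPE tokenizer could be trained with the tokenizers library; the corpus file, vocabulary size, and output directory are assumptions.

```python
from tokenizers import ByteLevelBPETokenizer

# Hypothetical corpus file, vocab size, and output directory;
# none of these are published with the model.
tokenizer = ByteLevelBPETokenizer()
tokenizer.train(
    files=["books.txt"],
    vocab_size=50257,                  # GPT-2's default size, assumed here
    special_tokens=["<|endoftext|>"],
)
tokenizer.save_model("tokenizer_dir")  # writes vocab.json and merges.txt
```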
Training hyperparameters (a reproduction sketch follows the list):
- peak learning rate: 4e-4
- global batch size: 32
- weight decay: 0.01
- training steps: 25k
- warmup steps: 1k
- learning-rate decay: cosine
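These settings map fairly directly onto the Hugging Face Trainer. Below is a minimal sketch of how such a run could look; the corpus file, tokenizer directory, context length, and output directory are assumptions, since the original training script is not published.

```python
from datasets import load_dataset
from transformers import (DataCollatorForLanguageModeling, GPT2Config,
                          GPT2LMHeadModel, GPT2TokenizerFast, Trainer,
                          TrainingArguments)

block_size = 1024  # assumed context length (GPT-2 default)

# Hypothetical tokenizer files and corpus path.
tokenizer = GPT2TokenizerFast(vocab_file="tokenizer_dir/vocab.json",
                              merges_file="tokenizer_dir/merges.txt")
tokenizer.pad_token = tokenizer.eos_token

raw = load_dataset("text", data_files={"train": "books.txt"})

def tokenize_and_chunk(batch):
    # Concatenate all text and split it into fixed-length blocks.
    ids = tokenizer(batch["text"])["input_ids"]
    concatenated = [tok for seq in ids for tok in seq]
    total = (len(concatenated) // block_size) * block_size
    return {"input_ids": [concatenated[i:i + block_size]
                          for i in range(0, total, block_size)]}

train_dataset = raw["train"].map(tokenize_and_chunk, batched=True,
                                 remove_columns=["text"])

config = GPT2Config(vocab_size=len(tokenizer))  # default GPT-2 small architecture
model = GPT2LMHeadModel(config)                 # random init, trained from scratch

args = TrainingArguments(
    output_dir="fantasygpt",
    learning_rate=4e-4,               # peak lr from the card
    per_device_train_batch_size=32,   # global batch size 32, assuming a single GPU
    weight_decay=0.01,
    max_steps=25_000,
    warmup_steps=1_000,
    lr_scheduler_type="cosine",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```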
Example usage:
from transformers import AutoTokenizer, GPT2LMHeadModel
tokenizer = AutoTokenizer.from_pretrained('Geralt-Targaryen/FantasyGPT')
model = GPT2LMHeadModel.from_pretrained('Geralt-Targaryen/FantasyGPT')
input_text = ["Daenerys kissed Gandalf, as the witcher hacked off Lord Voldemort's head with a brutal swing of Longclaw."]
input_tokenized = tokenizer(input_text, return_tensors='pt')
output = model.generate(**input_tokenized, max_new_tokens=256, do_sample=True, top_p=0.9, temperature=1.0)
print(tokenizer.decode(output[0]))
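The text-generation pipeline also works with this checkpoint; the prompt below is just an illustration, not from the original card.

```python
from transformers import pipeline

generator = pipeline("text-generation", model="Geralt-Targaryen/FantasyGPT")
result = generator(
    "The witcher drew his silver sword",  # illustrative prompt
    max_new_tokens=128,
    do_sample=True,
    top_p=0.9,
)
print(result[0]["generated_text"])
```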
Sample output (from the example prompt above):
Daenerys kissed Gandalf, as the witcher hacked off Lord Voldemort's head with a brutal swing of Longclaw. “Do you know what a warrior like that will do?”
“I am a knight of seven,” Geralt said. “And how is this knight?”
“Prince, it is known,” replied the witcher, “I am a knight of Solamnia, not a Knight. A knight of Solamnia in the name of Reorx, with elven armies at the head of his knights and knights. You do not even remember my name, elf. It was a good call.”
“That name,” said the witcher, “what does it mean, elf?”
“Some story,” said the old knight. “A good story from the Cataclysm.”
The witcher snorted and looked at the witcher. “That is how you feel, elf. You don't understand why the knights won't be executed for them.”
“That's because the knight is a king,” Chireadan said finally, “the king's elder son. Because he is, the knights would also be honorable with the title of knighthood. But a warrior would be better suited to have the honor to win.”
“But…”
“It's true,” said the old knight, “that, isn't entirely what you want. The lance of your sword is one of the best