---
license: gpl-2.0
pipeline_tag: text-generation
---

Got bored, so I used nanoGPT to train a model on all the Python snippets from https://www.kaggle.com/datasets/simiotic/github-code-snippets

The model was trained with the default train.py settings, except:

```
eval_interval = 20
eval_iters = 40
batch_size = 2
gradient_accumulation_steps = 64
```
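For a sense of scale, these settings imply a fairly large effective batch despite the small per-device batch size. The sketch below assumes nanoGPT's default `block_size` of 1024, which this README does not state:

```python
# Effective batch size implied by the settings above.
# block_size = 1024 is nanoGPT's default and is an assumption here.
batch_size = 2
gradient_accumulation_steps = 64
block_size = 1024  # assumed default, not stated in this README

# Sequences contributing to each optimizer step.
effective_batch = batch_size * gradient_accumulation_steps  # 128

# Tokens consumed per optimizer step.
tokens_per_iter = effective_batch * block_size  # 131072

print(effective_batch, tokens_per_iter)
```

So each of the 8880 iterations processed roughly 131k tokens, just spread over 64 small forward/backward passes to fit in the GPU's memory.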

This was because I was training locally on an RTX 2060 and did not have enough compute for higher settings. The current model was trained for 8880 iterations, which took around 20 hours. At first I made it save the model only after the validation loss improved, to avoid keeping an overfit checkpoint, but after a while I decided to risk it and let it save every time; luckily it worked out fine.
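The two checkpointing policies described above can be sketched as follows. This is a minimal illustration of the idea, not nanoGPT's actual code; the function name and the sample loss values are made up for the example:

```python
# Hypothetical sketch of the two checkpointing policies:
# save only when validation loss improves, vs. save on every eval.
def should_save(val_loss, best_so_far, always_save=False):
    """Return (save_now, new_best) for one evaluation step."""
    improved = val_loss < best_so_far
    return (always_save or improved), min(val_loss, best_so_far)

# Made-up validation losses at four eval steps.
val_losses = [2.0, 1.5, 1.7, 1.4]

best = float("inf")
saved_steps = []
for step, loss in enumerate(val_losses):
    save, best = should_save(loss, best, always_save=False)
    if save:
        saved_steps.append(step)

print(saved_steps)  # with always_save=False, only improving evals: [0, 1, 3]
```

With `always_save=True` every eval step would overwrite the checkpoint, which risks keeping a worse (possibly overfit) snapshot, but means no training progress is lost between improvements.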