---
language:
- en
- code
- multilingual
license: mpl-2.0
---
A GPT-2-style neural network with 50 million parameters, trained from scratch on 16 gigabytes of Python scripts.
Made as a toy.
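The card does not state the model's hyperparameters, but the 50M figure is consistent with a small GPT-2-style configuration. The sketch below is purely illustrative: the `vocab`, `ctx`, `d`, and `n_layer` values are assumptions (not the actual config of this model), chosen to show how a parameter count near 50M can arise with tied input/output embeddings.

```python
# Hypothetical GPT-2-style configuration landing near 50M parameters.
# These hyperparameter values are illustrative assumptions, not the
# actual configuration of this model.
def gpt2_param_count(vocab: int, ctx: int, d: int, n_layer: int) -> int:
    """Approximate parameter count for a GPT-2-style decoder."""
    embed = vocab * d + ctx * d          # token + position embeddings
    per_layer = (
        4 * d * d + 4 * d                # attention: QKV + output proj, with biases
        + 8 * d * d + 5 * d              # MLP: d -> 4d -> d, with biases
        + 4 * d                          # two LayerNorms (scale + shift each)
    )
    final_ln = 2 * d
    # Output head is assumed tied to the token embeddings, adding no parameters.
    return embed + n_layer * per_layer + final_ln

total = gpt2_param_count(vocab=50257, ctx=1024, d=512, n_layer=8)
print(f"{total / 1e6:.1f}M parameters")  # -> 51.5M parameters
```

With a GPT-2 BPE vocabulary (50,257 tokens), 8 layers, and a 512-dimensional hidden state, the count comes out to roughly 51.5M, in the ballpark of the stated 50 million.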