[CodeParrot](https://huggingface.co/lvwerra/codeparrot) uses the GPT-2 architecture with a BPE tokenizer trained on Python code. We released this model as an educational tool for training large language models from scratch on code, with detailed tutorials and descriptions of the training process. It makes use of [Accelerate](https://huggingface.co/docs/accelerate/index) for distributed training and mixed precision. See this [blog](https://huggingface.co/blog/codeparrot) and [repo](https://github.com/huggingface/transformers/tree/main/examples/research_projects/codeparrot) for more details.

| Model | # parameters |
| - | - |
| GPT2 | 110M |
| GPT2 | 1.5B |
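As a minimal sketch of how the released checkpoint can be loaded for code completion with the standard `transformers` text-generation pipeline (the checkpoint name comes from the model card linked above; the prompt is just an illustrative example):

```python
from transformers import pipeline

# Load CodeParrot as a text-generation pipeline from the Hub checkpoint
# linked above.
pipe = pipeline("text-generation", model="lvwerra/codeparrot")

# Complete an illustrative Python snippet.
prompt = "def hello_world():"
print(pipe(prompt, max_new_tokens=32)[0]["generated_text"])
```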