---
license: openrail
library_name: transformers
datasets:
- nuprl/MultiPL-T
---
# MultiPL-T StarCoder2-15b
This repository holds several fine-tunes of [StarCoder2-15b](https://huggingface.co/bigcode/starcoder2-15b), each trained on MultiPL-T data.
Examine the commit message to determine the language and checkpoint; we provide a checkpoint for each epoch.
For more information about the training process, see the MultiPL-T paper:
```
@misc{cassano:multipl-t,
  title={Knowledge Transfer from High-Resource to Low-Resource Programming Languages for Code LLMs},
  author={Federico Cassano and John Gouwar and Francesca Lucchetti and Claire Schlesinger and Anders Freeman and Carolyn Jane Anderson and Molly Q Feldman and Michael Greenberg and Abhinav Jangda and Arjun Guha},
  year={2024},
  eprint={2308.09895},
  archivePrefix={arXiv},
  primaryClass={cs.PL}
}
```
For usage instructions, see the model card for the original StarCoder2-15b model. Replace the model name with the name of this repository and set `revision=COMMIT_HASH`, as in the sketch below.
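For example, a minimal loading sketch with the Transformers library (the repository name and commit hash are placeholders; substitute this repository's name and a hash from its commit history):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholders: replace with this repository's name and a commit hash
# from its history (each commit corresponds to a language/epoch checkpoint).
model_name = "THIS_REPOSITORY"
commit_hash = "COMMIT_HASH"

tokenizer = AutoTokenizer.from_pretrained(model_name, revision=commit_hash)
model = AutoModelForCausalLM.from_pretrained(model_name, revision=commit_hash)

# Generate a short completion from a code prompt.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Pinning `revision` to a specific commit ensures you load the intended language and epoch checkpoint rather than whatever the default branch currently points to.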