Lucky52 Collection
Ji, S., & Chen, P. (2024). Lucky 52: How Many Languages Are Needed to Instruction Fine-Tune Large Language Models? https://arxiv.org/abs/2404.04850
This HF repository hosts multilingual BLOOM models instruction-fine-tuned on the parallel instruction dataset Bactrian-X, which covers 52 languages. We progressively add one language at a time during instruction fine-tuning, training 52 models in total, and then evaluate those models on three multilingual benchmarks.
Please refer to our paper for more details.
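The collection holds one checkpoint per language count. Judging from the example checkpoint below, the repository IDs appear to follow the pattern MaLA-LM/lucky52-bloom-7b1-no-{N}, where N is the number of languages included during fine-tuning; the short sketch below enumerates them under that assumption (the naming pattern is inferred from the example, not stated in this card).

# Assumption: checkpoint IDs follow "lucky52-bloom-7b1-no-{N}" for N = 1..52,
# where N is the number of languages used during instruction fine-tuning.
checkpoints = [f"MaLA-LM/lucky52-bloom-7b1-no-{n}" for n in range(1, 53)]
print(checkpoints[34])  # MaLA-LM/lucky52-bloom-7b1-no-35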
The model checkpoints can be loaded with the transformers library:
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-35")
model = AutoModelForCausalLM.from_pretrained("MaLA-LM/lucky52-bloom-7b1-no-35")
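As a minimal inference sketch (not part of the original card), generation can then be run with the loaded tokenizer and model; the prompt below is an illustrative placeholder, and the actual instruction template used during fine-tuning may differ.

# Hedged usage sketch: the prompt is a placeholder, not the official
# Bactrian-X instruction template.
prompt = "Translate the following sentence into French: Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))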
@misc{lucky52,
  title = "Lucky 52: How Many Languages Are Needed to Instruction Fine-Tune Large Language Models?",
  author = "Shaoxiong Ji and Pinzhen Chen",
  year = "2024",
  eprint = "2404.04850",
  archivePrefix = "arXiv",
  primaryClass = "cs.CL"
}