Model Card for cibfaye/nllb-fr-wo
Model Description
This model is a fine-tuned version of facebook/nllb-200-distilled-600M on the galsenai/french-wolof-translation dataset. It is designed to translate from French to Wolof.
Evaluation
The model was evaluated on a 50-example subset of the test split of the galsenai/french-wolof-translation dataset. The evaluation metric was the BLEU score, computed with the sacrebleu library.
Evaluation Results
BLEU score: 9.17
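A minimal sketch of how such an evaluation could be reproduced with the datasets and sacrebleu libraries is shown below. The column names and the exact 50-example subset are assumptions, and the translate function used here is the one defined in the How to Use section.

from datasets import load_dataset
import sacrebleu

# Assumed setup: first 50 examples of the test split; check the dataset card
# for the actual column names ("fr" and "wo" are assumptions here).
data = load_dataset("galsenai/french-wolof-translation", split="test").select(range(50))
sources = [example["fr"] for example in data]
references = [example["wo"] for example in data]

# translate() (defined in the How to Use section) returns a list of strings.
hypotheses = [translate(src)[0] for src in sources]

bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU: {bleu.score:.2f}")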
How to Use
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "cibfaye/nllb-fr-wo"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

def translate(text, src_lang='fra_Latn', tgt_lang='wol_Latn', a=32, b=3, max_input_length=1024, num_beams=5, **kwargs):
    # Set the NLLB (FLORES-200) language codes for source and target.
    tokenizer.src_lang = src_lang
    tokenizer.tgt_lang = tgt_lang
    inputs = tokenizer(text, return_tensors='pt', padding=True, truncation=True, max_length=max_input_length)
    result = model.generate(
        **inputs.to(model.device),
        # Force the decoder to start with the target-language token.
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(tgt_lang),
        # Allow roughly a + b output tokens per input token.
        max_new_tokens=int(a + b * inputs.input_ids.shape[1]),
        num_beams=num_beams,
        **kwargs
    )
    return tokenizer.batch_decode(result, skip_special_tokens=True)
text = "Votre texte en français ici."
translation = translate(text)
print(translation)
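Because the tokenizer is called with padding=True and the output is decoded with batch_decode, the same function can also translate a list of sentences in one call, for example:

sentences = ["Bonjour, comment allez-vous ?", "Merci beaucoup."]
print(translate(sentences))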