# Llama 3.2 1B – English ↔ Mandarin Translator
This model is a fine-tuned version of [meta-llama/Llama-3.2-1B](https://huggingface.co/meta-llama/Llama-3.2-1B), trained for bidirectional translation between English and Mandarin. It supports both:
- 🇬🇧 English → 🇨🇳 Mandarin
- 🇨🇳 Mandarin → 🇬🇧 English
## Format

The model expects prompts in the following format:

```
### English:
The children were playing in the park.
### Mandarin:
```

or

```
### Mandarin:
孩子们在公园里玩耍。
### English:
```
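For programmatic use, the format can be assembled with a small helper. This is a minimal sketch; `build_prompt` is illustrative and not part of the released code:

```python
def build_prompt(text: str, direction: str = "en2zh") -> str:
    """Assemble a prompt in the format the model was trained on.

    direction: "en2zh" (English → Mandarin) or "zh2en" (Mandarin → English).
    """
    if direction == "en2zh":
        return f"### English:\n{text}\n### Mandarin:\n"
    return f"### Mandarin:\n{text}\n### English:\n"
```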
## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "Sheikhaei/llama-3.2-1b-en-zh-translator", torch_dtype="auto", device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("Sheikhaei/llama-3.2-1b-en-zh-translator")

prompt = """### English:
The children were playing in the park.
### Mandarin:
"""

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=False)  # greedy decoding

# Decode only the newly generated tokens so the prompt is not echoed back.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```
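The reverse direction uses the same call; only the prompt changes:

```python
# Mandarin → English: swap the section headers.
prompt = """### Mandarin:
孩子们在公园里玩耍。
### English:
"""
```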
## Training Data
This model was fine-tuned on a custom English–Mandarin parallel dataset containing ~640,000 sentence pairs. The source data was collected from Tatoeba and then translated and expanded using the Gemma-3-12B model.
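A sketch of how a parallel pair could map onto the prompt format above, assuming the target sentence was appended directly after the second section header during fine-tuning (the actual preprocessing script is not published here):

```python
def to_training_example(en: str, zh: str, en_to_zh: bool = True) -> str:
    # Serialize a sentence pair into the prompt format shown above,
    # with the target sentence as the completion.
    if en_to_zh:
        return f"### English:\n{en}\n### Mandarin:\n{zh}"
    return f"### Mandarin:\n{zh}\n### English:\n{en}"

print(to_training_example("The children were playing in the park.", "孩子们在公园里玩耍。"))
```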
## Evaluation
| Direction | BLEU | COMET |
|---|---|---|
| English → Mandarin | 0.47 | 0.87 |
| Mandarin → English | 0.44 | 0.89 |
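A minimal sketch of computing BLEU with sacrebleu, assuming a 0–1 reporting scale and sacrebleu's Chinese tokenizer for the en→zh direction (the actual evaluation script and test set are not documented on this card):

```python
import sacrebleu

# Model outputs and gold references (toy examples here).
hypotheses = ["孩子们在公园里玩耍。"]
references = [["孩子们在公园里玩耍。"]]  # one reference set, aligned with hypotheses

# tokenize="zh" applies sacrebleu's Chinese tokenizer.
bleu = sacrebleu.corpus_bleu(hypotheses, references, tokenize="zh")
print(bleu.score / 100)  # sacrebleu reports 0-100; divide to match the table's scale
```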
## License
Apache 2.0