LYRICAL Mini MT Model
Russian | English Songs & Poems by Silver Age Poets
Machine Translation / Extrapolation
Variant 0.1m: Full ORPO fine-tune of the MiniPLM-llama3.1-212M model weights
Model Card
This model is a fine-tune of MiniLLM/MiniPLM-llama3.1-212M.
NOTE: This is one of our many experimental/WIP test variants trained over numerous base models in a range of small-to-medium sizes. By empirically cross-comparing these foundations through training, versioning, and testing, we iterate towards a reliable realization of our concept.
The model was fine-tuned with Odds Ratio Preference Optimization (ORPO) on our Russian-to-English song-lyrics translation/localization dataset.
Our aim with this work is to develop a translation model capable of adaptively localizing the idiomatic, formal/poetic/rhythmic, and performance-oriented features of lyrical input texts, whilst retaining adequate accuracy at the level of direct semantic translation.
It has been trained using TRL.
SUGGESTED SYSTEM PROMPT:
You are an award-winning bilingual Russian-American poet, songwriter, and literary translator. You are famous for translating highly idiomatic, lyrical, and culturally specific songs and poems between Russian and English whilst retaining with perfect fidelity (or appropriately localizing) the expressive tone, melodic pattern, subtle lyricism, cultural resonance, and formal characteristics of the source work. Translate the following song from Russian to English, accurately matching and reproducing in English the source Russian semantics and phrasing of each line and of the song as a whole. Take care to preserve the song’s formal and poetic characteristics (such as meter, verbal musicality, expressive attitude, mood, rhyme scheme, and syllable pattern/count). Do not explain. Respond with the translation only.
SUGGESTED PROMPT PRE-PHRASE:
"Translate the following song to English, while accurately retaining the meter, syllable counts, rhymes, and style. Abide by the musical phrasing and the syllable pattern from the source. Translate: {insert song lyrics or poem verses}"
Training procedure
This model was trained with ORPO, a method introduced in ORPO: Monolithic Preference Optimization without Reference Model.
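For reference, a minimal ORPO fine-tuning sketch with TRL is shown below. It is not the exact recipe used for this checkpoint: the dataset file name, column layout, and hyperparameters are assumptions for illustration only.

```python
# Minimal ORPO fine-tuning sketch with TRL (illustrative; dataset path and
# hyperparameters are assumptions, not the actual training configuration).
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import ORPOConfig, ORPOTrainer

BASE_MODEL = "MiniLLM/MiniPLM-llama3.1-212M"

model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)

# ORPO expects a preference dataset with "prompt", "chosen", and "rejected" columns.
dataset = load_dataset(
    "json", data_files="rus2eng_lyrics_preferences.json", split="train"
)

config = ORPOConfig(
    output_dir="lyrical-mt-mini-orpo",
    beta=0.1,                      # weight of the odds-ratio preference term
    per_device_train_batch_size=4,
    num_train_epochs=3,
    learning_rate=5e-6,
)

trainer = ORPOTrainer(
    model=model,
    args=config,
    train_dataset=dataset,
    processing_class=tokenizer,
)
trainer.train()
```

Because ORPO folds the preference objective into the supervised loss, no separate reference model is needed, which keeps memory requirements modest even for full-parameter fine-tuning of a 212M model.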
Framework versions
- TRL: 0.22.1
- Transformers: 4.56.0
- PyTorch: 2.8.0
- Datasets: 3.1.0
- Tokenizers: 0.22.0
Citations
Cite ORPO as:
@article{hong2024orpo,
title = {{ORPO: Monolithic Preference Optimization without Reference Model}},
author = {Jiwoo Hong and Noah Lee and James Thorne},
year = 2024,
eprint = {arXiv:2403.07691}
}
Cite TRL as:
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
Model repository: AlekseyCalvin/Lyrical_MT_mini_rus2eng_MiniPLM_llama3.1_212m
Base model: MiniLLM/MiniPLM-llama3.1-212M