Instructions for using Helsinki-NLP/opus-mt-de-fi with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Helsinki-NLP/opus-mt-de-fi with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
# pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-de-fi")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-de-fi")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-de-fi")
```
- Notebooks
- Google Colab
- Kaggle
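With the tokenizer and model loaded as shown above, running an actual German-to-Finnish translation is a short step further. The sketch below is an assumption about typical usage (the example sentence and `max_new_tokens` value are arbitrary choices, not from the model card):

```python
# Translate a German sentence to Finnish with the MarianMT model.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-de-fi")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-de-fi")

# Tokenize the source text and generate the translation token ids.
inputs = tokenizer("Guten Morgen!", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=50)

# Decode the generated ids back into text, dropping special tokens.
translation = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(translation)
```

Batch translation works the same way: pass a list of strings to the tokenizer with `padding=True`.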
Update config.json
config.json +1 -0

```diff
@@ -15,6 +15,7 @@
     "decoder_ffn_dim": 2048,
     "decoder_layerdrop": 0.0,
     "decoder_layers": 6,
+    "decoder_start_token_id": 59970,
     "dropout": 0.1,
     "encoder_attention_heads": 8,
     "encoder_ffn_dim": 2048,
```
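The committed value can be checked without downloading any weights, since `AutoConfig` fetches only config.json. A minimal sketch (the expected value 59970 is taken from the diff above; for Marian models this id typically coincides with the pad token):

```python
from transformers import AutoConfig

# Fetch only the model's config.json from the Hub.
config = AutoConfig.from_pretrained("Helsinki-NLP/opus-mt-de-fi")

# The token id that generation starts the decoder with.
print(config.decoder_start_token_id)
```

During `generate()`, this id seeds the decoder's first input; without it set in the config, generation for a seq2seq model would fail or start from the wrong token.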