Instructions to use miugod/bert-base-multilingual-cased-iwslt14deen with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use miugod/bert-base-multilingual-cased-iwslt14deen with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="miugod/bert-base-multilingual-cased-iwslt14deen")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("miugod/bert-base-multilingual-cased-iwslt14deen")
model = AutoModelForMaskedLM.from_pretrained("miugod/bert-base-multilingual-cased-iwslt14deen")
```

- Notebooks
- Google Colab
- Kaggle
This model is bert-base-multilingual-cased fine-tuned with the masked-language-modeling (MLM) objective on the IWSLT14 German-English dataset. The training data was stitched into four copies of the corpus: src, tgt, src [SEP] tgt, and tgt [SEP] src. Training parameters: bsz=6 with update_freq=2 (effective batch size 12), fp16, on a single RTX 3080 Ti, for 100,000 steps; the training loss decreased from 2.1869 to 1.2034.
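The four-copy stitching described above can be sketched as follows. This is a minimal illustration only; the function name and the exact line format (a single space around `[SEP]`) are assumptions, not taken from the original training script:

```python
def stitch(src_lines, tgt_lines, sep="[SEP]"):
    """Build the four copies described in the card:
    src, tgt, "src [SEP] tgt", and "tgt [SEP] src"."""
    out = []
    out += src_lines                                        # copy 1: source side alone
    out += tgt_lines                                        # copy 2: target side alone
    out += [f"{s} {sep} {t}" for s, t in zip(src_lines, tgt_lines)]  # copy 3: src [SEP] tgt
    out += [f"{t} {sep} {s}" for s, t in zip(src_lines, tgt_lines)]  # copy 4: tgt [SEP] src
    return out

# Example: one parallel sentence pair expands to four training lines.
lines = stitch(["guten Tag"], ["good day"])
```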