---
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
- messages:
- role: user
content: "Hamsters don't eat cats."
base_model: beomi/gemma-ko-2b
#datasets:
pipeline_tag: text-generation
---
# Gemma 2B Translation v0.125
- Eval Loss: `0.80386`
- Train Loss: `0.75039`
- lr: `6e-05`
- optimizer: adamw
- lr_scheduler_type: cosine
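
The hyperparameters above can be expressed as a `transformers` training configuration. The following is a minimal sketch only: the learning rate, optimizer, and scheduler come from the list above, while the output path, batch size, epoch count, and precision flag are assumptions not stated in this card.

```python
# Hypothetical training configuration sketch; only lr, optimizer, and
# scheduler are taken from the card, the rest are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gemma-2b-translation-v0.125",  # hypothetical output path
    learning_rate=6e-5,
    optim="adamw_torch",
    lr_scheduler_type="cosine",
    per_device_train_batch_size=4,  # assumption, not stated in the card
    num_train_epochs=1,             # assumption, not stated in the card
    bf16=True,                      # assumption, not stated in the card
)
```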
## Prompt Template
```
<bos>##English##
Hamsters don't eat cats.
##Korean##
햄스터는 고양이를 먹지 않습니다.<eos>
```
```
<bos>##Korean##
햄스터는 고양이를 먹지 않습니다.
##English##
Hamsters do not eat cats.<eos>
```
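
A minimal inference sketch using the prompt template above. The repository id and generation parameters are assumptions for illustration, not taken from this card.

```python
# Minimal inference sketch, assuming a standard transformers text-generation
# setup; the model id and generation settings below are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lemon-mint/gemma-2b-translation-v0.125"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Build the English -> Korean prompt exactly as in the template above.
prompt = "<bos>##English##\nHamsters don't eat cats.\n##Korean##\n"
inputs = tokenizer(prompt, add_special_tokens=False, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
# Decode only the newly generated tokens (the Korean translation).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```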
## Model Description
- **Developed by:** `lemon-mint`
- **Model type:** Gemma
- **Language(s) (NLP):** Korean, English
- **License:** [gemma-terms-of-use](https://ai.google.dev/gemma/terms)
- **Finetuned from model:** [beomi/gemma-ko-2b](https://huggingface.co/beomi/gemma-ko-2b)