Update README.md
README.md CHANGED
@@ -26,6 +26,7 @@ SambaLingo-Japanese-Chat is a human aligned chat model trained in Japanese and E
 - **Language(s):** Japanese, English
 - **Finetuned from model:** [Llama-2-7b](https://huggingface.co/meta-llama/Llama-2-7b-hf)
 - **Try This Model:** [SambaLingo-chat-space](https://huggingface.co/spaces/sambanovasystems/SambaLingo-chat-space)
+- **Paper:** [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
 - **Blog Post**: [sambalingo-open-source-language-experts](https://sambanova.ai/blog/sambalingo-open-source-language-experts)
 
 ## Getting Started
@@ -83,6 +84,9 @@ The DPO phase was done on the [ultrafeedback](https://huggingface.co/datasets/Hu
 ## Tokenizer Details
 We extended the vocabulary of the base llama model from 32,000 tokens to 57,000 tokens by adding up to 25,000 non-overlapping tokens from the new language.
 
+## Evaluation
+For evaluation results see our paper: [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
+
 ## Uses
 <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
 
@@ -125,12 +129,12 @@ We would like to give a special thanks to the following groups:
 
 ## Cite SambaLingo
 ```
-@
-
-
-
-
-
-
+@misc{csaki2024sambalingo,
+      title={SambaLingo: Teaching Large Language Models New Languages},
+      author={Zoltan Csaki and Bo Li and Jonathan Li and Qiantong Xu and Pian Pawakapan and Leon Zhang and Yun Du and Hengyu Zhao and Changran Hu and Urmish Thakker},
+      year={2024},
+      eprint={2404.05829},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
 }
 ```
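A note on the tokenizer change referenced in the second hunk: the README states the base Llama vocabulary was extended from 32,000 to 57,000 tokens with non-overlapping tokens from the new language. The snippet below is a minimal sketch of how such a vocabulary extension can be done with the Hugging Face `transformers` API; the token file name is a hypothetical placeholder, and this is not necessarily the exact procedure the authors used.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the base model and tokenizer (Llama-2-7b, per the model card).
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# "new_language_tokens.txt" is a placeholder: one candidate token per line,
# e.g. produced by training a tokenizer on new-language text.
with open("new_language_tokens.txt", encoding="utf-8") as f:
    new_tokens = [line.strip() for line in f if line.strip()]

# add_tokens skips tokens already present in the vocabulary, so only
# non-overlapping tokens are actually added.
num_added = tokenizer.add_tokens(new_tokens)
print(f"Added {num_added} tokens; vocabulary size is now {len(tokenizer)}")

# Resize the embedding matrix to match the enlarged vocabulary before any
# continued pretraining on the new language.
model.resize_token_embeddings(len(tokenizer))
```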