zolicsaki committed
Commit
c734ac9
1 parent: 67a11a1

Update README.md

Files changed (1)
  1. README.md +8 -8
README.md CHANGED
@@ -54,7 +54,7 @@ All pre-training is done on the [Cultura-X](https://huggingface.co/datasets/uonl
 We extended the vocabulary of the base llama model from 32,000 tokens to 57,000 tokens by adding up to 25,000 non-overlapping tokens from the new language.
 
 ## Evaluation
-
+For evaluation results see our paper: [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
 
 ## Uses
 <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
@@ -95,12 +95,12 @@ We would like to give a special thanks to the following groups:
 
 ## Cite SambaLingo
 ```
-@software{sambalingo,
-title = {{SambaLingo: Open Source Language Experts}},
-author = {SambaNova Systems},
-url = {https://huggingface.co/sambanovasystems/SambaLingo-Hungarian-Base-70B}
-month = {2},
-year = {2024},
-version = {1.0},
+@misc{csaki2024sambalingo,
+title={SambaLingo: Teaching Large Language Models New Languages},
+author={Zoltan Csaki and Bo Li and Jonathan Li and Qiantong Xu and Pian Pawakapan and Leon Zhang and Yun Du and Hengyu Zhao and Changran Hu and Urmish Thakker},
+year={2024},
+eprint={2404.05829},
+archivePrefix={arXiv},
+primaryClass={cs.CL}
 }
 ```
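The vocabulary-extension step the unchanged context line describes (32,000 → 57,000 tokens, adding up to 25,000 non-overlapping tokens) can be sketched with the Hugging Face transformers API. This is a minimal illustration, not the SambaLingo training code: the base checkpoint name and the sample token list below are placeholders, and in practice the new tokens would come from a tokenizer trained on the target-language corpus.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

BASE = "meta-llama/Llama-2-7b-hf"  # placeholder base checkpoint, not the actual one used

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(BASE)

# Placeholder Hungarian tokens; the real list would be produced by a
# tokenizer trained on new-language text, keeping only tokens that do
# not already exist in the base vocabulary.
new_tokens = ["szó", "nyelv", "példa"]

# add_tokens() skips tokens already present in the vocabulary, so only
# non-overlapping tokens count toward the extended vocabulary size.
num_added = tokenizer.add_tokens(new_tokens)

# Grow the model's embedding matrix (and tied output head) to match
# the enlarged tokenizer.
model.resize_token_embeddings(len(tokenizer))

print(f"Added {num_added} tokens; vocabulary size is now {len(tokenizer)}")
```

The newly added rows of the embedding matrix start untrained, which is why continued pre-training on the new language (here, on Cultura-X data) is needed before the extended vocabulary is useful.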