Update README.md

The model is a decoder-only transformer architecture with the following modifications:

* **ReLU Activation Function**: ReLU ([Glorot et al., 2011](https://proceedings.mlr.press/v15/glorot11a/glorot11a.pdf)) activation functions are adopted in the feed-forward networks (a minimal sketch follows this list).
* **Tokenizer**: We use the SmolLM ([Allal et al., 2024](https://huggingface.co/blog/smollm)) tokenizer with a vocabulary size of 49,152 (see the loading example below).
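
For reference, here is a minimal PyTorch sketch of a feed-forward block with ReLU. The module name and the `hidden_size`/`intermediate_size` values are illustrative assumptions, not the released PhoneLM configuration.

```
import torch
import torch.nn as nn

class DecoderFFN(nn.Module):
    """Position-wise feed-forward block of a decoder layer, using ReLU.

    The dimensions below are placeholders for illustration; the released
    checkpoints define their own sizes.
    """

    def __init__(self, hidden_size: int = 2048, intermediate_size: int = 8192):
        super().__init__()
        self.up_proj = nn.Linear(hidden_size, intermediate_size)
        self.act = nn.ReLU()  # ReLU (Glorot et al., 2011) in place of GELU/SiLU
        self.down_proj = nn.Linear(intermediate_size, hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down_proj(self.act(self.up_proj(x)))
```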
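
The tokenizer claim can be sanity-checked by loading it through `transformers`, assuming the tokenizer files ship with the `mllmTeam/PhoneLM-1.5B-Instruct` repository linked in the License section below:

```
from transformers import AutoTokenizer

# Assumes the SmolLM-derived tokenizer is published with the checkpoint.
tokenizer = AutoTokenizer.from_pretrained("mllmTeam/PhoneLM-1.5B-Instruct")

print(tokenizer.vocab_size)                   # expected: 49152
print(tokenizer("Hello, PhoneLM!").input_ids)
```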

## License

* This repository is released under the [Apache-2.0](https://huggingface.co/mllmTeam/PhoneLM-1.5B-Instruct/blob/main/LICENSE) License.

## Citation

```
@misc{yi2024phonelmanefficientcapablesmall,
      title={PhoneLM:an Efficient and Capable Small Language Model Family through Principled Pre-training},
      author={Rongjie Yi and Xiang Li and Weikai Xie and Zhenyan Lu and Chenghua Wang and Ao Zhou and Shangguang Wang and Xiwen Zhang and Mengwei Xu},
      year={2024},
      eprint={2411.05046},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2411.05046},
}
```