Update README.md
README.md
CHANGED
@@ -114,19 +114,20 @@ The code in this repository is open source under the [Apache-2.0 license](./LICE
 
 If you find our work useful, please consider citing the following paper:
 ```
-@
+@misc{chen2024xtrimopglm,
 title={xTrimoPGLM: unified 100B-scale pre-trained transformer for deciphering the language of protein},
 author={Chen, Bo and Cheng, Xingyi and Li, Pan and Geng, Yangli-ao and Gong, Jing and Li, Shen and Bei, Zhilei and Tan, Xu and Wang, Boyan and Zeng, Xin and others},
-
-
+year={2024},
+eprint={2401.06199},
+archivePrefix={arXiv},
+primaryClass={cs.CL},
+note={arXiv preprint arXiv:2401.06199}
 }
 
-@
+@misc{cheng2024training,
 title={Training Compute-Optimal Protein Language Models},
 author={Cheng, Xingyi and Chen, Bo and Li, Pan and Gong, Jing and Tang, Jie and Song, Le},
-journal={bioRxiv},
-pages={2024--06},
 year={2024},
-
+note={bioRxiv, Cold Spring Harbor Laboratory, pages 2024--06}
 }
 ```