Update README.md
README.md (CHANGED)
@@ -33,14 +33,14 @@ We evaluated the xTrimoPGLM (xTMLM or xTCLM) and xTrimoPGLM(100B) models on two
 You can choose to manually download the necessary weights.
 
 | Model | Download |
-| xTrimoPGLM-1B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-mlm)
-| xTrimoPGLM-3B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-mlm)
-| xTrimoPGLM-10B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-10b-mlm)
-| xTrimoPGLM-1B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-clm)
-| xTrimoPGLM-3B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-clm)
-| xTrimoPGLM-7B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-7b-clm)
-| xTrimoPGLM-100B-Int4 (MLM or CLM) | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-100b-int4)
+|------------------|-----------------------------------------------------------------------------------------------------------|
+| xTrimoPGLM-1B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-mlm) |
+| xTrimoPGLM-3B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-mlm) |
+| xTrimoPGLM-10B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-10b-mlm) |
+| xTrimoPGLM-1B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-clm) |
+| xTrimoPGLM-3B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-clm) |
+| xTrimoPGLM-7B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-7b-clm) |
+| xTrimoPGLM-100B-Int4 (MLM or CLM) | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-100b-int4) |
 
 ## How to use
 ### xTrimoPGLM-MLM: Masked Language Models for Protein Understanding tasks
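The checkpoints in the table above follow one repo-id naming pattern, so the ids can be built programmatically. A minimal convenience sketch (the helper name and size lists below are ours, not part of the repository; `snapshot_download` is the standard `huggingface_hub` helper for fetching a full checkpoint):

```python
# Sketch: build the Hugging Face repo ids listed in the download table.
MLM_SIZES = ["1b", "3b", "10b"]
CLM_SIZES = ["1b", "3b", "7b"]

def xtrimopglm_repo_ids():
    """Return all repo ids from the table, in table order."""
    ids = [f"biomap-research/xtrimopglm-{s}-mlm" for s in MLM_SIZES]
    ids += [f"biomap-research/xtrimopglm-{s}-clm" for s in CLM_SIZES]
    ids.append("biomap-research/xtrimopglm-100b-int4")
    return ids

if __name__ == "__main__":
    # To actually fetch a checkpoint (large download!):
    # from huggingface_hub import snapshot_download
    # snapshot_download(repo_id=xtrimopglm_repo_ids()[0])
    print(xtrimopglm_repo_ids())
```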
@@ -78,9 +78,6 @@ model = AutoModelForTokenClassification.from_config(config, trust_remote_code=Tr
 
 ```
 
-
-Refer to the *finetune* folder for more fine-tuning examples, such as LoRA and Linear Probing.
-
 ### xTrimoPGLM-CLM: Causal Language Models for Protein Design
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer, AutoConfig
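The MLM checkpoints predict masked residues, so inputs need mask tokens spliced into the sequence before tokenization. A minimal sketch, assuming a `[MASK]` token string (the helper is ours; verify the real string via the checkpoint tokenizer's `mask_token` attribute before use):

```python
# Sketch: replace selected residues with a mask token to build MLM input.
# "[MASK]" is an assumed default; check tokenizer.mask_token for the
# actual value used by the xTrimoPGLM tokenizer.
def mask_residues(seq: str, positions, mask_token: str = "[MASK]") -> str:
    """Replace residues at the given 0-based positions with mask_token."""
    targets = set(positions)
    return "".join(mask_token if i in targets else aa
                   for i, aa in enumerate(seq))
```

For example, `mask_residues("MKTAYIAK", [2, 5])` yields `"MK[MASK]AY[MASK]AK"`, which can then be passed to the tokenizer like any other string.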
@@ -106,9 +103,8 @@ for idx, each in enumerate(prompt):
     output = model.chat(tokenizer, each)
     print(f"\nEnd generation with length: {len(output.split())} - seqs: {output}\n")
 ```
-For more inference scripts of other models, please visit the model card of the huggingface page.
-
 
+For more inference or fine-tuning code, datasets, and requirements, please visit our [GitHub page](https://github.com/biomap-research/xTrimoPGLM).
 
 ## LICENSE
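Generated sequences come back as plain strings, so simple post-hoc checks can mirror the length reporting in the loop above. A sketch (the helper names and alphabet constant are ours; the 20-letter canonical amino-acid alphabet is standard biochemistry, not anything model-specific):

```python
# Sketch: sanity-check a generated protein sequence string.
CANONICAL_AA = set("ACDEFGHIKLMNPQRSTVWY")  # the 20 canonical amino acids

def is_canonical(seq: str) -> bool:
    """True if non-empty and every residue is a canonical amino acid."""
    return bool(seq) and set(seq) <= CANONICAL_AA

def describe(seq: str) -> str:
    """One-line summary in the spirit of the print statement above."""
    status = "ok" if is_canonical(seq) else "non-canonical residues present"
    return f"length={len(seq)} ({status})"
```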
33 |
You can choose to manually download the necessary weights.
|
34 |
|
35 |
| Model |Download |
|
36 |
+
|------------------|-----------------------------------------------------------------------------------------------------------|
|
37 |
+
| xTrimoPGLM-1B-MLM | [π€ Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-mlm) |
|
38 |
+
| xTrimoPGLM-3B-MLM | [π€ Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-mlm) |
|
39 |
+
| xTrimoPGLM-10B-MLM | [π€ Huggingface](https://huggingface.co/biomap-research/xtrimopglm-10b-mlm) |
|
40 |
+
| xTrimoPGLM-1B-CLM | [π€ Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-clm) |
|
41 |
+
| xTrimoPGLM-3B-CLM | [π€ Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-clm) |
|
42 |
+
| xTrimoPGLM-7B-CLM | [π€ Huggingface](https://huggingface.co/biomap-research/xtrimopglm-7b-clm) |
|
43 |
+
| xTrimoPGLM-100B-Int4 (MLM or CLM) | [π€ Huggingface](https://huggingface.co/biomap-research/xtrimopglm-100b-int4)| | |
|
44 |
|
45 |
## How to use
|
46 |
### xTrimoPGLM-MLM: Masked Langeuage Models for Protein Understanding tasks
|
|
|
78 |
|
79 |
```
|
80 |
|
|
|
|
|
|
|
81 |
### xTrimoPGLM-CLM: Casusal Langeuage Models for Protein Design
|
82 |
```python
|
83 |
from transformers import AutoModelForCausalLM, AutoTokenizer, AutoConfig
|
|
|
103 |
output = model.chat(tokenizer, each)
|
104 |
print(f"\nEnd generation with length: {len(output.split())} - seqs: {output}\n")
|
105 |
```
|
|
|
|
|
106 |
|
107 |
+
For more inference or fine-tuning code, datasets, and requirements, please visit our [GitHub page](https://github.com/biomap-research/xTrimoPGLM).
|
108 |
|
109 |
## LICENSE
|
110 |
|