Bo1015 committed
Commit d71bb28 • 1 Parent(s): 83d5354

Update README.md

Files changed (1)
  1. README.md +9 -13
README.md CHANGED
@@ -33,14 +33,14 @@ We evaluated the xTrimoPGLM (xTMLM or xTCLM) and xTrimoPGLM(100B) models on two
 You can choose to manually download the necessary weights.
 
 | Model |Download |
-|------------------|-----------------------------------------------------------------------------------------------------------------------------------------|
-| xTrimoPGLM-1B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-mlm) [🔨 SwissArmyTransformer]() |
-| xTrimoPGLM-3B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-mlm) [🔨 SwissArmyTransformer]() |
-| xTrimoPGLM-10B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-10b-mlm) [🔨 SwissArmyTransformer]() |
-| xTrimoPGLM-1B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-clm) [🔨 SwissArmyTransformer]() |
-| xTrimoPGLM-3B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-clm) [🔨 SwissArmyTransformer]() |
-| xTrimoPGLM-7B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-7b-clm) [🔨 SwissArmyTransformer]() |
-| xTrimoPGLM-100B-Int4 (MLM or CLM) | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-100b-int4) [🔨 SwissArmyTransformer]() | | |
+|------------------|-----------------------------------------------------------------------------------------------------------|
+| xTrimoPGLM-1B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-mlm) |
+| xTrimoPGLM-3B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-mlm) |
+| xTrimoPGLM-10B-MLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-10b-mlm) |
+| xTrimoPGLM-1B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-1b-clm) |
+| xTrimoPGLM-3B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-3b-clm) |
+| xTrimoPGLM-7B-CLM | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-7b-clm) |
+| xTrimoPGLM-100B-Int4 (MLM or CLM) | [🤗 Huggingface](https://huggingface.co/biomap-research/xtrimopglm-100b-int4) | | |
 
 ## How to use
 ### xTrimoPGLM-MLM: Masked Langeuage Models for Protein Understanding tasks
@@ -78,9 +78,6 @@ model = AutoModelForTokenClassification.from_config(config, trust_remote_code=Tr
 
 ```
 
-
-Refer the *finetune* folder to check more finetuning examples, such as LoRA and Linear Probing.
-
 ### xTrimoPGLM-CLM: Casusal Langeuage Models for Protein Design
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer, AutoConfig
@@ -106,9 +103,8 @@ for idx, each in enumerate(prompt):
 output = model.chat(tokenizer, each)
 print(f"\nEnd generation with length: {len(output.split())} - seqs: {output}\n")
 ```
-For more inference scrpts of other models, please visit the model card of the huggingface page.
-
 
+For more inference or fine-tuning code, datasets, and requirements, please visit our [GitHub page](https://github.com/biomap-research/xTrimoPGLM).
 
 ## LICENSE
 
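
The second hunk quotes the MLM fine-tuning example only up to `model = AutoModelForTokenClassification.from_config(config, trust_remote_code=Tr`. Below is a minimal sketch of the setup that line implies, assuming one of the MLM checkpoints from the table above; the tokenizer call and the `num_labels` value are illustrative placeholders, not the README's exact code.

```python
# Hedged sketch: builds a per-residue (token) classification head on top of an
# xTrimoPGLM MLM backbone, mirroring the from_config call quoted in the hunk header.
from transformers import AutoConfig, AutoTokenizer, AutoModelForTokenClassification

model_name = "biomap-research/xtrimopglm-1b-mlm"  # any MLM checkpoint from the table above

config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
config.num_labels = 2  # placeholder: number of per-residue classes for your task

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)

# from_config initialises a fresh classification head over the backbone architecture,
# matching the line shown in the diff; pretrained backbone weights are loaded separately
# in the README's full fine-tuning flow.
model = AutoModelForTokenClassification.from_config(config, trust_remote_code=True)
```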
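The third hunk likewise shows only the tail of the CLM generation loop. The following sketch fills in the surrounding code under stated assumptions: the repo id is taken from the table above, the prompts are arbitrary example prefixes, and `chat` is the helper shipped with the model's remote code, exactly as called in the diff.

```python
# Hedged sketch: generates protein sequences with an xTrimoPGLM CLM checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "biomap-research/xtrimopglm-7b-clm"  # any CLM checkpoint from the table above

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)
model = model.eval()

# Placeholder prompts: an empty string for de novo generation plus a few sequence prefixes.
prompt = ["", "MLFVVL", "LDL", "VTQA"]

for idx, each in enumerate(prompt):
    # chat() is provided by the repository's remote code, as in the README snippet.
    output = model.chat(tokenizer, each)
    print(f"\nEnd generation with length: {len(output.split())} - seqs: {output}\n")
```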