idea-teacher committed
Commit 8c73be9
Parent(s): 2df3b0b
Update README.md

README.md CHANGED
@@ -18,9 +18,9 @@ inference: false
 
 ## 简介 Brief Introduction
 
-110M
-The TCBert with 110M parameters is pre-trained for
+110M参数的句子表征Topic Classification BERT (TCBert)。
+The TCBert with 110M parameters is pre-trained for sentence representation for Chinese topic classification tasks.
 
 ## 模型分类 Model Taxonomy
 
@@ -33,7 +33,7 @@ The TCBert with 110M parameters is pre-trained for, not limited to, Chinese topi
 
 为了提高模型在话题分类上句子表征效果,我们收集了大量话题分类数据进行基于prompts的对比学习预训练。
 
-To improve the model performance on the topic classification task, we collected numerous topic classification datasets for contrastive pre-training based on general prompts.
+To improve the model performance on sentence representation for the topic classification task, we collected numerous topic classification datasets for contrastive pre-training based on general prompts.
 ### 下游效果 Performance
 
 Stay tuned.
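The contrastive pre-training objective mentioned in the README can be sketched as follows. This is a minimal, self-contained illustration of an InfoNCE-style loss over sentence embeddings; the function name, temperature value, and toy data are hypothetical and not taken from the TCBert implementation:

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.05):
    """InfoNCE-style contrastive loss over L2-normalized embeddings.

    anchors, positives: (batch, dim) arrays. Row i of `positives` is the
    positive example for row i of `anchors`; every other row in the batch
    serves as an in-batch negative.
    """
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # cross-entropy with the diagonal (the matched pairs) as the target class
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
noisy = emb + 0.01 * rng.normal(size=(4, 8))   # positives: slight perturbations
loss_matched = info_nce_loss(emb, noisy)
loss_random = info_nce_loss(emb, rng.normal(size=(4, 8)))
```

Training with such a loss pulls prompt-augmented sentence pairs together in embedding space while pushing apart unrelated sentences in the batch, which is what makes the resulting representations useful for topic classification.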