---
language: 
  - zh

license: apache-2.0

tags:
  - classification

inference: false

---

# IDEA-CCNL/Erlangshen-TCBert-110M-Sentence-Embedding-Chinese

- Github: [Fengshenbang-LM](https://github.com/IDEA-CCNL/Fengshenbang-LM)
- Docs: [Fengshenbang-Docs](https://fengshenbang-doc.readthedocs.io/)

## Brief Introduction

The Topic Classification BERT (TCBert) with 110M parameters is pre-trained for, but not limited to, Chinese topic classification tasks.

## Model Taxonomy

| Demand | Task | Series | Model | Parameter | Extra |
| :----: | :----: | :----: | :----: | :----: | :----: |
| General | Sentence Embedding | Erlangshen | TCBert | 110M | Chinese |

## Model Information

To improve the quality of the model's sentence representations for topic classification, we collected numerous topic classification datasets and pre-trained the model with contrastive learning based on general prompts.
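
This card does not spell out the pre-training recipe, so the following is only a minimal sketch of prompt-based contrastive pre-training. The prompt template, the `build_prompted_input` helper, and the in-batch InfoNCE objective are illustrative assumptions, not the released training code:

```python
import torch
import torch.nn.functional as F

# Hypothetical prompt template: wrap each sentence in a topic-oriented prompt
# so that pre-training and downstream topic tasks share one input format.
def build_prompted_input(sentence: str) -> str:
    return f"这是一条关于[MASK][MASK]的新闻:{sentence}"

def infonce_loss(anchor: torch.Tensor, positive: torch.Tensor,
                 temperature: float = 0.05) -> torch.Tensor:
    """In-batch InfoNCE: each anchor embedding should be closest to its own positive."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    logits = anchor @ positive.T / temperature  # (batch, batch) cosine similarities
    labels = torch.arange(anchor.size(0))       # diagonal entries are the positive pairs
    return F.cross_entropy(logits, labels)
```

Under this objective, two prompted sentences from the same topic would form a positive pair, while the other sentences in the batch serve as negatives.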
### Performance

Stay tuned.

## Usage

```python
from transformers import BertForMaskedLM, BertTokenizer

# Load the tokenizer and the checkpoint this card describes.
tokenizer = BertTokenizer.from_pretrained("IDEA-CCNL/Erlangshen-TCBert-110M-Sentence-Embedding-Chinese")
model = BertForMaskedLM.from_pretrained("IDEA-CCNL/Erlangshen-TCBert-110M-Sentence-Embedding-Chinese")
```
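
How to pool a sentence embedding from this model is not yet documented here. Continuing from the snippet above, a minimal sketch assuming mean pooling over the final hidden states (the pooling strategy is our assumption, not a documented choice):

```python
import torch

text = "今天的篮球比赛太精彩了"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

last_hidden = outputs.hidden_states[-1]        # (1, seq_len, hidden_size)
mask = inputs["attention_mask"].unsqueeze(-1)  # zero out padding positions
embedding = (last_hidden * mask).sum(1) / mask.sum(1)  # (1, hidden_size) sentence vector
```

Two embeddings produced this way can be compared with `torch.nn.functional.cosine_similarity` for retrieval, clustering, or topic matching.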
Stay tuned for more details.

## Citation

If you use our model in your work, you can cite our [website](https://github.com/IDEA-CCNL/Fengshenbang-LM/):

```text
@misc{Fengshenbang-LM,
  title={Fengshenbang-LM},
  author={IDEA-CCNL},
  year={2021},
  howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}
```