---
license: apache-2.0
language:
- zh
---
# Chinese MentalBERT: a pre-trained language model specifically designed for mental health tasks

In this study, we employ domain-adaptive pretraining and introduce a novel lexicon-guided masking strategy based on a Chinese depression lexicon.

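For illustration only (this is not the authors' exact implementation), the sketch below shows one way lexicon-guided masking can be realized: tokens covered by the depression lexicon are masked first, and the budget is filled with random tokens. The three lexicon entries, the example sentence, and the 15% masking budget are assumptions made for the example.

```python
import random

from transformers import BertTokenizer

# Hypothetical lexicon entries; the real Chinese depression lexicon is much larger
depression_lexicon = {"抑郁", "失眠", "焦虑"}

tokenizer = BertTokenizer.from_pretrained('zwzzz/Chinese-MentalBERT')


def lexicon_guided_mask(text, mask_prob=0.15):
    """Mask lexicon-related tokens first, then random tokens up to the budget."""
    tokens = tokenizer.tokenize(text)
    budget = max(1, int(len(tokens) * mask_prob))

    # Chinese BERT tokenizes mostly to single characters, so match at character level
    lexicon_chars = set("".join(depression_lexicon))
    lexicon_idx = [i for i, t in enumerate(tokens) if t in lexicon_chars]
    other_idx = [i for i in range(len(tokens)) if i not in lexicon_idx]

    random.shuffle(lexicon_idx)
    random.shuffle(other_idx)
    chosen = set((lexicon_idx + other_idx)[:budget])

    return [tokenizer.mask_token if i in chosen else t for i, t in enumerate(tokens)]


print(lexicon_guided_mask("最近总是失眠，情绪很低落。"))
```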
## How to use

```python
from transformers import BertTokenizer, BertForMaskedLM

# Load the tokenizer and masked language model from the Hugging Face Hub
tokenizer = BertTokenizer.from_pretrained('zwzzz/Chinese-MentalBERT')
model = BertForMaskedLM.from_pretrained('zwzzz/Chinese-MentalBERT')
```
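As a quick sanity check that the checkpoint loads correctly, the snippet below runs masked-token prediction with the `fill-mask` pipeline; the example sentence is illustrative and not taken from the paper.

```python
from transformers import pipeline

# Build a fill-mask pipeline on top of the pretrained checkpoint
fill_mask = pipeline('fill-mask', model='zwzzz/Chinese-MentalBERT')

# Predict candidates for the masked token, printed with their scores
for prediction in fill_mask("我今天感到非常[MASK]。"):
    print(prediction['token_str'], prediction['score'])
```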

## Citation

If you find the technical report or resources useful, please cite the following paper.

Article address: [https://arxiv.org/pdf/2402.09151.pdf](https://arxiv.org/pdf/2402.09151.pdf)
```bibtex
@misc{zhai2024chinese,
      title={Chinese MentalBERT: Domain-Adaptive Pre-training on Social Media for Chinese Mental Health Text Analysis},
      author={Wei Zhai and Hongzhi Qi and Qing Zhao and Jianqiang Li and Ziqi Wang and Han Wang and Bing Xiang Yang and Guanghui Fu},
      year={2024},
      eprint={2402.09151},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```