
Chinese MentalBERT is a pre-trained language model specifically designed for mental health tasks.

In this study, we employ domain-adaptive pretraining and introduce a novel lexicon-guided masking strategy based on a Chinese depression lexicon.
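For intuition, a minimal sketch of what one lexicon-guided masking step could look like is given below; the lexicon entries, masking ratio, and helper function are illustrative assumptions, not the paper's exact implementation.

import random

def lexicon_guided_mask(tokens, lexicon, mask_token="[MASK]", mask_prob=0.15):
    # Prefer masking tokens found in the depression lexicon, then fill the
    # remaining masking budget with randomly chosen positions (illustrative sketch).
    n_to_mask = max(1, int(len(tokens) * mask_prob))
    lexicon_positions = [i for i, t in enumerate(tokens) if t in lexicon]
    other_positions = [i for i, t in enumerate(tokens) if t not in lexicon]
    random.shuffle(lexicon_positions)
    random.shuffle(other_positions)
    chosen = set((lexicon_positions + other_positions)[:n_to_mask])
    return [mask_token if i in chosen else t for i, t in enumerate(tokens)]

# Example with a toy, hypothetical lexicon (Chinese BERT tokenizes per character):
print(lexicon_guided_mask(list("我最近感到非常绝望"), lexicon={"绝", "望", "难", "过"}))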

How to use

from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained('zwzzz/Chinese-MentalBERT')
model = BertForMaskedLM.from_pretrained('zwzzz/Chinese-MentalBERT')
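Once loaded, the model can be used for masked-token prediction. A minimal sketch follows; the example sentence is illustrative (not from the model card) and PyTorch is assumed to be installed.

import torch

text = "我感到非常[MASK]。"  # illustrative input sentence
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# Take the top-5 predicted tokens for the [MASK] position
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_pos].topk(5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))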

Citation

If you find the technical report or resources useful, please cite the following paper.

Article address: https://arxiv.org/pdf/2402.09151.pdf

@misc{zhai2024chinese,
      title={Chinese MentalBERT: Domain-Adaptive Pre-training on Social Media for Chinese Mental Health Text Analysis}, 
      author={Wei Zhai and Hongzhi Qi and Qing Zhao and Jianqiang Li and Ziqi Wang and Han Wang and Bing Xiang Yang and Guanghui Fu},
      year={2024},
      eprint={2402.09151},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}