---
language: en
---
# RoCBert
## Introduction
RoCBert is a pretrained Chinese language model, proposed by WeChatAI in 2022, that is robust to various forms of adversarial attacks.
For more details, see the paper: https://aclanthology.org/2022.acl-long.65.pdf
## How to use
```python
from transformers import AutoTokenizer, AutoModel

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("weiweishi/roc-bert-base-zh")
model = AutoModel.from_pretrained("weiweishi/roc-bert-base-zh")
```
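Once loaded, the model can be used to extract contextual representations of Chinese text. The snippet below is a minimal sketch, assuming PyTorch is installed; the example sentence is illustrative:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("weiweishi/roc-bert-base-zh")
model = AutoModel.from_pretrained("weiweishi/roc-bert-base-zh")

# Tokenize an example sentence; the RoCBert tokenizer also produces
# shape and pronunciation ids alongside the usual input ids
inputs = tokenizer("这是一个测试句子", return_tensors="pt")

# Forward pass without gradient tracking (inference only)
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```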
## Citation
```bibtex
@inproceedings{su2022rocbert,
  title={RoCBert: Robust Chinese Bert with Multimodal Contrastive Pretraining},
  author={Su, Hui and Shi, Weiwei and Shen, Xiaoyu and Xiao, Zhou and Ji, Tuo and Fang, Jiarui and Zhou, Jie},
  booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
  pages={921--931},
  year={2022}
}
```