Fill-Mask · Transformers · PyTorch · Chinese · bert · Inference Endpoints
wangyulong committed
Commit a685cb1
1 Parent(s): 0c5248a

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -22,10 +22,10 @@ model = BertModel.from_pretrained("Langboat/mengzi-bert-base")
 ```
 
 ## Scores on nine chinese tasks (without any data augmentation)
-|Model|AFQMC|TNEWS|IFLYTEK|CMNLI|WSC|CSL|CMRC|C3|CHID|
+| Model | AFQMC | TNEWS | IFLYTEK | CMNLI | WSC | CSL | CMRC2018 | C3 | CHID |
 |-|-|-|-|-|-|-|-|-|-|
-|RoBERTa-wwm-ext|74.04|56.94|60.31|80.51|67.80|81.00|75.20|66.50|83.62|
-|Mengzi-BERT-base|74.58|57.97|60.68|82.12|87.50|85.40|78.54|71.70|84.16|
+|RoBERTa-wwm-ext| 74.30 | 57.51 | 60.80 | 80.70 | 67.20 | 80.67 | 77.59 | 67.06 | 83.78 |
+|Mengzi-BERT-base| 74.58 | 57.97 | 60.68 | 82.12 | 87.50 | 85.40 | 78.54 | 71.70 | 84.16 |
 
 RoBERTa-wwm-ext scores are from CLUE baseline