dahe827 committed
Commit 9da6d9f (parent: d02c620)

End of training
README.md ADDED

---
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: bert-base-cased-airlines-news-multi-label
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bert-base-cased-airlines-news-multi-label

This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unspecified dataset.
It achieves the following results on the evaluation set (a sketch of a matching metrics hook follows the list):
- Loss: 0.3009
- F1: 0.8533
- Jaccard: 0.4071
- Precisions: 0.8126
- Recalls: 0.8999
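The F1, Jaccard, precision, and recall values above come from the Trainer's `compute_metrics` hook. A minimal sketch of such a hook for multi-label outputs, assuming a per-label sigmoid, a 0.5 decision threshold, and weighted/samples averaging (none of which are stated in this repository):

```python
import numpy as np
from sklearn.metrics import f1_score, jaccard_score, precision_score, recall_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    probs = 1 / (1 + np.exp(-logits))   # independent sigmoid per label
    preds = (probs >= 0.5).astype(int)  # 0.5 cutoff is an assumption
    # Averaging modes are assumptions; the card does not record which were used.
    return {
        "f1": f1_score(labels, preds, average="weighted", zero_division=0),
        "jaccard": jaccard_score(labels, preds, average="samples", zero_division=0),
        "precisions": precision_score(labels, preds, average="weighted", zero_division=0),
        "recalls": recall_score(labels, preds, average="weighted", zero_division=0),
    }
```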

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 8e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 150
- num_epochs: 100
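Expressed as code, these settings correspond roughly to the configuration below. This is a minimal sketch, not the author's script: the output path is hypothetical, the Adam betas/epsilon match the listed values because they are the optimizer defaults, and dataset loading, tokenization, and the `compute_metrics` wiring are left as placeholders.

```python
from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-cased-airlines-news-multi-label",  # hypothetical path
    learning_rate=8e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=150,
    num_train_epochs=100,
    eval_strategy="epoch",  # the results table reports metrics once per epoch
)

model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased",
    num_labels=8,
    problem_type="multi_label_classification",  # matches config.json below
)

# train_dataset, eval_dataset, and compute_metrics are placeholders here:
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset,
#                   compute_metrics=compute_metrics)
# trainer.train()
```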

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Jaccard | Precisions | Recalls |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:----------:|:-------:|
| No log | 1.0 | 76 | 0.5236 | 0.7888 | 0.1283 | 0.8216 | 0.7804 |
| No log | 2.0 | 152 | 0.3180 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| No log | 3.0 | 228 | 0.3117 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| No log | 4.0 | 304 | 0.3106 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| No log | 5.0 | 380 | 0.3110 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| No log | 6.0 | 456 | 0.3095 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 7.0 | 532 | 0.3096 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 8.0 | 608 | 0.3089 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 9.0 | 684 | 0.3094 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 10.0 | 760 | 0.3092 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 11.0 | 836 | 0.3088 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 12.0 | 912 | 0.3082 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3902 | 13.0 | 988 | 0.3086 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 14.0 | 1064 | 0.3089 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 15.0 | 1140 | 0.3088 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 16.0 | 1216 | 0.3081 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 17.0 | 1292 | 0.3076 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 18.0 | 1368 | 0.3079 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3182 | 19.0 | 1444 | 0.3066 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 20.0 | 1520 | 0.3081 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 21.0 | 1596 | 0.3079 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 22.0 | 1672 | 0.3074 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 23.0 | 1748 | 0.3069 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 24.0 | 1824 | 0.3074 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 25.0 | 1900 | 0.3061 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3157 | 26.0 | 1976 | 0.3060 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 27.0 | 2052 | 0.3060 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 28.0 | 2128 | 0.3059 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 29.0 | 2204 | 0.3057 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 30.0 | 2280 | 0.3054 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 31.0 | 2356 | 0.3061 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3139 | 32.0 | 2432 | 0.3062 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 33.0 | 2508 | 0.3055 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 34.0 | 2584 | 0.3054 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 35.0 | 2660 | 0.3051 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 36.0 | 2736 | 0.3054 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 37.0 | 2812 | 0.3047 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 38.0 | 2888 | 0.3042 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.313 | 39.0 | 2964 | 0.3042 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 40.0 | 3040 | 0.3044 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 41.0 | 3116 | 0.3043 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 42.0 | 3192 | 0.3040 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 43.0 | 3268 | 0.3040 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 44.0 | 3344 | 0.3040 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 45.0 | 3420 | 0.3039 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3117 | 46.0 | 3496 | 0.3038 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 47.0 | 3572 | 0.3041 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 48.0 | 3648 | 0.3042 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 49.0 | 3724 | 0.3035 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 50.0 | 3800 | 0.3036 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 51.0 | 3876 | 0.3031 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 52.0 | 3952 | 0.3029 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 53.0 | 4028 | 0.3030 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 54.0 | 4104 | 0.3029 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 55.0 | 4180 | 0.3033 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 56.0 | 4256 | 0.3027 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 57.0 | 4332 | 0.3026 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 58.0 | 4408 | 0.3026 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3101 | 59.0 | 4484 | 0.3023 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 60.0 | 4560 | 0.3029 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 61.0 | 4636 | 0.3024 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 62.0 | 4712 | 0.3022 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 63.0 | 4788 | 0.3024 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 64.0 | 4864 | 0.3025 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.308 | 65.0 | 4940 | 0.3023 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 66.0 | 5016 | 0.3019 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 67.0 | 5092 | 0.3020 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 68.0 | 5168 | 0.3017 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 69.0 | 5244 | 0.3019 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 70.0 | 5320 | 0.3020 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 71.0 | 5396 | 0.3018 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3078 | 72.0 | 5472 | 0.3019 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 73.0 | 5548 | 0.3017 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 74.0 | 5624 | 0.3016 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 75.0 | 5700 | 0.3015 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 76.0 | 5776 | 0.3015 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 77.0 | 5852 | 0.3016 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3081 | 78.0 | 5928 | 0.3014 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 79.0 | 6004 | 0.3014 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 80.0 | 6080 | 0.3014 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 81.0 | 6156 | 0.3013 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 82.0 | 6232 | 0.3013 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 83.0 | 6308 | 0.3012 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 84.0 | 6384 | 0.3014 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3066 | 85.0 | 6460 | 0.3012 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 86.0 | 6536 | 0.3012 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 87.0 | 6612 | 0.3012 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 88.0 | 6688 | 0.3011 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 89.0 | 6764 | 0.3011 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 90.0 | 6840 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 91.0 | 6916 | 0.3011 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3076 | 92.0 | 6992 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 93.0 | 7068 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 94.0 | 7144 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 95.0 | 7220 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 96.0 | 7296 | 0.3009 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 97.0 | 7372 | 0.3010 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.3059 | 98.0 | 7448 | 0.3009 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.306 | 99.0 | 7524 | 0.3009 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |
| 0.306 | 100.0 | 7600 | 0.3009 | 0.8533 | 0.4071 | 0.8126 | 0.8999 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1
config.json ADDED

{
  "_name_or_path": "bert-base-cased",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "capacity expansion",
    "1": "legal action",
    "2": "market expansion",
    "3": "marketing",
    "4": "merger & acquisition and finance investments",
    "5": "outsourcing and alliance",
    "6": "pricing",
    "7": "product introductions and improvements"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "capacity expansion": 0,
    "legal action": 1,
    "market expansion": 2,
    "marketing": 3,
    "merger & acquisition and finance investments": 4,
    "outsourcing and alliance": 5,
    "pricing": 6,
    "product introductions and improvements": 7
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "multi_label_classification",
  "torch_dtype": "float32",
  "transformers_version": "4.41.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 28996
}
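Since `problem_type` is `multi_label_classification` with eight label classes, inference should apply an independent sigmoid per label rather than a softmax. A minimal usage sketch; the repo id is inferred from the commit author and model name, and the sample headline and 0.5 threshold are assumptions:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "dahe827/bert-base-cased-airlines-news-multi-label"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

text = "Airline X announces new routes and a fleet expansion"  # made-up headline
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.sigmoid(logits)[0]  # one probability per label, not a softmax
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p >= 0.5]
print(predicted)
```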
model.safetensors ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:7b6ba168b00b595c78fe487ec3826cb3b7671f7cca9459ceb86fddc838b07c5d
size 434076848
runs/Jun20_13-54-12_ubuntu-System-Product-Name/events.out.tfevents.1718862853.ubuntu-System-Product-Name.1565371.1 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:f09dce6872a378b4c562c2d616105da8f58b487a8d4d53e1bb2451c2843cec35
size 44968
runs/Jun20_13-54-12_ubuntu-System-Product-Name/events.out.tfevents.1718869631.ubuntu-System-Product-Name.1565371.2 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:7f819c381604ff90535c391734f713cb1ef603522b4db55f0ef7be9f0ba4ae2d
size 5910
runs/Jun20_15-48-56_ubuntu-System-Product-Name/events.out.tfevents.1718869737.ubuntu-System-Product-Name.1669351.0 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:605206de9ce7803d2ee8b9dadc8ef37a5c29c21bd62f546c54478ce41798c364
size 7701
runs/Jun20_15-52-09_ubuntu-System-Product-Name/events.out.tfevents.1718869933.ubuntu-System-Product-Name.1669351.1 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:82408b9f6b715d71f3ee3484ac2c3528927ff9919a228a303b1bccf3acd706c9
size 5505
runs/Jun20_22-26-45_ubuntu-System-Product-Name/events.out.tfevents.1718893606.ubuntu-System-Product-Name.1682743.0 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:3cb8112758569e20c0afc3929f5b1a0ed7e63fa048aa13bd9d90fc2d80da6239
size 5505
runs/Jun21_16-55-08_ubuntu-System-Product-Name/events.out.tfevents.1718960109.ubuntu-System-Product-Name.1682743.1 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:e49982ecb787dc87b3109b69a86409cc2d57862726e55527a0a88e333485269f
size 44966
runs/Jun21_18-07-53_ubuntu-System-Product-Name/events.out.tfevents.1718964473.ubuntu-System-Product-Name.1733416.0 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:2da1724cee83ac44cacbf55104709b90f597bd4b9a80ceaa707785abc5c97864
size 45817
runs/Jun21_18-07-53_ubuntu-System-Product-Name/events.out.tfevents.1718966755.ubuntu-System-Product-Name.1733416.1 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:9b808a6beda4e0443400c7b36c94a24a8f6437bd4c901ea6aead818c486683eb
size 456
runs/Jun24_17-43-17_ubuntu-System-Product-Name/events.out.tfevents.1719222209.ubuntu-System-Product-Name.1856559.0 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:b50200ad9d0a6a6825a3cc55b04207be6d242e14c5a893f04c609d61c1e50e7b
size 24271
runs/Jun24_18-18-58_ubuntu-System-Product-Name/events.out.tfevents.1719224350.ubuntu-System-Product-Name.1866513.0 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:976b1f8d9f57b311487d572dc70607d1d4099a61c1428c571591dc6da79a4461
size 56315
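The `runs/` files above are TensorBoard event logs written during training. They can be viewed with `tensorboard --logdir runs` in a local clone of the repo, or read programmatically; a minimal sketch using TensorBoard's `EventAccumulator`, where the scalar tag names are assumptions since they depend on how the Trainer logged:

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Point at one run directory inside a local clone of this repo.
ea = EventAccumulator("runs/Jun24_18-18-58_ubuntu-System-Product-Name")
ea.Reload()

print(ea.Tags()["scalars"])            # lists the scalar tags actually present
for event in ea.Scalars("eval/loss"):  # tag name is an assumption
    print(event.step, event.value)
```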
special_tokens_map.json ADDED

{
  "cls_token": "[CLS]",
  "mask_token": "[MASK]",
  "pad_token": "[PAD]",
  "sep_token": "[SEP]",
  "unk_token": "[UNK]"
}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED

{
  "added_tokens_decoder": {
    "0": {
      "content": "[PAD]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "100": {
      "content": "[UNK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "101": {
      "content": "[CLS]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "102": {
      "content": "[SEP]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    },
    "103": {
      "content": "[MASK]",
      "lstrip": false,
      "normalized": false,
      "rstrip": false,
      "single_word": false,
      "special": true
    }
  },
  "clean_up_tokenization_spaces": true,
  "cls_token": "[CLS]",
  "do_lower_case": false,
  "mask_token": "[MASK]",
  "model_max_length": 512,
  "pad_token": "[PAD]",
  "sep_token": "[SEP]",
  "strip_accents": null,
  "tokenize_chinese_chars": true,
  "tokenizer_class": "BertTokenizer",
  "unk_token": "[UNK]"
}
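Note `do_lower_case: false`: the cased BERT tokenizer preserves capitalization, which matters for the company and place names that dominate airline news. A quick check (the sample sentence is made up):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-cased")
# Tokens keep their original casing because do_lower_case is false.
print(tok.tokenize("Delta expands routes to Tokyo"))
```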
training_args.bin ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:c231001fd06a02a360b2b8c6fa7357e69e2eac4359a412aa9b35fedfadabf77e
size 5176
vocab.txt ADDED
The diff for this file is too large to render. See raw diff