cartesinus committed "Update README.md" · commit 49a7198 (parent: f234e6d)

README.md CHANGED
```diff
@@ -21,7 +21,15 @@ should probably proofread and complete it, then remove this comment. -->
 
 # fedcsis-slot_baseline-xlm_r-en
 
-This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the
+This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the
+[leyzer-fedcsis](https://huggingface.co/cartesinus/leyzer-fedcsis) dataset.
+
+Results on test set:
+- Precision: 0.7767
+- Recall: 0.7991
+- F1: 0.7877
+- Accuracy: 0.9425
+
 It achieves the following results on the evaluation set:
 - Loss: 0.1097
 - Precision: 0.9705
@@ -74,6 +82,93 @@ The following hyperparameters were used during training:
 | 0.0052 | 14.0 | 3256 | 0.1052 | 0.9706 | 0.9715 | 0.9710 | 0.9860 |
 | 0.0031 | 15.0 | 4070 | 0.1097 | 0.9705 | 0.9723 | 0.9714 | 0.9859 |
 
+### Per slot evaluation on test set
+
+| slot_name | precision | recall | f1 | tc_size |
+|-----------|-----------|--------|----|---------|
+| album | 0.7000 | 0.8750 | 0.7778 | 8 |
+| album_name | 0.9091 | 0.6250 | 0.7407 | 16 |
+| album_type | 0.1842 | 0.5385 | 0.2745 | 13 |
+| album_type_1a | 0.0000 | 0.0000 | 0.0000 | 10 |
+| album_type_an | 0.0000 | 0.0000 | 0.0000 | 20 |
+| all_lang | 0.5556 | 0.7143 | 0.6250 | 7 |
+| artist | 0.7500 | 0.7857 | 0.7674 | 42 |
+| av_alias | 0.8333 | 0.5263 | 0.6452 | 19 |
+| caption | 0.8065 | 0.7576 | 0.7813 | 33 |
+| category | 0.8571 | 1.0000 | 0.9231 | 18 |
+| channel | 0.6786 | 0.8085 | 0.7379 | 47 |
+| channel_id | 0.7826 | 0.9000 | 0.8372 | 20 |
+| count | 0.5714 | 1.0000 | 0.7273 | 4 |
+| date | 0.8333 | 0.7500 | 0.7895 | 40 |
+| date_day | 1.0000 | 1.0000 | 1.0000 | 4 |
+| date_month | 1.0000 | 1.0000 | 1.0000 | 8 |
+| device_name | 0.8621 | 0.7576 | 0.8065 | 33 |
+| email | 1.0000 | 1.0000 | 1.0000 | 16 |
+| event_name | 0.5467 | 0.5325 | 0.5395 | 77 |
+| file_name | 0.7333 | 0.7857 | 0.7586 | 14 |
+| file_size | 1.0000 | 1.0000 | 1.0000 | 1 |
+| filename | 0.7083 | 0.7391 | 0.7234 | 23 |
+| filter | 0.8333 | 0.9375 | 0.8824 | 16 |
+| from | 1.0000 | 1.0000 | 1.0000 | 33 |
+| hashtag | 1.0000 | 0.6000 | 0.7500 | 10 |
+| img_query | 0.9388 | 0.9246 | 0.9316 | 199 |
+| label | 0.2500 | 1.0000 | 0.4000 | 1 |
+| location | 0.8319 | 0.9171 | 0.8724 | 205 |
+| mail | 1.0000 | 1.0000 | 1.0000 | 2 |
+| massage | 0.0000 | 0.0000 | 0.0000 | 1 |
+| mesage | 0.0000 | 0.0000 | 0.0000 | 1 |
+| message | 0.5856 | 0.5285 | 0.5556 | 123 |
+| mime_type | 0.6667 | 1.0000 | 0.8000 | 2 |
+| name | 0.9412 | 0.8767 | 0.9078 | 73 |
+| pathname | 0.7805 | 0.6809 | 0.7273 | 47 |
+| percent | 1.0000 | 0.9583 | 0.9787 | 24 |
+| phone_number | 1.0000 | 1.0000 | 1.0000 | 48 |
+| phone_type | 1.0000 | 0.9375 | 0.9677 | 16 |
+| picture_url | 1.0000 | 1.0000 | 1.0000 | 14 |
+| playlist | 0.7219 | 0.8134 | 0.7649 | 134 |
+| portal | 0.8108 | 0.7692 | 0.7895 | 39 |
+| power | 1.0000 | 1.0000 | 1.0000 | 1 |
+| priority | 0.6667 | 1.0000 | 0.8000 | 2 |
+| purpose | 1.0000 | 1.0000 | 1.0000 | 8 |
+| query | 0.6706 | 0.6064 | 0.6369 | 94 |
+| rating | 0.9167 | 0.9167 | 0.9167 | 12 |
+| review_count | 0.8750 | 0.7778 | 0.8235 | 9 |
+| section | 0.9091 | 0.9091 | 0.9091 | 22 |
+| seek_time | 0.6667 | 1.0000 | 0.8000 | 2 |
+| sender | 0.6000 | 0.6000 | 0.6000 | 10 |
+| sender_address | 0.6364 | 0.8750 | 0.7368 | 8 |
+| song | 0.5476 | 0.6133 | 0.5786 | 75 |
+| src_lang_de | 0.8765 | 0.9467 | 0.9103 | 75 |
+| src_lang_en | 0.6604 | 0.6481 | 0.6542 | 54 |
+| src_lang_es | 0.8132 | 0.9024 | 0.8555 | 82 |
+| src_lang_fr | 0.8795 | 0.9125 | 0.8957 | 80 |
+| src_lang_it | 0.8590 | 0.9437 | 0.8993 | 71 |
+| src_lang_pl | 0.7910 | 0.8833 | 0.8346 | 60 |
+| state | 1.0000 | 1.0000 | 1.0000 | 1 |
+| status | 0.5455 | 0.5000 | 0.5217 | 12 |
+| subject | 0.6154 | 0.5581 | 0.5854 | 86 |
+| text_de | 0.9091 | 0.9091 | 0.9091 | 77 |
+| text_en | 0.5909 | 0.5843 | 0.5876 | 89 |
+| text_es | 0.7935 | 0.8111 | 0.8022 | 90 |
+| text_esi | 0.0000 | 0.0000 | 0.0000 | 1 |
+| text_fr | 0.9125 | 0.8588 | 0.8848 | 85 |
+| text_it | 0.8205 | 0.9014 | 0.8591 | 71 |
+| text_multi | 0.3333 | 1.0000 | 0.5000 | 1 |
+| text_pl | 0.8167 | 0.7656 | 0.7903 | 64 |
+| time | 0.8750 | 1.0000 | 0.9333 | 7 |
+| to | 0.8927 | 0.9186 | 0.9054 | 172 |
+| topic | 0.4000 | 0.6667 | 0.5000 | 3 |
+| translator | 0.7991 | 0.9777 | 0.8794 | 179 |
+| trg_lang_de | 0.8116 | 0.8615 | 0.8358 | 65 |
+| trg_lang_en | 0.8841 | 0.8841 | 0.8841 | 69 |
+| trg_lang_es | 0.8906 | 0.8769 | 0.8837 | 65 |
+| trg_lang_fr | 0.8676 | 0.9365 | 0.9008 | 63 |
+| trg_lang_general | 0.8235 | 0.7368 | 0.7778 | 19 |
+| trg_lang_it | 0.8254 | 0.8667 | 0.8455 | 60 |
+| trg_lang_pl | 0.8077 | 0.8630 | 0.8344 | 73 |
+| txt_query | 0.5714 | 0.7059 | 0.6316 | 17 |
+| username | 0.6875 | 0.7333 | 0.7097 | 15 |
+| value | 0.7500 | 0.8571 | 0.8000 | 14 |
 
 ### Framework versions
 
```
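The per-slot precision/recall/F1 figures above are standard span-level metrics. As a minimal sketch (not the author's actual evaluation script; the `(slot, start, end)` tuple representation and the exact-match criterion are assumptions), metrics of this shape can be computed from gold and predicted slot spans like so:

```python
from collections import defaultdict

def per_slot_prf(gold, pred):
    """Span-level precision/recall/F1 per slot type.

    gold, pred: iterables of (slot_name, start, end) tuples, one per
    labelled span. A prediction counts as a true positive only on an
    exact (slot, span) match; everything else is a false positive,
    and unmatched gold spans are false negatives.
    """
    tp, fp, fn = defaultdict(int), defaultdict(int), defaultdict(int)
    gold_set, pred_set = set(gold), set(pred)
    for item in pred_set:
        slot = item[0]
        if item in gold_set:
            tp[slot] += 1
        else:
            fp[slot] += 1
    for item in gold_set - pred_set:
        fn[item[0]] += 1
    results = {}
    for slot in set(tp) | set(fp) | set(fn):
        p = tp[slot] / (tp[slot] + fp[slot]) if tp[slot] + fp[slot] else 0.0
        r = tp[slot] / (tp[slot] + fn[slot]) if tp[slot] + fn[slot] else 0.0
        f1 = 2 * p * r / (p + r) if p + r else 0.0
        results[slot] = (p, r, f1)
    return results
```

The F1 column is consistent with this definition: for the `album` row, 2 × 0.7000 × 0.8750 / (0.7000 + 0.8750) ≈ 0.7778, as reported.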