---
license: apache-2.0
datasets:
- mbruton/spanish_srl
- PropBank.Br
language:
- es
- pt
metrics:
- seqeval
library_name: transformers
pipeline_tag: token-classification
---

# Model Card for SpaXLM-R-pt for Semantic Role Labeling

This model is fine-tuned from a version of [XLM RoBERTa Base](https://huggingface.co/xlm-roberta-base) that was first pre-trained on the SRL task for Portuguese, and is one of 24 models introduced as part of [this project](https://github.com/mbruton0426/GalicianSRL).

## Model Details

### Model Description

SpaXLM-R-pt for Semantic Role Labeling (SRL) is a transformers model, leveraging XLM-R's extensive pretraining on 100 languages to achieve better SRL predictions for Spanish. This model is additionally pre-trained on the SRL task for Portuguese. It was fine-tuned on Spanish with the following objectives:

- Identify up to 16 verbal roots within a sentence.
- Identify available arguments and thematic roles for each verbal root.

Labels are formatted as `r#:tag`, where `r#` links the token to a specific verbal root of index `#`, and `tag` identifies the token as the verbal root (`root`) or as an individual argument (`arg0`/`arg1`/`arg2`/`arg3`/`argM`) with its thematic role (`adv`/`agt`/`atr`/`ben`/`cau`/`cot`/`des`/`efi`/`ein`/`exp`/`ext`/`fin`/`ins`/`loc`/`mnr`/`ori`/`pat`/`src`/`tem`/`tmp`).
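
As a small worked example, the sketch below splits one of these composite labels into its root index and tag; the helper name is hypothetical and not part of the released code.

```python
# Hypothetical helper (illustration only) for the r#:tag label scheme above.
def parse_srl_label(label: str) -> tuple[int, str]:
    root, _, tag = label.partition(":")  # split on the first ":" only
    return int(root), tag                # tag is "root" or e.g. "arg1:pat"

print(parse_srl_label("0:root"))      # -> (0, 'root')
print(parse_srl_label("1:arg0:agt"))  # -> (1, 'arg0:agt')
```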

- **Developed by:** [Micaella Bruton](mailto:micaellabruton@gmail.com)
- **Model type:** Transformers
- **Language(s) (NLP):** Spanish (es), Portuguese (pt)
- **License:** Apache 2.0
- **Finetuned from model:** [Portuguese pre-trained XLM RoBERTa Base](https://huggingface.co/liaad/srl-pt_xlmr-base)

### Model Sources

- **Repository:** [GalicianSRL](https://github.com/mbruton0426/GalicianSRL)
- **Paper:** To be updated

## Uses

This model is intended to be used to develop and improve natural language processing tools for Spanish.
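
Since the card declares `pipeline_tag: token-classification`, a minimal usage sketch with the `transformers` pipeline follows; the model id is a placeholder (the exact repository id is not stated here) and the example sentence is illustrative.

```python
from transformers import pipeline

# Load this model for token classification.
# "mbruton/MODEL_ID" is a placeholder; substitute the actual repository id.
srl = pipeline("token-classification", model="mbruton/MODEL_ID")

# Each prediction carries the token, its r#:tag label, and a confidence score.
for token in srl("La niña leyó el libro en la biblioteca."):
    print(token["word"], token["entity"], round(token["score"], 3))
```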

## Bias, Risks, and Limitations

The Spanish training set lacked highly complex sentences; as a result, the model performs much better on sentences of mid to low complexity.

## Training Details

### Training Data

This model was pre-trained on the [PropBank.Br Portuguese SRL corpus](http://www.nilc.icmc.usp.br/portlex/index.php/en/projects/propbankbringl).
It was then fine-tuned on the "train" portion of the [SpanishSRL Dataset](https://huggingface.co/datasets/mbruton/spanish_srl), produced as part of this same project.

#### Training Hyperparameters

- **Learning Rate:** 2e-5
- **Batch Size:** 16
- **Weight Decay:** 0.01
- **Early Stopping:** 10 epochs
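
For concreteness, here is a minimal sketch of how these settings could map onto Hugging Face `TrainingArguments`; it is a reconstruction under stated assumptions (hypothetical output path, epoch-level evaluation, and reading "Early Stopping: 10 epochs" as a patience of 10), not the author's released training script.

```python
from transformers import TrainingArguments, EarlyStoppingCallback

args = TrainingArguments(
    output_dir="spaxlmr-pt-srl",      # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    weight_decay=0.01,
    num_train_epochs=100,             # upper bound; early stopping ends training sooner
    evaluation_strategy="epoch",      # evaluate each epoch so early stopping can trigger
    save_strategy="epoch",
    load_best_model_at_end=True,      # required by EarlyStoppingCallback
)

# "Early Stopping: 10 epochs" is read here as a patience of 10 evaluation epochs.
stopper = EarlyStoppingCallback(early_stopping_patience=10)
# Pass `args` and `callbacks=[stopper]` to a Trainer wrapping a
# token-classification model and the tokenized train/validation splits.
```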

## Evaluation

#### Testing Data

This model was tested on the "test" portion of the [SpanishSRL Dataset](https://huggingface.co/datasets/mbruton/spanish_srl) produced as part of this same project.

#### Metrics

[seqeval](https://huggingface.co/spaces/evaluate-metric/seqeval) is a Python framework for sequence labeling evaluation. It can evaluate the performance of chunking tasks such as named-entity recognition, part-of-speech tagging, and semantic role labeling.
It provides scores both overall and per label type.

Overall:
- `accuracy`: the average [accuracy](https://huggingface.co/metrics/accuracy), on a scale between 0.0 and 1.0.
- `precision`: the average [precision](https://huggingface.co/metrics/precision), on a scale between 0.0 and 1.0.
- `recall`: the average [recall](https://huggingface.co/metrics/recall), on a scale between 0.0 and 1.0.
- `f1`: the average [F1 score](https://huggingface.co/metrics/f1), which is the harmonic mean of the precision and recall. It also has a scale of 0.0 to 1.0.

Per label type:
- `precision`: the average [precision](https://huggingface.co/metrics/precision), on a scale between 0.0 and 1.0.
- `recall`: the average [recall](https://huggingface.co/metrics/recall), on a scale between 0.0 and 1.0.
- `f1`: the average [F1 score](https://huggingface.co/metrics/f1), on a scale between 0.0 and 1.0.
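
For reference, seqeval can be called through the `evaluate` library as sketched below; the label sequences are toy placeholders in an IOB-style scheme (an assumption, since seqeval expects prefixed chunk tags), not real model output.

```python
import evaluate

# seqeval via the evaluate library (requires the seqeval package).
seqeval = evaluate.load("seqeval")

# Toy placeholder sequences: the card's r#:tag labels would first be
# mapped onto an IOB-style scheme before scoring.
references  = [["B-0:arg0:agt", "O", "B-0:root"]]
predictions = [["B-0:arg0:agt", "O", "B-0:root"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```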

### Results

| Label | Precision | Recall | F1-Score | Support |
| :----------: | :-------: | :----: | :------: | :-----: |
| 0:arg0:agt | 0.93 | 0.93 | 0.93 | 867 |
| 0:arg0:cau | 0.69 | 0.60 | 0.64 | 57 |
| 0:arg0:src | 0.00 | 0.00 | 0.00 | 1 |
| 0:arg1:ext | 0.00 | 0.00 | 0.00 | 3 |
| 0:arg1:pat | 0.88 | 0.89 | 0.88 | 536 |
| 0:arg1:tem | 0.88 | 0.89 | 0.89 | 589 |
| 0:arg2:atr | 0.85 | 0.89 | 0.87 | 278 |
| 0:arg2:ben | 0.81 | 0.90 | 0.85 | 78 |
| 0:arg2:efi | 0.67 | 0.29 | 0.40 | 7 |
| 0:arg2:exp | 0.33 | 0.17 | 0.22 | 6 |
| 0:arg2:ext | 0.57 | 0.53 | 0.55 | 15 |
| 0:arg2:loc | 0.73 | 0.33 | 0.46 | 57 |
| 0:arg3:ben | 1.00 | 0.20 | 0.33 | 5 |
| 0:arg3:ein | 0.50 | 1.00 | 0.67 | 1 |
| 0:arg3:fin | 0.50 | 0.50 | 0.50 | 2 |
| 0:arg3:ori | 0.67 | 0.60 | 0.63 | 10 |
| 0:arg4:des | 0.58 | 0.94 | 0.71 | 16 |
| 0:arg4:efi | 0.67 | 0.40 | 0.50 | 5 |
| 0:argM:adv | 0.58 | 0.60 | 0.59 | 268 |
| 0:argM:atr | 0.65 | 0.62 | 0.64 | 24 |
| 0:argM:cau | 0.79 | 0.56 | 0.66 | 41 |
| 0:argM:ext | 0.00 | 0.00 | 0.00 | 5 |
| 0:argM:fin | 0.80 | 0.78 | 0.79 | 46 |
| 0:argM:loc | 0.69 | 0.80 | 0.74 | 186 |
| 0:argM:mnr | 0.72 | 0.47 | 0.57 | 66 |
| 0:argM:tmp | 0.86 | 0.86 | 0.86 | 411 |
| 0:root | 0.99 | 0.99 | 0.99 | 1662 |
| 1:arg0:agt | 0.92 | 0.91 | 0.92 | 564 |
| 1:arg0:cau | 0.73 | 0.82 | 0.77 | 44 |
| 1:arg1:ext | 0.00 | 0.00 | 0.00 | 2 |
| 1:arg1:pat | 0.89 | 0.87 | 0.88 | 482 |
| 1:arg1:tem | 0.88 | 0.90 | 0.89 | 390 |
| 1:arg2:atr | 0.89 | 0.88 | 0.88 | 197 |
| 1:arg2:ben | 0.75 | 0.89 | 0.81 | 66 |
| 1:arg2:efi | 1.00 | 0.50 | 0.67 | 6 |
| 1:arg2:ext | 0.71 | 0.71 | 0.71 | 7 |
| 1:arg2:ins | 0.00 | 0.00 | 0.00 | 1 |
| 1:arg2:loc | 0.62 | 0.52 | 0.57 | 44 |
| 1:arg3:ben | 0.00 | 0.00 | 0.00 | 2 |
| 1:arg3:ein | 0.00 | 0.00 | 0.00 | 3 |
| 1:arg3:fin | 1.00 | 1.00 | 1.00 | 2 |
| 1:arg3:ori | 0.12 | 0.50 | 0.20 | 2 |
| 1:arg4:des | 0.47 | 0.90 | 0.62 | 10 |
| 1:arg4:efi | 0.50 | 0.50 | 0.50 | 2 |
| 1:argM:adv | 0.56 | 0.58 | 0.57 | 220 |
| 1:argM:atr | 0.67 | 0.74 | 0.70 | 19 |
| 1:argM:cau | 0.65 | 0.74 | 0.69 | 35 |
| 1:argM:ext | 0.00 | 0.00 | 0.00 | 7 |
| 1:argM:fin | 0.57 | 0.66 | 0.61 | 38 |
| 1:argM:loc | 0.74 | 0.74 | 0.74 | 156 |
| 1:argM:mnr | 0.60 | 0.27 | 0.37 | 44 |
| 1:argM:tmp | 0.83 | 0.81 | 0.82 | 247 |
| 1:root | 0.97 | 0.97 | 0.97 | 1323 |
| 2:arg0:agt | 0.86 | 0.90 | 0.88 | 336 |
| 2:arg0:cau | 0.79 | 0.77 | 0.78 | 35 |
| 2:arg0:exp | 0.00 | 0.00 | 0.00 | 1 |
| 2:arg0:src | 0.00 | 0.00 | 0.00 | 1 |
| 2:arg1:pat | 0.84 | 0.82 | 0.83 | 333 |
| 2:arg1:tem | 0.84 | 0.84 | 0.84 | 291 |
| 2:arg2:atr | 0.92 | 0.89 | 0.90 | 124 |
| 2:arg2:ben | 0.69 | 0.84 | 0.76 | 43 |
| 2:arg2:efi | 0.89 | 0.89 | 0.89 | 9 |
| 2:arg2:ext | 0.33 | 0.60 | 0.43 | 5 |
| 2:arg2:ins | 0.00 | 0.00 | 0.00 | 1 |
| 2:arg2:loc | 0.43 | 0.44 | 0.44 | 27 |
| 2:arg3:ben | 0.00 | 0.00 | 0.00 | 4 |
| 2:arg3:ein | 0.00 | 0.00 | 0.00 | 1 |
| 2:arg3:ori | 0.40 | 0.67 | 0.50 | 3 |
| 2:arg4:des | 0.50 | 0.88 | 0.64 | 16 |
| 2:arg4:efi | 0.00 | 0.00 | 0.00 | 6 |
| 2:argM:adv | 0.54 | 0.51 | 0.52 | 176 |
| 2:argM:atr | 0.56 | 0.33 | 0.42 | 15 |
| 2:argM:cau | 0.43 | 0.59 | 0.50 | 17 |
| 2:argM:ext | 0.00 | 0.00 | 0.00 | 4 |
| 2:argM:fin | 0.78 | 0.69 | 0.74 | 36 |
| 2:argM:ins | 0.00 | 0.00 | 0.00 | 1 |
| 2:argM:loc | 0.73 | 0.74 | 0.73 | 117 |
| 2:argM:mnr | 0.38 | 0.29 | 0.33 | 35 |
| 2:argM:tmp | 0.78 | 0.76 | 0.77 | 161 |
| 2:root | 0.93 | 0.94 | 0.94 | 913 |
| 3:arg0:agt | 0.86 | 0.87 | 0.86 | 227 |
| 3:arg0:cau | 0.71 | 0.71 | 0.71 | 14 |
| 3:arg1:pat | 0.81 | 0.83 | 0.82 | 199 |
| 3:arg1:tem | 0.78 | 0.81 | 0.79 | 160 |
| 3:arg2:atr | 0.78 | 0.77 | 0.78 | 79 |
| 3:arg2:ben | 0.69 | 0.93 | 0.79 | 27 |
| 3:arg2:efi | 0.00 | 0.00 | 0.00 | 1 |
| 3:arg2:ext | 0.00 | 0.00 | 0.00 | 3 |
| 3:arg2:loc | 0.50 | 0.38 | 0.43 | 21 |
| 3:arg3:ben | 0.00 | 0.00 | 0.00 | 3 |
| 3:arg3:ein | 0.00 | 0.00 | 0.00 | 2 |
| 3:arg3:ori | 0.00 | 0.00 | 0.00 | 3 |
| 3:arg4:des | 0.47 | 1.00 | 0.64 | 7 |
| 3:arg4:efi | 0.00 | 0.00 | 0.00 | 5 |
| 3:argM:adv | 0.51 | 0.47 | 0.49 | 98 |
| 3:argM:atr | 1.00 | 0.14 | 0.25 | 7 |
| 3:argM:cau | 0.50 | 0.31 | 0.38 | 13 |
| 3:argM:ext | 0.00 | 0.00 | 0.00 | 1 |
| 3:argM:fin | 0.56 | 0.67 | 0.61 | 15 |
| 3:argM:loc | 0.64 | 0.68 | 0.66 | 69 |
| 3:argM:mnr | 0.43 | 0.55 | 0.48 | 11 |
| 3:argM:tmp | 0.86 | 0.82 | 0.84 | 92 |
| 3:root | 0.92 | 0.93 | 0.92 | 569 |
| 4:arg0:agt | 0.86 | 0.81 | 0.83 | 119 |
| 4:arg0:cau | 1.00 | 0.67 | 0.80 | 6 |
| 4:arg1:pat | 0.71 | 0.75 | 0.73 | 87 |
| 4:arg1:tem | 0.85 | 0.75 | 0.80 | 109 |
| 4:arg2:atr | 0.75 | 0.92 | 0.83 | 53 |
| 4:arg2:ben | 0.53 | 0.82 | 0.64 | 11 |
| 4:arg2:ext | 0.00 | 0.00 | 0.00 | 1 |
| 4:arg2:loc | 0.58 | 0.64 | 0.61 | 11 |
| 4:arg3:ein | 0.00 | 0.00 | 0.00 | 1 |
| 4:arg3:ori | 0.00 | 0.00 | 0.00 | 1 |
| 4:arg4:des | 0.69 | 0.90 | 0.78 | 10 |
| 4:arg4:efi | 0.00 | 0.00 | 0.00 | 1 |
| 4:argM:adv | 0.56 | 0.60 | 0.58 | 50 |
| 4:argM:atr | 0.00 | 0.00 | 0.00 | 4 |
| 4:argM:cau | 0.14 | 0.33 | 0.20 | 3 |
| 4:argM:ext | 0.00 | 0.00 | 0.00 | 1 |
| 4:argM:fin | 0.64 | 0.64 | 0.64 | 11 |
| 4:argM:loc | 0.58 | 0.75 | 0.65 | 24 |
| 4:argM:mnr | 0.50 | 0.31 | 0.38 | 16 |
| 4:argM:tmp | 0.75 | 0.69 | 0.72 | 52 |
| 4:root | 0.90 | 0.91 | 0.90 | 322 |
| 5:arg0:agt | 0.79 | 0.88 | 0.83 | 72 |
| 5:arg0:cau | 1.00 | 0.40 | 0.57 | 5 |
| 5:arg1:pat | 0.64 | 0.65 | 0.64 | 71 |
| 5:arg1:tem | 0.81 | 0.61 | 0.69 | 41 |
| 5:arg2:atr | 0.62 | 0.48 | 0.54 | 21 |
| 5:arg2:ben | 0.43 | 1.00 | 0.60 | 6 |
| 5:arg2:efi | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg2:ext | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg2:loc | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg3:ein | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg4:des | 0.00 | 0.00 | 0.00 | 1 |
| 5:arg4:efi | 0.00 | 0.00 | 0.00 | 1 |
| 5:argM:adv | 0.33 | 0.35 | 0.34 | 26 |
| 5:argM:cau | 0.00 | 0.00 | 0.00 | 3 |
| 5:argM:fin | 0.50 | 0.80 | 0.62 | 5 |
| 5:argM:loc | 0.58 | 0.67 | 0.62 | 21 |
| 5:argM:mnr | 0.00 | 0.00 | 0.00 | 7 |
| 5:argM:tmp | 0.62 | 0.67 | 0.65 | 30 |
| 5:root | 0.82 | 0.84 | 0.83 | 173 |
| 6:arg0:agt | 0.69 | 0.53 | 0.60 | 34 |
| 6:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 6:arg1:loc | 0.00 | 0.00 | 0.00 | 1 |
| 6:arg1:pat | 0.43 | 0.82 | 0.57 | 28 |
| 6:arg1:tem | 0.39 | 0.44 | 0.41 | 16 |
| 6:arg2:atr | 0.31 | 0.38 | 0.34 | 13 |
| 6:arg2:ben | 0.50 | 0.60 | 0.55 | 5 |
| 6:arg2:loc | 0.00 | 0.00 | 0.00 | 1 |
| 6:arg3:ben | 0.00 | 0.00 | 0.00 | 1 |
| 6:argM:adv | 0.23 | 0.70 | 0.34 | 10 |
| 6:argM:atr | 0.00 | 0.00 | 0.00 | 2 |
| 6:argM:cau | 0.00 | 0.00 | 0.00 | 1 |
| 6:argM:fin | 0.33 | 0.50 | 0.40 | 2 |
| 6:argM:loc | 0.18 | 0.57 | 0.28 | 7 |
| 6:argM:mnr | 0.00 | 0.00 | 0.00 | 5 |
| 6:argM:tmp | 0.50 | 0.86 | 0.63 | 7 |
| 6:root | 0.65 | 0.59 | 0.62 | 82 |
| 7:arg0:agt | 0.35 | 0.88 | 0.50 | 17 |
| 7:arg1:pat | 0.54 | 0.82 | 0.65 | 17 |
| 7:arg1:tem | 0.59 | 0.67 | 0.62 | 15 |
| 7:arg2:atr | 0.53 | 0.53 | 0.53 | 15 |
| 7:arg2:ben | 0.40 | 0.29 | 0.33 | 7 |
| 7:arg2:loc | 0.00 | 0.00 | 0.00 | 1 |
| 7:arg3:ori | 0.00 | 0.00 | 0.00 | 1 |
| 7:arg4:des | 0.00 | 0.00 | 0.00 | 1 |
| 7:argM:adv | 0.14 | 0.20 | 0.17 | 5 |
| 7:argM:atr | 0.00 | 0.00 | 0.00 | 1 |
| 7:argM:fin | 0.00 | 0.00 | 0.00 | 1 |
| 7:argM:loc | 0.00 | 0.00 | 0.00 | 3 |
| 7:argM:tmp | 0.42 | 0.83 | 0.56 | 6 |
| 7:root | 0.54 | 0.84 | 0.66 | 45 |
| 8:arg0:agt | 0.00 | 0.00 | 0.00 | 8 |
| 8:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 8:arg1:pat | 0.00 | 0.00 | 0.00 | 4 |
| 8:arg1:tem | 0.21 | 0.56 | 0.30 | 9 |
| 8:arg2:atr | 0.08 | 0.25 | 0.12 | 4 |
| 8:arg2:ext | 0.00 | 0.00 | 0.00 | 1 |
| 8:arg2:loc | 0.00 | 0.00 | 0.00 | 2 |
| 8:arg3:ori | 0.00 | 0.00 | 0.00 | 1 |
| 8:argM:adv | 0.27 | 0.38 | 0.32 | 8 |
| 8:argM:ext | 0.00 | 0.00 | 0.00 | 1 |
| 8:argM:fin | 0.00 | 0.00 | 0.00 | 1 |
| 8:argM:loc | 0.00 | 0.00 | 0.00 | 4 |
| 8:argM:mnr | 0.00 | 0.00 | 0.00 | 1 |
| 8:argM:tmp | 0.00 | 0.00 | 0.00 | 1 |
| 8:root | 0.38 | 0.68 | 0.49 | 25 |
| 9:arg0:agt | 0.00 | 0.00 | 0.00 | 6 |
| 9:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 9:arg1:pat | 0.00 | 0.00 | 0.00 | 4 |
| 9:arg1:tem | 0.00 | 0.00 | 0.00 | 5 |
| 9:arg2:atr | 0.00 | 0.00 | 0.00 | 3 |
| 9:arg2:ben | 0.00 | 0.00 | 0.00 | 1 |
| 9:argM:adv | 0.00 | 0.00 | 0.00 | 6 |
| 9:argM:cau | 0.00 | 0.00 | 0.00 | 1 |
| 9:argM:fin | 0.00 | 0.00 | 0.00 | 2 |
| 9:argM:loc | 0.00 | 0.00 | 0.00 | 2 |
| 9:argM:tmp | 0.00 | 0.00 | 0.00 | 1 |
| 9:root | 0.04 | 0.06 | 0.05 | 17 |
| 10:arg0:agt | 0.00 | 0.00 | 0.00 | 3 |
| 10:arg1:pat | 0.00 | 0.00 | 0.00 | 5 |
| 10:arg1:tem | 0.00 | 0.00 | 0.00 | 3 |
| 10:arg2:atr | 0.00 | 0.00 | 0.00 | 1 |
| 10:arg2:ben | 0.00 | 0.00 | 0.00 | 2 |
| 10:argM:adv | 0.00 | 0.00 | 0.00 | 3 |
| 10:argM:fin | 0.00 | 0.00 | 0.00 | 1 |
| 10:argM:tmp | 0.00 | 0.00 | 0.00 | 1 |
| 10:root | 0.00 | 0.00 | 0.00 | 12 |
| 11:arg0:agt | 0.00 | 0.00 | 0.00 | 1 |
| 11:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 11:arg1:pat | 0.00 | 0.00 | 0.00 | 2 |
| 11:arg1:tem | 0.00 | 0.00 | 0.00 | 4 |
| 11:arg2:atr | 0.00 | 0.00 | 0.00 | 3 |
| 11:arg2:ben | 0.00 | 0.00 | 0.00 | 1 |
| 11:argM:adv | 0.00 | 0.00 | 0.00 | 4 |
| 11:argM:loc | 0.00 | 0.00 | 0.00 | 1 |
| 11:argM:tmp | 0.00 | 0.00 | 0.00 | 1 |
| 11:root | 0.00 | 0.00 | 0.00 | 9 |
| 12:arg0:agt | 0.00 | 0.00 | 0.00 | 3 |
| 12:arg1:pat | 0.00 | 0.00 | 0.00 | 1 |
| 12:arg1:tem | 0.00 | 0.00 | 0.00 | 2 |
| 12:arg2:atr | 0.00 | 0.00 | 0.00 | 2 |
| 12:argM:adv | 0.00 | 0.00 | 0.00 | 1 |
| 12:argM:cau | 0.00 | 0.00 | 0.00 | 1 |
| 12:argM:tmp | 0.00 | 0.00 | 0.00 | 3 |
| 12:root | 0.00 | 0.00 | 0.00 | 7 |
| 13:arg0:cau | 0.00 | 0.00 | 0.00 | 1 |
| 13:arg1:tem | 0.00 | 0.00 | 0.00 | 1 |
| 13:arg2:atr | 0.00 | 0.00 | 0.00 | 1 |
| 13:argM:adv | 0.00 | 0.00 | 0.00 | 1 |
| 13:argM:atr | 0.00 | 0.00 | 0.00 | 1 |
| 13:argM:loc | 0.00 | 0.00 | 0.00 | 1 |
| 13:root | 0.00 | 0.00 | 0.00 | 4 |
| 14:arg1:pat | 0.00 | 0.00 | 0.00 | 1 |
| 14:arg2:ben | 0.00 | 0.00 | 0.00 | 1 |
| 14:argM:mnr | 0.00 | 0.00 | 0.00 | 1 |
| 14:root | 0.00 | 0.00 | 0.00 | 2 |
| micro avg | 0.83 | 0.83 | 0.83 | 15436 |
| macro avg | 0.34 | 0.36 | 0.34 | 15436 |
| weighted avg | 0.83 | 0.83 | 0.83 | 15436 |
| tot root avg | 0.48 | 0.52 | 0.49 | 5165 |
| tot arg0:agt avg | 0.48 | 0.52 | 0.49 | 2257 |
| tot arg0:cau avg | 0.45 | 0.36 | 0.39 | 166 |
| tot arg0:exp avg | 0.00 | 0.00 | 0.00 | 1 |
| tot arg0:src avg | 0.00 | 0.00 | 0.00 | 2 |
| tot arg0 | 0.41 | 0.40 | 0.39 | 2426 |
| tot arg1:ext avg | 0.00 | 0.00 | 0.00 | 5 |
| tot arg1:loc avg | 0.00 | 0.00 | 0.00 | 1 |
| tot arg1:pat avg | 0.41 | 0.46 | 0.43 | 1770 |
| tot arg1:tem avg | 0.45 | 0.46 | 0.45 | 1635 |
| tot arg1 | 0.39 | 0.42 | 0.39 | 3411 |
| tot arg2:atr avg | 0.41 | 0.43 | 0.41 | 794 |
| tot arg2:ben avg | 0.41 | 0.56 | 0.46 | 255 |
| tot arg2:efi avg | 0.51 | 0.34 | 0.39 | 24 |
| tot arg2:exp avg | 0.33 | 0.17 | 0.22 | 6 |
| tot arg2:ext avg | 0.23 | 0.26 | 0.24 | 33 |
| tot arg2:ins avg | 0.00 | 0.00 | 0.00 | 2 |
| tot arg2:loc avg | 0.32 | 0.26 | 0.28 | 165 |
| tot arg2 | 0.36 | 0.38 | 0.36 | 1279 |
| tot arg3:ben avg | 0.20 | 0.04 | 0.07 | 15 |
| tot arg3:ein avg | 0.08 | 0.17 | 0.11 | 9 |
| tot arg3:fin avg | 0.75 | 0.75 | 0.75 | 4 |
| tot arg3:ori avg | 0.17 | 0.25 | 0.19 | 21 |
| tot arg3 | 0.21 | 0.22 | 0.19 | 49 |
| tot arg4:des avg | 0.39 | 0.66 | 0.48 | 61 |
| tot arg4:efi avg | 0.20 | 0.15 | 0.17 | 20 |
| tot arg4 | 0.30 | 0.42 | 0.34 | 81 |
| tot argM:adv avg | 0.27 | 0.31 | 0.28 | 876 |
| tot argM:atr avg | 0.36 | 0.23 | 0.25 | 73 |
| tot argM:cau avg | 0.28 | 0.28 | 0.27 | 115 |
| tot argM:ext avg | 0.00 | 0.00 | 0.00 | 19 |
| tot argM:fin avg | 0.38 | 0.43 | 0.40 | 158 |
| tot argM:ins avg | 0.00 | 0.00 | 0.00 | 1 |
| tot argM:loc avg | 0.35 | 0.41 | 0.37 | 591 |
| tot argM:mnr avg | 0.29 | 0.21 | 0.24 | 186 |
| tot argM:tmp avg | 0.43 | 0.48 | 0.45 | 1013 |
| tot argM | 0.31 | 0.32 | 0.30 | 3032 |
| tot r0 avg | 0.64 | 0.58 | 0.59 | 5242 |
| tot r1 avg | 0.58 | 0.59 | 0.57 | 3913 |
| tot r2 avg | 0.47 | 0.50 | 0.48 | 2711 |
| tot r3 avg | 0.48 | 0.47 | 0.45 | 1626 |
| tot r4 avg | 0.48 | 0.50 | 0.48 | 892 |
| tot r5 avg | 0.38 | 0.39 | 0.36 | 487 |
| tot r6 avg | 0.25 | 0.35 | 0.28 | 216 |
| tot r7 avg | 0.25 | 0.36 | 0.29 | 135 |
| tot r8 avg | 0.06 | 0.12 | 0.08 | 71 |
| tot r9 avg | 0.00 | 0.01 | 0.00 | 49 |
| tot r10 avg | 0.00 | 0.00 | 0.00 | 31 |
| tot r11 avg | 0.00 | 0.00 | 0.00 | 27 |
| tot r12 avg | 0.00 | 0.00 | 0.00 | 20 |
| tot r13 avg | 0.00 | 0.00 | 0.00 | 10 |
| tot r14 avg | 0.00 | 0.00 | 0.00 | 5 |

## Citation

**BibTeX:**

```
@mastersthesis{bruton-galician-srl-23,
  author = {Bruton, Micaella},
  title  = {BERTie Bott's Every Flavor Labels: A Tasty Guide to Developing a Semantic Role Labeling Model for Galician},
  school = {Uppsala University},
  year   = {2023},
  type   = {Master's thesis},
}
```