---
license: cc-by-nc-4.0
metrics:
- accuracy
- f1
datasets:
- orai-nlp/basqueGLUE
language:
- eu
pipeline_tag: token-classification
tags:
- NERC
---



[ElhBERTeu](https://huggingface.co/orai-nlp/ElhBERTeu) model fine-tuned for the NERC task of the BasqueGLUE benchmark.

The train and dev splits of the in-domain and out-of-domain NERC tasks were merged for training. Reported performance results are measured on the similarly merged test set.

Results on test set:

* accuracy = 0.9786
* f1 = 0.8666
* loss = 0.1131
* precision = 0.8651
* recall = 0.8681

Per class results: 

| CLASS    | precision | recall | F1 score | support |
|----------|:---------:|:------:|:--------:|:-------:|
|      LOC |   89.43%  | 87.95% |  88.68%  |   1844  |
|     MISC |   70.72%  | 67.75% |  69.20%  |   502   |
|      ORG |   80.04%  | 84.32% |  82.12%  |   1082  |
|      PER |   93.52%  | 94.57% |  94.04%  |   1359  |
|          |           |        |          |         |
|   GLOBAL |   86.51%  | 86.81% |  86.66%  |   4787  |
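As a sanity check, each F1 score in the table is the harmonic mean of the corresponding precision and recall. A minimal sketch, using the figures copied from the table above:

```python
def f1_score(precision_pct: float, recall_pct: float) -> float:
    """Harmonic mean of precision and recall, both given in percent."""
    return 2 * precision_pct * recall_pct / (precision_pct + recall_pct)

# Per-class figures from the table above: (precision %, recall %)
per_class = {
    "LOC": (89.43, 87.95),
    "MISC": (70.72, 67.75),
    "ORG": (80.04, 84.32),
    "PER": (93.52, 94.57),
}

for label, (p, r) in per_class.items():
    print(f"{label}: F1 = {f1_score(p, r):.2f}")

# Global row: precision 86.51%, recall 86.81% -> F1 = 86.66
print(f"GLOBAL: F1 = {f1_score(86.51, 86.81):.2f}")
```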


Tagset: 
["O", "B-ORG", "B-PER", "I-PER", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC"]


Fine-tuning details: 10 epochs, batch size 32, learning rate 3e-5.