---
tags:
- token-classification
datasets:
- djagatiya/ner-ontonotes-v5-eng-v4
---

# (NER) distilbert-base-uncased : conll2012_ontonotesv5-english-v4

This **distilbert-base-uncased** NER model was fine-tuned on the **conll2012_ontonotesv5-english-v4** dataset. <br>
Check out the [NER-System Repository](https://github.com/djagatiya/NER-System) for more information.
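
A minimal usage sketch with the Hugging Face `transformers` token-classification pipeline is shown below; the checkpoint ID is a placeholder, so substitute this model's actual Hub ID.

```python
from transformers import pipeline

# NOTE: placeholder checkpoint ID; replace it with this model's actual
# Hugging Face Hub ID before running.
ner = pipeline(
    "token-classification",
    model="<this-model's-hub-id>",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

print(ner("Barack Obama visited Paris in 2015."))
```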

## Evaluation

- Precision: 84.60
- Recall: 86.47
- F1-Score: 85.53
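
As a quick consistency check, the reported F1-Score is the harmonic mean of Precision and Recall (all values in percent):

$$
F_1 = \frac{2 \cdot P \cdot R}{P + R} = \frac{2 \cdot 84.60 \cdot 86.47}{84.60 + 86.47} \approx 85.5
$$

which matches the reported score up to rounding.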

> Check out the [eval.log](eval.log) file for the full evaluation metrics and classification report.

```
              precision    recall  f1-score   support

    CARDINAL       0.84      0.86      0.85       935
        DATE       0.83      0.88      0.85      1602
       EVENT       0.57      0.57      0.57        63
         FAC       0.55      0.62      0.58       135
         GPE       0.95      0.92      0.94      2240
    LANGUAGE       0.82      0.64      0.72        22
         LAW       0.50      0.50      0.50        40
         LOC       0.55      0.72      0.62       179
       MONEY       0.87      0.89      0.88       314
        NORP       0.85      0.89      0.87       841
     ORDINAL       0.81      0.88      0.84       195
         ORG       0.81      0.83      0.82      1795
     PERCENT       0.87      0.89      0.88       349
      PERSON       0.93      0.93      0.93      1988
     PRODUCT       0.55      0.55      0.55        76
    QUANTITY       0.71      0.80      0.75       105
        TIME       0.59      0.66      0.62       212
 WORK_OF_ART       0.42      0.44      0.43       166

   micro avg       0.85      0.86      0.86     11257
   macro avg       0.72      0.75      0.73     11257
weighted avg       0.85      0.86      0.86     11257
```
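
For reference, a per-entity report in this layout can be produced with the `seqeval` library. The snippet below is only an illustrative sketch: it assumes the gold and predicted tags are available as IOB2-tagged sequences, and the example sequences are made up.

```python
from seqeval.metrics import classification_report

# Illustrative IOB2-tagged sequences; in practice these come from the
# test split and the model's predictions.
y_true = [["B-PERSON", "I-PERSON", "O", "B-GPE", "O", "B-DATE"]]
y_pred = [["B-PERSON", "I-PERSON", "O", "B-GPE", "O", "B-DATE"]]

# Prints per-entity precision/recall/F1/support plus micro, macro and
# weighted averages, in the same layout as the report above.
print(classification_report(y_true, y_pred, digits=2))
```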