---
tags:
- token-classification
datasets:
- djagatiya/ner-ontonotes-v5-eng-v4
widget:
- text: "On September 1st George won 1 dollar while watching Game of Thrones."
---

# (NER) deberta-base : conll2012_ontonotesv5-english-v4

This `deberta-base` NER model was fine-tuned on the `english-v4` version of the `conll2012_ontonotesv5` dataset. <br>
Check out the [NER-System Repository](https://github.com/djagatiya/NER-System) for more information.

## Dataset

- conll2012_ontonotesv5
- Language: English
- Version: v4

| Split | Examples |
| --- | --- |
| Training | 75187 |
| Testing | 9479 |
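
For reference, the corpus can be pulled directly from the Hugging Face Hub with `datasets`; the sketch below is an assumption (config name `english_v4` taken to match the v4 / English version listed above), not the repository's own preprocessing code.

```python
# Minimal sketch (assumed, not the repository's own code): load the same
# corpus from the Hugging Face Hub. "english_v4" is assumed to be the config
# matching the v4 / English version listed above.
from datasets import load_dataset

# Newer `datasets` releases may additionally require trust_remote_code=True,
# since this dataset is backed by a loading script.
data = load_dataset("conll2012_ontonotesv5", "english_v4")
print(data)  # expected splits: train / validation / test
```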

## Evaluation

- Precision: 89.53
- Recall: 91.00
- F1-Score: 90.26

```
              precision    recall  f1-score   support

    CARDINAL       0.86      0.87      0.86       935
        DATE       0.85      0.89      0.87      1602
       EVENT       0.65      0.78      0.71        63
         FAC       0.74      0.80      0.77       135
         GPE       0.97      0.96      0.96      2240
    LANGUAGE       0.83      0.68      0.75        22
         LAW       0.71      0.68      0.69        40
         LOC       0.74      0.77      0.76       179
       MONEY       0.88      0.90      0.89       314
        NORP       0.94      0.97      0.95       841
     ORDINAL       0.79      0.87      0.83       195
         ORG       0.92      0.92      0.92      1795
     PERCENT       0.92      0.92      0.92       349
      PERSON       0.95      0.95      0.95      1988
     PRODUCT       0.65      0.76      0.70        76
    QUANTITY       0.77      0.82      0.80       105
        TIME       0.62      0.65      0.63       212
 WORK_OF_ART       0.64      0.69      0.66       166

   micro avg       0.90      0.91      0.90     11257
   macro avg       0.80      0.83      0.81     11257
weighted avg       0.90      0.91      0.90     11257
```
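
The per-label report above follows the usual entity-level format; as a minimal sketch, a report like this can be produced with `seqeval` (an assumption here; the repository's own evaluation code may differ).

```python
# Minimal sketch (assuming seqeval; the repository's evaluation code may
# differ): compute an entity-level report from IOB2 tag sequences.
from seqeval.metrics import classification_report, f1_score

y_true = [["B-PERSON", "O", "B-DATE", "I-DATE", "O"]]
y_pred = [["B-PERSON", "O", "B-DATE", "I-DATE", "O"]]

print(classification_report(y_true, y_pred, digits=2))
print("F1:", f1_score(y_true, y_pred))
```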

## Inference Script

> https://github.com/djagatiya/NER-System/blob/main/infer_pipeline.ipynb
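
The notebook above walks through the full inference code; as a quick hedged sketch, the model can also be used with the `transformers` token-classification pipeline (the model id below is a placeholder, substitute this model's actual Hub id).

```python
# Minimal sketch using the transformers pipeline; "<this-model-hub-id>" is a
# placeholder, not the confirmed repository id.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="<this-model-hub-id>",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)

text = "On September 1st George won 1 dollar while watching Game of Thrones."
for ent in ner(text):
    print(ent["entity_group"], ent["word"], round(float(ent["score"]), 3))
```

With the widget sentence above, this should produce DATE, PERSON, MONEY, and WORK_OF_ART spans.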