---
tags:
- spacy
- token-classification
language:
- en
license: mit
model-index:
- name: en_core_web_trf
  results:
  - task:
      name: NER
      type: token-classification
    metrics:
    - name: NER Precision
      type: precision
      value: 0.9017005601
    - name: NER Recall
      type: recall
      value: 0.8948818109
    - name: NER F Score
      type: f_score
      value: 0.8982782456
  - task:
      name: TAG
      type: token-classification
    metrics:
    - name: TAG (XPOS) Accuracy
      type: accuracy
      value: 0.9781415701
  - task:
      name: UNLABELED_DEPENDENCIES
      type: token-classification
    metrics:
    - name: Unlabeled Attachment Score (UAS)
      type: f_score
      value: 0.9519734881
  - task:
      name: LABELED_DEPENDENCIES
      type: token-classification
    metrics:
    - name: Labeled Attachment Score (LAS)
      type: f_score
      value: 0.9386831877
  - task:
      name: SENTS
      type: token-classification
    metrics:
    - name: Sentences F-Score
      type: f_score
      value: 0.9015817834
---
Details: https://spacy.io/models/en#en_core_web_trf
English transformer pipeline (roberta-base). Components: transformer, tagger, parser, ner, attribute_ruler, lemmatizer.
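A minimal usage sketch (assuming the package is installed, e.g. with `pip install spacy[transformers]` followed by `python -m spacy download en_core_web_trf`):

```python
import spacy

# Load the packaged transformer pipeline.
nlp = spacy.load("en_core_web_trf")

doc = nlp("Apple is looking at buying U.K. startup for $1 billion.")

# Named entities predicted by the `ner` component.
for ent in doc.ents:
    print(ent.text, ent.label_)

# Fine-grained POS tags and dependency labels from `tagger` and `parser`.
for token in doc:
    print(token.text, token.tag_, token.dep_)
```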
| Feature | Description |
| --- | --- |
| **Name** | `en_core_web_trf` |
| **Version** | `3.4.0` |
| **spaCy** | `>=3.4.0,<3.5.0` |
| **Default Pipeline** | `transformer`, `tagger`, `parser`, `attribute_ruler`, `lemmatizer`, `ner` |
| **Components** | `transformer`, `tagger`, `parser`, `attribute_ruler`, `lemmatizer`, `ner` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | OntoNotes 5 (Ralph Weischedel, Martha Palmer, Mitchell Marcus, Eduard Hovy, Sameer Pradhan, Lance Ramshaw, Nianwen Xue, Ann Taylor, Jeff Kaufman, Michelle Franchini, Mohammed El-Bachouti, Robert Belvin, Ann Houston); ClearNLP Constituent-to-Dependency Conversion (Emory University); WordNet 3.0 (Princeton University); roberta-base (Yinhan Liu, Myle Ott, Naman Goyal, Jingfei Du, Mandar Joshi, Danqi Chen, Omer Levy, Mike Lewis, Luke Zettlemoyer, Veselin Stoyanov) |
| **License** | MIT |
| **Author** | Explosion |
### Label Scheme

112 labels for 3 components:
| Component | Labels |
| --- | --- |
| `tagger` | `$`, `''`, `,`, `-LRB-`, `-RRB-`, `.`, `:`, `ADD`, `AFX`, `CC`, `CD`, `DT`, `EX`, `FW`, `HYPH`, `IN`, `JJ`, `JJR`, `JJS`, `LS`, `MD`, `NFP`, `NN`, `NNP`, `NNPS`, `NNS`, `PDT`, `POS`, `PRP`, `PRP$`, `RB`, `RBR`, `RBS`, `RP`, `SYM`, `TO`, `UH`, `VB`, `VBD`, `VBG`, `VBN`, `VBP`, `VBZ`, `WDT`, `WP`, `WP$`, `WRB`, `XX`, ``` `` ``` |
| `parser` | `ROOT`, `acl`, `acomp`, `advcl`, `advmod`, `agent`, `amod`, `appos`, `attr`, `aux`, `auxpass`, `case`, `cc`, `ccomp`, `compound`, `conj`, `csubj`, `csubjpass`, `dative`, `dep`, `det`, `dobj`, `expl`, `intj`, `mark`, `meta`, `neg`, `nmod`, `npadvmod`, `nsubj`, `nsubjpass`, `nummod`, `oprd`, `parataxis`, `pcomp`, `pobj`, `poss`, `preconj`, `predet`, `prep`, `prt`, `punct`, `quantmod`, `relcl`, `xcomp` |
| `ner` | `CARDINAL`, `DATE`, `EVENT`, `FAC`, `GPE`, `LANGUAGE`, `LAW`, `LOC`, `MONEY`, `NORP`, `ORDINAL`, `ORG`, `PERCENT`, `PERSON`, `PRODUCT`, `QUANTITY`, `TIME`, `WORK_OF_ART` |
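The meaning of any individual label can be looked up with spaCy's built-in `explain` helper, for example:

```python
import spacy

# Short human-readable descriptions for tag, dependency, and entity labels.
print(spacy.explain("NNP"))    # noun, proper singular
print(spacy.explain("nsubj"))  # nominal subject
print(spacy.explain("GPE"))    # Countries, cities, states
```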
### Accuracy

| Type | Score |
| --- | --- |
| `TOKEN_ACC` | 99.93 |
| `TOKEN_P` | 99.57 |
| `TOKEN_R` | 99.58 |
| `TOKEN_F` | 99.57 |
| `TAG_ACC` | 97.81 |
| `SENTS_P` | 95.30 |
| `SENTS_R` | 85.54 |
| `SENTS_F` | 90.16 |
| `DEP_UAS` | 95.20 |
| `DEP_LAS` | 93.87 |
| `ENTS_P` | 90.17 |
| `ENTS_R` | 89.49 |
| `ENTS_F` | 89.83 |
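These scores are also shipped with the installed package. As a sketch, assuming the spaCy v3 packaging convention of storing evaluation results under a `performance` key in `meta.json` (verify against your installed version):

```python
import spacy

nlp = spacy.load("en_core_web_trf")

# "performance" is an assumption about the packaged meta.json layout;
# fall back to an empty dict if the key is absent.
performance = nlp.meta.get("performance", {})
print(performance.get("ents_f"))  # NER F-score, e.g. ~0.8983
```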