DistilBERT base cased, fine-tuned for NER on the conll2003 English dataset. Note that this model is case-sensitive: "english" is treated differently from "English". For the case-insensitive version, please use elastic/distilbert-base-uncased-finetuned-conll03-english.
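A minimal sketch of running inference with this model, assuming the standard `transformers` token-classification pipeline (the pipeline call is shown commented out to keep the snippet self-contained); `merge_entities` is a hypothetical helper illustrating how token-level B-/I- predictions can be grouped into whole entities:

```python
# from transformers import pipeline
# ner = pipeline("ner", model="elastic/distilbert-base-cased-finetuned-conll03-english")
# token_preds = ner("My name is Clara and I live in Berkeley.")

def merge_entities(token_preds):
    """Group consecutive B-/I- token predictions into whole entities.

    Assumes each prediction is a dict with "entity" (e.g. "B-PER") and
    "word" keys, as returned by the NER pipeline; "O" tokens are not
    emitted by the pipeline by default, so they are not handled here.
    """
    entities = []
    for pred in token_preds:
        prefix, _, etype = pred["entity"].partition("-")
        # Start a new entity on a B- tag, or when the type changes.
        if prefix == "B" or not entities or entities[-1]["type"] != etype:
            entities.append({"type": etype, "words": [pred["word"]]})
        else:
            entities[-1]["words"].append(pred["word"])
    return [{"type": e["type"], "text": " ".join(e["words"])} for e in entities]
```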
Versions
- Transformers version: 4.3.1
- Datasets version: 1.3.0
Training
```bash
$ run_ner.py \
  --model_name_or_path distilbert-base-cased \
  --label_all_tokens True \
  --return_entity_level_metrics True \
  --dataset_name conll2003 \
  --output_dir /tmp/distilbert-base-cased-finetuned-conll03-english \
  --do_train \
  --do_eval
```
After training, we update the model's label mapping to match the NER-specific labels from the conll2003 dataset.
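The label update above can be sketched as follows. The label list is the `ner_tags` order from the conll2003 dataset; building plain `id2label`/`label2id` dicts here is a simplified stand-in for updating the model's config, not the exact procedure used for this model:

```python
# ner_tags label order from the conll2003 dataset.
conll2003_labels = [
    "O", "B-PER", "I-PER", "B-ORG", "I-ORG",
    "B-LOC", "I-LOC", "B-MISC", "I-MISC",
]

# Map integer class ids to NER labels and back; these dicts stand in
# for the id2label/label2id entries in the model's config.
id2label = {i: label for i, label in enumerate(conll2003_labels)}
label2id = {label: i for i, label in id2label.items()}

config = {"id2label": id2label, "label2id": label2id}
```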
Evaluation results
- Accuracy on the conll2003 validation set: 0.983 (verified)
- Precision on the conll2003 validation set: 0.986 (verified)
- Recall on the conll2003 validation set: 0.988 (verified)
- F1 on the conll2003 validation set: 0.987 (verified)
- Loss on the conll2003 validation set: 0.077 (verified)
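As a quick sanity check, the reported F1 is consistent with the reported precision and recall via the harmonic mean, F1 = 2PR / (P + R):

```python
# Reported validation metrics for this model.
precision, recall = 0.986, 0.988

# Harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
# rounds to the reported 0.987
```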