
distilbert-base-uncased-finetuned-ner

This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set (classes 0–3 are the dataset's numeric label ids; their names are not documented):

  • Loss: 0.9802
  • Class 0 precision: 0.9692
  • Class 0 recall: 0.9435
  • Class 0 F1-score: 0.9562
  • Class 1 precision: 0.7997
  • Class 1 recall: 0.8792
  • Class 1 F1-score: 0.8376
  • Class 2 precision: 0.6988
  • Class 2 recall: 0.8028
  • Class 2 F1-score: 0.7472
  • Class 3 precision: 0.7917
  • Class 3 recall: 0.8548
  • Class 3 F1-score: 0.8220
  • Accuracy: 0.9252
  • Macro avg precision: 0.8148
  • Macro avg recall: 0.8701
  • Macro avg F1-score: 0.8407
  • Weighted avg precision: 0.9295
  • Weighted avg recall: 0.9252
  • Weighted avg F1-score: 0.9268
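
For quick experimentation, the checkpoint can be loaded through the Transformers token-classification pipeline. Below is a minimal inference sketch, assuming the hub repository id antoineedy/distilbert-base-uncased-finetuned-ner; since the training dataset is undocumented, the predicted tags may surface as generic LABEL_0 … LABEL_3 names.

```python
# Minimal inference sketch: load the fine-tuned checkpoint and tag a sentence.
# The example sentence is arbitrary; the repository id is this model's hub id.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="antoineedy/distilbert-base-uncased-finetuned-ner",
    aggregation_strategy="simple",  # merge word pieces into whole spans
)

# Each prediction carries a label, a confidence score, and character offsets.
for entity in ner("Hugging Face was founded in New York City."):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```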

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding Trainer setup follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 60
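
These values map directly onto Hugging Face TrainingArguments; the Adam betas and epsilon listed above are the library defaults. The sketch below shows an equivalent Trainer setup, not the author's original script: train_ds, eval_ds, and num_labels=4 are assumptions (the dataset is undocumented, but four classes appear in the evaluation results).

```python
# Sketch of a Trainer configuration matching the hyperparameters above.
# Not the published training script; dataset objects are placeholders.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "distilbert-base-uncased",
    num_labels=4,  # assumption: classes 0-3 in the evaluation results
)

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-ner",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=60,
    evaluation_strategy="epoch",  # the results table reports one row per epoch
)

train_ds = eval_ds = None  # placeholders: replace with tokenized, label-aligned datasets

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
# trainer.train()
```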

Training results

| Training Loss | Epoch | Step | Validation Loss | Class 0 Precision | Class 0 Recall | Class 0 F1-score | Class 1 Precision | Class 1 Recall | Class 1 F1-score | Class 2 Precision | Class 2 Recall | Class 2 F1-score | Class 3 Precision | Class 3 Recall | Class 3 F1-score | Accuracy | Macro Avg Precision | Macro Avg Recall | Macro Avg F1-score | Weighted Avg Precision | Weighted Avg Recall | Weighted Avg F1-score |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 67 | 0.4250 | 0.9903 | 0.7848 | 0.8756 | 0.5423 | 0.9449 | 0.6891 | 0.3268 | 0.8720 | 0.4755 | 0.5597 | 0.7669 | 0.6471 | 0.8010 | 0.6048 | 0.8421 | 0.6718 | 0.8904 | 0.8010 | 0.8249 |
| No log | 2.0 | 134 | 0.3689 | 0.9870 | 0.8424 | 0.9089 | 0.6009 | 0.9361 | 0.7319 | 0.48 | 0.8304 | 0.6084 | 0.6025 | 0.8957 | 0.7204 | 0.8539 | 0.6676 | 0.8761 | 0.7424 | 0.9027 | 0.8539 | 0.8664 |
| No log | 3.0 | 201 | 0.3369 | 0.9925 | 0.8111 | 0.8926 | 0.5395 | 0.9591 | 0.6905 | 0.4491 | 0.8997 | 0.5991 | 0.6213 | 0.9059 | 0.7371 | 0.8348 | 0.6506 | 0.8939 | 0.7298 | 0.9018 | 0.8348 | 0.8507 |
| No log | 4.0 | 268 | 0.3532 | 0.9863 | 0.8815 | 0.9310 | 0.6240 | 0.9343 | 0.7482 | 0.5593 | 0.8651 | 0.6793 | 0.7202 | 0.8896 | 0.7960 | 0.8859 | 0.7224 | 0.8926 | 0.7886 | 0.9164 | 0.8859 | 0.8941 |
| No log | 5.0 | 335 | 0.4184 | 0.9867 | 0.8817 | 0.9313 | 0.6243 | 0.9325 | 0.7479 | 0.6076 | 0.8304 | 0.7018 | 0.6813 | 0.9182 | 0.7822 | 0.8865 | 0.7250 | 0.8907 | 0.7908 | 0.9160 | 0.8865 | 0.8942 |
| No log | 6.0 | 402 | 0.4253 | 0.9800 | 0.9019 | 0.9393 | 0.6831 | 0.9112 | 0.7808 | 0.6247 | 0.8581 | 0.7230 | 0.7138 | 0.8875 | 0.7912 | 0.8997 | 0.7504 | 0.8897 | 0.8086 | 0.9189 | 0.8997 | 0.9051 |
| No log | 7.0 | 469 | 0.4059 | 0.9851 | 0.8892 | 0.9347 | 0.6340 | 0.9414 | 0.7577 | 0.5906 | 0.8685 | 0.7031 | 0.7428 | 0.8916 | 0.8104 | 0.8930 | 0.7381 | 0.8977 | 0.8015 | 0.9194 | 0.8930 | 0.9000 |
| 0.264 | 8.0 | 536 | 0.4724 | 0.9802 | 0.9105 | 0.9441 | 0.7097 | 0.8988 | 0.7931 | 0.6256 | 0.8616 | 0.7249 | 0.735 | 0.9018 | 0.8099 | 0.9067 | 0.7626 | 0.8932 | 0.8180 | 0.9230 | 0.9067 | 0.9114 |
| 0.264 | 9.0 | 603 | 0.4683 | 0.9787 | 0.9101 | 0.9432 | 0.6766 | 0.9254 | 0.7817 | 0.6231 | 0.8581 | 0.7220 | 0.7807 | 0.8589 | 0.8179 | 0.9053 | 0.7648 | 0.8881 | 0.8162 | 0.9223 | 0.9053 | 0.9102 |
| 0.264 | 10.0 | 670 | 0.5353 | 0.9728 | 0.9328 | 0.9524 | 0.7942 | 0.8703 | 0.8305 | 0.6515 | 0.8408 | 0.7341 | 0.7558 | 0.8732 | 0.8102 | 0.9189 | 0.7935 | 0.8793 | 0.8318 | 0.9270 | 0.9189 | 0.9216 |
| 0.264 | 11.0 | 737 | 0.5061 | 0.9786 | 0.9130 | 0.9447 | 0.7347 | 0.9147 | 0.8149 | 0.6010 | 0.8651 | 0.7092 | 0.7469 | 0.8753 | 0.8060 | 0.9082 | 0.7653 | 0.8920 | 0.8187 | 0.9237 | 0.9082 | 0.9128 |
| 0.264 | 12.0 | 804 | 0.6254 | 0.9703 | 0.9355 | 0.9526 | 0.7444 | 0.8792 | 0.8062 | 0.6945 | 0.8339 | 0.7579 | 0.7926 | 0.8364 | 0.8139 | 0.9188 | 0.8005 | 0.8713 | 0.8326 | 0.9255 | 0.9188 | 0.9211 |
| 0.264 | 13.0 | 871 | 0.6908 | 0.9704 | 0.9376 | 0.9537 | 0.7776 | 0.8508 | 0.8126 | 0.7096 | 0.8201 | 0.7608 | 0.7566 | 0.8773 | 0.8125 | 0.9204 | 0.8035 | 0.8714 | 0.8349 | 0.9263 | 0.9204 | 0.9225 |
| 0.264 | 14.0 | 938 | 0.6405 | 0.9716 | 0.9263 | 0.9484 | 0.7227 | 0.8934 | 0.7990 | 0.6842 | 0.8097 | 0.7417 | 0.7527 | 0.8405 | 0.7942 | 0.9119 | 0.7828 | 0.8675 | 0.8208 | 0.9212 | 0.9119 | 0.9149 |
| 0.0528 | 15.0 | 1005 | 0.7143 | 0.9718 | 0.9380 | 0.9546 | 0.7668 | 0.8703 | 0.8153 | 0.7091 | 0.8097 | 0.7561 | 0.7740 | 0.8753 | 0.8215 | 0.9218 | 0.8054 | 0.8733 | 0.8369 | 0.9278 | 0.9218 | 0.9239 |
| 0.0528 | 16.0 | 1072 | 0.7162 | 0.9694 | 0.9374 | 0.9531 | 0.7634 | 0.8828 | 0.8188 | 0.7096 | 0.8201 | 0.7608 | 0.7765 | 0.8384 | 0.8063 | 0.9201 | 0.8047 | 0.8697 | 0.8348 | 0.9258 | 0.9201 | 0.9221 |
| 0.0528 | 17.0 | 1139 | 0.7823 | 0.9661 | 0.9412 | 0.9535 | 0.7960 | 0.8455 | 0.8200 | 0.7212 | 0.8235 | 0.7690 | 0.7477 | 0.8364 | 0.7896 | 0.9200 | 0.8078 | 0.8617 | 0.8330 | 0.9244 | 0.9200 | 0.9216 |
| 0.0528 | 18.0 | 1206 | 0.7009 | 0.9729 | 0.9299 | 0.9509 | 0.7326 | 0.8810 | 0.8 | 0.6891 | 0.8131 | 0.7460 | 0.7685 | 0.8691 | 0.8157 | 0.9160 | 0.7908 | 0.8733 | 0.8282 | 0.9244 | 0.9160 | 0.9188 |
| 0.0528 | 19.0 | 1273 | 0.7972 | 0.9689 | 0.9384 | 0.9534 | 0.8037 | 0.8579 | 0.8299 | 0.6859 | 0.8235 | 0.7484 | 0.7487 | 0.8528 | 0.7973 | 0.9200 | 0.8018 | 0.8681 | 0.8323 | 0.9257 | 0.9200 | 0.9221 |
| 0.0528 | 20.0 | 1340 | 0.8604 | 0.9650 | 0.9482 | 0.9565 | 0.8072 | 0.8401 | 0.8233 | 0.7183 | 0.8028 | 0.7582 | 0.7859 | 0.8405 | 0.8123 | 0.9244 | 0.8191 | 0.8579 | 0.8376 | 0.9272 | 0.9244 | 0.9255 |
| 0.0528 | 21.0 | 1407 | 0.7864 | 0.9713 | 0.9372 | 0.9540 | 0.7910 | 0.8739 | 0.8304 | 0.6791 | 0.8201 | 0.7429 | 0.7595 | 0.8589 | 0.8061 | 0.9208 | 0.8002 | 0.8725 | 0.8334 | 0.9271 | 0.9208 | 0.9230 |
| 0.0528 | 22.0 | 1474 | 0.8004 | 0.9714 | 0.9384 | 0.9546 | 0.7615 | 0.8845 | 0.8184 | 0.7130 | 0.8166 | 0.7613 | 0.7767 | 0.8466 | 0.8102 | 0.9215 | 0.8056 | 0.8715 | 0.8361 | 0.9274 | 0.9215 | 0.9236 |
| 0.0192 | 23.0 | 1541 | 0.8033 | 0.9722 | 0.9386 | 0.9551 | 0.7793 | 0.8845 | 0.8286 | 0.6985 | 0.8097 | 0.75 | 0.7715 | 0.8630 | 0.8147 | 0.9226 | 0.8054 | 0.8739 | 0.8371 | 0.9285 | 0.9226 | 0.9247 |
| 0.0192 | 24.0 | 1608 | 0.8725 | 0.9661 | 0.9480 | 0.9570 | 0.8325 | 0.8472 | 0.8398 | 0.6939 | 0.8235 | 0.7532 | 0.7859 | 0.8405 | 0.8123 | 0.9258 | 0.8196 | 0.8648 | 0.8405 | 0.9292 | 0.9258 | 0.9271 |
| 0.0192 | 25.0 | 1675 | 0.8488 | 0.9708 | 0.9393 | 0.9548 | 0.8020 | 0.8632 | 0.8315 | 0.6879 | 0.8235 | 0.7496 | 0.7599 | 0.8671 | 0.8099 | 0.9223 | 0.8051 | 0.8733 | 0.8365 | 0.9281 | 0.9223 | 0.9243 |
| 0.0192 | 26.0 | 1742 | 0.8193 | 0.9730 | 0.9339 | 0.9531 | 0.7477 | 0.8845 | 0.8104 | 0.7122 | 0.8304 | 0.7668 | 0.7642 | 0.8548 | 0.8069 | 0.9192 | 0.7993 | 0.8759 | 0.8343 | 0.9265 | 0.9192 | 0.9217 |
| 0.0192 | 27.0 | 1809 | 0.8800 | 0.9687 | 0.9451 | 0.9567 | 0.8033 | 0.8561 | 0.8289 | 0.7156 | 0.8270 | 0.7673 | 0.7772 | 0.8487 | 0.8113 | 0.9250 | 0.8162 | 0.8692 | 0.8411 | 0.9290 | 0.9250 | 0.9265 |
| 0.0192 | 28.0 | 1876 | 0.8399 | 0.9719 | 0.9366 | 0.9539 | 0.7545 | 0.8845 | 0.8144 | 0.7178 | 0.8097 | 0.7610 | 0.7721 | 0.8589 | 0.8132 | 0.9208 | 0.8041 | 0.8724 | 0.8356 | 0.9271 | 0.9208 | 0.9229 |
| 0.0192 | 29.0 | 1943 | 0.9358 | 0.9682 | 0.9466 | 0.9573 | 0.8084 | 0.8544 | 0.8307 | 0.7191 | 0.8062 | 0.7602 | 0.7788 | 0.8569 | 0.8160 | 0.9258 | 0.8186 | 0.8660 | 0.8410 | 0.9293 | 0.9258 | 0.9272 |
| 0.0081 | 30.0 | 2010 | 0.8878 | 0.9711 | 0.9407 | 0.9556 | 0.8123 | 0.8455 | 0.8285 | 0.6965 | 0.8339 | 0.7591 | 0.75 | 0.8773 | 0.8087 | 0.9230 | 0.8075 | 0.8743 | 0.8380 | 0.9288 | 0.9230 | 0.9251 |
| 0.0081 | 31.0 | 2077 | 0.8791 | 0.9698 | 0.9437 | 0.9566 | 0.8046 | 0.8632 | 0.8329 | 0.6968 | 0.8270 | 0.7563 | 0.7809 | 0.8528 | 0.8152 | 0.9249 | 0.8130 | 0.8717 | 0.8403 | 0.9295 | 0.9249 | 0.9266 |
| 0.0081 | 32.0 | 2144 | 0.8428 | 0.9743 | 0.9376 | 0.9556 | 0.7850 | 0.8757 | 0.8279 | 0.6928 | 0.8270 | 0.7539 | 0.7695 | 0.8875 | 0.8243 | 0.9237 | 0.8054 | 0.8819 | 0.8404 | 0.9303 | 0.9237 | 0.9259 |
| 0.0081 | 33.0 | 2211 | 0.8576 | 0.9716 | 0.9389 | 0.9550 | 0.7852 | 0.8828 | 0.8311 | 0.6844 | 0.8478 | 0.7573 | 0.7924 | 0.8507 | 0.8205 | 0.9235 | 0.8084 | 0.8800 | 0.8410 | 0.9295 | 0.9235 | 0.9256 |
| 0.0081 | 34.0 | 2278 | 0.9266 | 0.9699 | 0.9468 | 0.9582 | 0.8156 | 0.8721 | 0.8429 | 0.7118 | 0.8374 | 0.7695 | 0.7935 | 0.8487 | 0.8202 | 0.9282 | 0.8227 | 0.8762 | 0.8477 | 0.9321 | 0.9282 | 0.9297 |
| 0.0081 | 35.0 | 2345 | 0.9253 | 0.9695 | 0.9407 | 0.9549 | 0.7776 | 0.8757 | 0.8237 | 0.7048 | 0.8097 | 0.7536 | 0.7906 | 0.8569 | 0.8224 | 0.9230 | 0.8106 | 0.8707 | 0.8387 | 0.9280 | 0.9230 | 0.9248 |
| 0.0081 | 36.0 | 2412 | 0.9402 | 0.9697 | 0.9462 | 0.9578 | 0.8087 | 0.8632 | 0.8351 | 0.6886 | 0.8339 | 0.7543 | 0.8043 | 0.8487 | 0.8259 | 0.9269 | 0.8178 | 0.8730 | 0.8433 | 0.9311 | 0.9269 | 0.9284 |
| 0.0081 | 37.0 | 2479 | 0.9513 | 0.9692 | 0.9430 | 0.9559 | 0.7977 | 0.8686 | 0.8316 | 0.6962 | 0.8166 | 0.7516 | 0.7868 | 0.8528 | 0.8184 | 0.9243 | 0.8125 | 0.8702 | 0.8394 | 0.9288 | 0.9243 | 0.9259 |
| 0.0044 | 38.0 | 2546 | 0.9609 | 0.9699 | 0.9416 | 0.9556 | 0.7958 | 0.8721 | 0.8322 | 0.7134 | 0.8097 | 0.7585 | 0.7737 | 0.8671 | 0.8177 | 0.9243 | 0.8132 | 0.8726 | 0.8410 | 0.9290 | 0.9243 | 0.9260 |
| 0.0044 | 39.0 | 2613 | 0.9623 | 0.9687 | 0.9443 | 0.9563 | 0.7967 | 0.8703 | 0.8319 | 0.7169 | 0.8062 | 0.7590 | 0.7857 | 0.8548 | 0.8188 | 0.9252 | 0.8170 | 0.8689 | 0.8415 | 0.9291 | 0.9252 | 0.9267 |
| 0.0044 | 40.0 | 2680 | 0.9215 | 0.9718 | 0.9386 | 0.9549 | 0.7882 | 0.8792 | 0.8312 | 0.7045 | 0.8166 | 0.7564 | 0.7608 | 0.8650 | 0.8096 | 0.9226 | 0.8063 | 0.8749 | 0.8380 | 0.9284 | 0.9226 | 0.9246 |
| 0.0044 | 41.0 | 2747 | 0.9658 | 0.9688 | 0.9435 | 0.9560 | 0.7901 | 0.8757 | 0.8307 | 0.7196 | 0.7993 | 0.7574 | 0.7820 | 0.8507 | 0.8149 | 0.9244 | 0.8151 | 0.8673 | 0.8397 | 0.9285 | 0.9244 | 0.9259 |
| 0.0044 | 42.0 | 2814 | 0.9644 | 0.9690 | 0.9434 | 0.9560 | 0.8016 | 0.8757 | 0.8370 | 0.6994 | 0.8131 | 0.752 | 0.7879 | 0.8507 | 0.8181 | 0.9249 | 0.8145 | 0.8707 | 0.8408 | 0.9292 | 0.9249 | 0.9265 |
| 0.0044 | 43.0 | 2881 | 0.9738 | 0.9685 | 0.9453 | 0.9568 | 0.8060 | 0.8632 | 0.8336 | 0.7082 | 0.8062 | 0.7540 | 0.7790 | 0.8507 | 0.8133 | 0.9250 | 0.8154 | 0.8664 | 0.8394 | 0.9289 | 0.9250 | 0.9265 |
| 0.0044 | 44.0 | 2948 | 0.9369 | 0.9707 | 0.9411 | 0.9556 | 0.7771 | 0.8917 | 0.8304 | 0.6967 | 0.8028 | 0.7460 | 0.7985 | 0.8507 | 0.8238 | 0.9240 | 0.8107 | 0.8715 | 0.8390 | 0.9291 | 0.9240 | 0.9258 |
| 0.0026 | 45.0 | 3015 | 0.9617 | 0.9702 | 0.9430 | 0.9564 | 0.8016 | 0.8757 | 0.8370 | 0.7147 | 0.8062 | 0.7577 | 0.7784 | 0.8691 | 0.8213 | 0.9256 | 0.8162 | 0.8735 | 0.8431 | 0.9301 | 0.9256 | 0.9273 |
| 0.0026 | 46.0 | 3082 | 0.9485 | 0.9700 | 0.9445 | 0.9571 | 0.7961 | 0.8810 | 0.8364 | 0.7087 | 0.8166 | 0.7588 | 0.7950 | 0.8487 | 0.8210 | 0.9262 | 0.8175 | 0.8727 | 0.8433 | 0.9305 | 0.9262 | 0.9278 |
| 0.0026 | 47.0 | 3149 | 0.9604 | 0.9697 | 0.9453 | 0.9573 | 0.7971 | 0.8792 | 0.8361 | 0.7033 | 0.8201 | 0.7572 | 0.8016 | 0.8425 | 0.8215 | 0.9264 | 0.8179 | 0.8718 | 0.8430 | 0.9305 | 0.9264 | 0.9279 |
| 0.0026 | 48.0 | 3216 | 0.9488 | 0.9700 | 0.9432 | 0.9564 | 0.8029 | 0.8757 | 0.8377 | 0.6848 | 0.8270 | 0.7492 | 0.7912 | 0.8446 | 0.8170 | 0.9249 | 0.8122 | 0.8726 | 0.8401 | 0.9297 | 0.9249 | 0.9266 |
| 0.0026 | 49.0 | 3283 | 0.9857 | 0.9692 | 0.9434 | 0.9561 | 0.8 | 0.8810 | 0.8385 | 0.7064 | 0.7993 | 0.75 | 0.7861 | 0.8569 | 0.8200 | 0.9252 | 0.8154 | 0.8701 | 0.8412 | 0.9294 | 0.9252 | 0.9267 |
| 0.0026 | 50.0 | 3350 | 1.0184 | 0.9674 | 0.9470 | 0.9571 | 0.8076 | 0.8721 | 0.8386 | 0.7090 | 0.7924 | 0.7484 | 0.7981 | 0.8487 | 0.8226 | 0.9264 | 0.8205 | 0.8650 | 0.8417 | 0.9296 | 0.9264 | 0.9277 |
| 0.0026 | 51.0 | 3417 | 0.9627 | 0.9698 | 0.9422 | 0.9558 | 0.8 | 0.8810 | 0.8385 | 0.6921 | 0.8166 | 0.7492 | 0.7898 | 0.8528 | 0.8201 | 0.9247 | 0.8129 | 0.8731 | 0.8409 | 0.9295 | 0.9247 | 0.9265 |
| 0.0026 | 52.0 | 3484 | 0.9409 | 0.9710 | 0.9397 | 0.9551 | 0.7921 | 0.8863 | 0.8365 | 0.6879 | 0.8235 | 0.7496 | 0.7861 | 0.8569 | 0.8200 | 0.9238 | 0.8093 | 0.8766 | 0.8403 | 0.9293 | 0.9238 | 0.9258 |
| 0.0019 | 53.0 | 3551 | 0.9855 | 0.9687 | 0.9439 | 0.9561 | 0.8003 | 0.8757 | 0.8363 | 0.7009 | 0.8028 | 0.7484 | 0.7951 | 0.8569 | 0.8248 | 0.9253 | 0.8162 | 0.8698 | 0.8414 | 0.9294 | 0.9253 | 0.9269 |
| 0.0019 | 54.0 | 3618 | 0.9728 | 0.9690 | 0.9434 | 0.9560 | 0.7990 | 0.8757 | 0.8356 | 0.6976 | 0.8062 | 0.7480 | 0.7917 | 0.8548 | 0.8220 | 0.9249 | 0.8143 | 0.8700 | 0.8404 | 0.9292 | 0.9249 | 0.9265 |
| 0.0019 | 55.0 | 3685 | 0.9698 | 0.9690 | 0.9434 | 0.9560 | 0.8016 | 0.8757 | 0.8370 | 0.6914 | 0.8062 | 0.7444 | 0.7894 | 0.8507 | 0.8189 | 0.9246 | 0.8129 | 0.8690 | 0.8391 | 0.9290 | 0.9246 | 0.9262 |
| 0.0019 | 56.0 | 3752 | 0.9834 | 0.9687 | 0.9437 | 0.9560 | 0.8013 | 0.8739 | 0.8360 | 0.6988 | 0.8028 | 0.7472 | 0.7921 | 0.8569 | 0.8232 | 0.9250 | 0.8152 | 0.8693 | 0.8406 | 0.9292 | 0.9250 | 0.9266 |
| 0.0019 | 57.0 | 3819 | 0.9646 | 0.9696 | 0.9424 | 0.9558 | 0.7961 | 0.8810 | 0.8364 | 0.6955 | 0.8062 | 0.7468 | 0.7883 | 0.8528 | 0.8193 | 0.9244 | 0.8124 | 0.8706 | 0.8396 | 0.9290 | 0.9244 | 0.9261 |
| 0.0019 | 58.0 | 3886 | 0.9961 | 0.9683 | 0.9449 | 0.9565 | 0.8026 | 0.8739 | 0.8367 | 0.6979 | 0.7993 | 0.7452 | 0.7973 | 0.8528 | 0.8241 | 0.9255 | 0.8165 | 0.8677 | 0.8406 | 0.9294 | 0.9255 | 0.9270 |
| 0.0019 | 59.0 | 3953 | 0.9789 | 0.9692 | 0.9432 | 0.9560 | 0.7971 | 0.8792 | 0.8361 | 0.6988 | 0.8028 | 0.7472 | 0.7917 | 0.8548 | 0.8220 | 0.9249 | 0.8142 | 0.8700 | 0.8403 | 0.9292 | 0.9249 | 0.9265 |
| 0.0013 | 60.0 | 4020 | 0.9802 | 0.9692 | 0.9435 | 0.9562 | 0.7997 | 0.8792 | 0.8376 | 0.6988 | 0.8028 | 0.7472 | 0.7917 | 0.8548 | 0.8220 | 0.9252 | 0.8148 | 0.8701 | 0.8407 | 0.9295 | 0.9252 | 0.9268 |
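
The per-class, macro-average, and weighted-average columns match the layout of scikit-learn's classification_report computed over token-level label ids. The actual metric script is not documented; below is a compatible compute_metrics sketch for the Trainer, assuming the conventional -100 ignore index for padding and sub-word positions.

```python
# Sketch of a compute_metrics function that reproduces the report layout
# above (per-class, accuracy, macro avg, weighted avg); an assumption,
# not the documented evaluation script.
import numpy as np
from sklearn.metrics import classification_report

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)

    # Keep only real tokens; -100 marks padding and sub-word positions.
    mask = labels != -100
    report = classification_report(
        labels[mask], preds[mask], output_dict=True, zero_division=0
    )
    return {
        "accuracy": report["accuracy"],
        "macro_f1": report["macro avg"]["f1-score"],
        "weighted_f1": report["weighted avg"]["f1-score"],
    }
```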

Framework versions

  • Transformers 4.38.2
  • PyTorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2
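
To reproduce this environment, the versions above can be pinned at install time, e.g. pip install transformers==4.38.2 datasets==2.18.0 tokenizers==0.15.2. The +cu121 suffix on the PyTorch version denotes a CUDA 12.1 build, which is published on PyTorch's own wheel index (https://download.pytorch.org/whl/cu121) rather than the default PyPI wheel.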