---
license: cc-by-nc-sa-4.0
tags:
  - generated_from_trainer
base_model: microsoft/layoutlmv2-base-uncased
model-index:
  - name: lmv2-g-dl-243-doc-09-13
    results: []
---

# lmv2-g-dl-243-doc-09-13

This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.2104
- Address Precision: 0.725
- Address Recall: 0.7632
- Address F1: 0.7436
- Address Number: 38
- Blood Group Precision: 0.8636
- Blood Group Recall: 0.8636
- Blood Group F1: 0.8636
- Blood Group Number: 22
- Date Of Issue Precision: 0.9787
- Date Of Issue Recall: 0.92
- Date Of Issue F1: 0.9485
- Date Of Issue Number: 50
- Dob Precision: 1.0
- Dob Recall: 0.9773
- Dob F1: 0.9885
- Dob Number: 44
- Driving Licence No Precision: 0.9796
- Driving Licence No Recall: 1.0
- Driving Licence No F1: 0.9897
- Driving Licence No Number: 48
- Name Precision: 0.9388
- Name Recall: 0.9388
- Name F1: 0.9388
- Name Number: 49
- S D W Name Precision: 0.9388
- S D W Name Recall: 0.9583
- S D W Name F1: 0.9485
- S D W Name Number: 48
- Valid Till Nt Precision: 0.7826
- Valid Till Nt Recall: 0.8780
- Valid Till Nt F1: 0.8276
- Valid Till Nt Number: 41
- Valid Till T Tr Precision: 0.9231
- Valid Till T Tr Recall: 0.8571
- Valid Till T Tr F1: 0.8889
- Valid Till T Tr Number: 14
- Overall Precision: 0.9078
- Overall Recall: 0.9181
- Overall F1: 0.9129
- Overall Accuracy: 0.9763
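
Given the field set above, the model is evidently used for token classification over driving-licence document images. A minimal inference sketch follows; it is not the author's original code, and the repo path `lmv2-g-dl-243-doc-09-13`, the input file name, and the reuse of the base checkpoint's processor (with its built-in OCR) are assumptions:

```python
# Sketch only. Assumes the base checkpoint's processor matches the one used
# for fine-tuning, and that "lmv2-g-dl-243-doc-09-13" stands in for the
# actual model repo id or local path. LayoutLMv2 requires detectron2, and
# the processor's default OCR requires pytesseract.
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForTokenClassification

processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained("lmv2-g-dl-243-doc-09-13")
model.eval()

image = Image.open("driving_licence.png").convert("RGB")  # hypothetical input
encoding = processor(image, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**encoding).logits  # shape: (1, seq_len, num_labels)

# Map each token to its predicted label name.
predicted_ids = logits.argmax(-1).squeeze(0).tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze(0).tolist())
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[label_id])
```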

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 30
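
For reference, a minimal sketch of how these settings map onto `transformers.TrainingArguments`; the `output_dir` is a placeholder, and every option not listed above is assumed to keep its Trainer default:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters reported above.
training_args = TrainingArguments(
    output_dir="lmv2-g-dl-243-doc-09-13",  # placeholder
    learning_rate=4e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,   # epsilon=1e-08
    lr_scheduler_type="constant",
    num_train_epochs=30,
)
```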

### Training results

| Training Loss | Epoch | Step | Validation Loss | Address Precision | Address Recall | Address F1 | Address Number | Blood Group Precision | Blood Group Recall | Blood Group F1 | Blood Group Number | Date Of Issue Precision | Date Of Issue Recall | Date Of Issue F1 | Date Of Issue Number | Dob Precision | Dob Recall | Dob F1 | Dob Number | Driving Licence No Precision | Driving Licence No Recall | Driving Licence No F1 | Driving Licence No Number | Name Precision | Name Recall | Name F1 | Name Number | S D W Name Precision | S D W Name Recall | S D W Name F1 | S D W Name Number | Valid Till Nt Precision | Valid Till Nt Recall | Valid Till Nt F1 | Valid Till Nt Number | Valid Till T Tr Precision | Valid Till T Tr Recall | Valid Till T Tr F1 | Valid Till T Tr Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1.9194 | 1.0 | 194 | 1.3597 | 0.3099 | 0.5789 | 0.4037 | 38 | 0.0 | 0.0 | 0.0 | 22 | 0.3147 | 0.9 | 0.4663 | 50 | 0.0 | 0.0 | 0.0 | 44 | 0.1059 | 0.1875 | 0.1353 | 48 | 0.0246 | 0.0612 | 0.0351 | 49 | 0.0 | 0.0 | 0.0 | 48 | 0.0 | 0.0 | 0.0 | 41 | 0.0 | 0.0 | 0.0 | 14 | 0.1876 | 0.2232 | 0.2039 | 0.8582 |
| 1.099 | 2.0 | 388 | 0.7767 | 0.5333 | 0.6316 | 0.5783 | 38 | 0.0 | 0.0 | 0.0 | 22 | 0.4667 | 0.98 | 0.6323 | 50 | 0.9773 | 0.9773 | 0.9773 | 44 | 0.9057 | 1.0 | 0.9505 | 48 | 0.3070 | 0.7143 | 0.4294 | 49 | 0.0 | 0.0 | 0.0 | 48 | 0.8182 | 0.2195 | 0.3462 | 41 | 0.0 | 0.0 | 0.0 | 14 | 0.5591 | 0.5876 | 0.5730 | 0.9016 |
| 0.6398 | 3.0 | 582 | 0.4892 | 0.5532 | 0.6842 | 0.6118 | 38 | 0.0 | 0.0 | 0.0 | 22 | 1.0 | 0.94 | 0.9691 | 50 | 1.0 | 0.9091 | 0.9524 | 44 | 0.9412 | 1.0 | 0.9697 | 48 | 0.5538 | 0.7347 | 0.6316 | 49 | 0.5714 | 0.75 | 0.6486 | 48 | 0.6863 | 0.8537 | 0.7609 | 41 | 0.0 | 0.0 | 0.0 | 14 | 0.7363 | 0.7571 | 0.7465 | 0.9491 |
| 0.4251 | 4.0 | 776 | 0.3770 | 0.4364 | 0.6316 | 0.5161 | 38 | 0.4706 | 0.3636 | 0.4103 | 22 | 1.0 | 0.9 | 0.9474 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9412 | 1.0 | 0.9697 | 48 | 0.92 | 0.9388 | 0.9293 | 49 | 0.9333 | 0.875 | 0.9032 | 48 | 0.6182 | 0.8293 | 0.7083 | 41 | 0.0 | 0.0 | 0.0 | 14 | 0.8033 | 0.8192 | 0.8112 | 0.9557 |
| 0.304 | 5.0 | 970 | 0.2942 | 0.65 | 0.6842 | 0.6667 | 38 | 0.4643 | 0.5909 | 0.52 | 22 | 0.9796 | 0.96 | 0.9697 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.8727 | 1.0 | 0.9320 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9388 | 0.9583 | 0.9485 | 48 | 0.85 | 0.8293 | 0.8395 | 41 | 0.8 | 0.2857 | 0.4211 | 14 | 0.8603 | 0.8701 | 0.8652 | 0.9627 |
| 0.2297 | 6.0 | 1164 | 0.2497 | 0.5854 | 0.6316 | 0.6076 | 38 | 0.5294 | 0.8182 | 0.6429 | 22 | 1.0 | 0.92 | 0.9583 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9231 | 1.0 | 0.9600 | 48 | 0.9149 | 0.8776 | 0.8958 | 49 | 0.9184 | 0.9375 | 0.9278 | 48 | 0.8462 | 0.8049 | 0.8250 | 41 | 0.7692 | 0.7143 | 0.7407 | 14 | 0.8516 | 0.8757 | 0.8635 | 0.9679 |
| 0.1843 | 7.0 | 1358 | 0.2123 | 0.54 | 0.7105 | 0.6136 | 38 | 0.7778 | 0.9545 | 0.8571 | 22 | 0.9792 | 0.94 | 0.9592 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9348 | 0.8776 | 0.9053 | 49 | 0.9020 | 0.9583 | 0.9293 | 48 | 0.7609 | 0.8537 | 0.8046 | 41 | 0.8 | 0.5714 | 0.6667 | 14 | 0.8595 | 0.8983 | 0.8785 | 0.9696 |
| 0.1455 | 8.0 | 1552 | 0.2166 | 0.6316 | 0.6316 | 0.6316 | 38 | 0.8077 | 0.9545 | 0.875 | 22 | 1.0 | 0.94 | 0.9691 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9216 | 0.9792 | 0.9495 | 48 | 0.9167 | 0.8980 | 0.9072 | 49 | 0.9167 | 0.9167 | 0.9167 | 48 | 0.7826 | 0.8780 | 0.8276 | 41 | 0.9286 | 0.9286 | 0.9286 | 14 | 0.8837 | 0.9011 | 0.8923 | 0.9696 |
| 0.1262 | 9.0 | 1746 | 0.2060 | 0.5333 | 0.6316 | 0.5783 | 38 | 0.8333 | 0.9091 | 0.8696 | 22 | 0.96 | 0.96 | 0.96 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9216 | 0.9792 | 0.9495 | 48 | 0.9 | 0.9184 | 0.9091 | 49 | 0.9184 | 0.9375 | 0.9278 | 48 | 0.7826 | 0.8780 | 0.8276 | 41 | 0.5185 | 1.0 | 0.6829 | 14 | 0.8364 | 0.9096 | 0.8714 | 0.9661 |
| 0.109 | 10.0 | 1940 | 0.1912 | 0.6154 | 0.6316 | 0.6234 | 38 | 0.7692 | 0.9091 | 0.8333 | 22 | 0.9792 | 0.94 | 0.9592 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9362 | 0.9167 | 0.9263 | 48 | 0.8537 | 0.8537 | 0.8537 | 41 | 0.9333 | 1.0 | 0.9655 | 14 | 0.8992 | 0.9068 | 0.9030 | 0.9725 |
| 0.0911 | 11.0 | 2134 | 0.2063 | 0.5897 | 0.6053 | 0.5974 | 38 | 0.8 | 0.9091 | 0.8511 | 22 | 0.9412 | 0.96 | 0.9505 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9412 | 1.0 | 0.9697 | 48 | 0.9184 | 0.9184 | 0.9184 | 49 | 0.9375 | 0.9375 | 0.9375 | 48 | 0.8 | 0.8780 | 0.8372 | 41 | 0.8235 | 1.0 | 0.9032 | 14 | 0.875 | 0.9096 | 0.8920 | 0.9690 |
| 0.0771 | 12.0 | 2328 | 0.2262 | 0.525 | 0.5526 | 0.5385 | 38 | 0.8333 | 0.9091 | 0.8696 | 22 | 1.0 | 0.98 | 0.9899 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9184 | 0.9375 | 0.9278 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9375 | 0.9375 | 0.9375 | 48 | 0.8537 | 0.8537 | 0.8537 | 41 | 0.8571 | 0.8571 | 0.8571 | 14 | 0.8852 | 0.8927 | 0.8889 | 0.9638 |
| 0.0753 | 13.0 | 2522 | 0.2170 | 0.5714 | 0.6316 | 0.6 | 38 | 0.8 | 0.9091 | 0.8511 | 22 | 0.9796 | 0.96 | 0.9697 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9412 | 1.0 | 0.9697 | 48 | 0.9375 | 0.9184 | 0.9278 | 49 | 0.9574 | 0.9375 | 0.9474 | 48 | 0.875 | 0.8537 | 0.8642 | 41 | 0.7059 | 0.8571 | 0.7742 | 14 | 0.8840 | 0.9040 | 0.8939 | 0.9673 |
| 0.0676 | 14.0 | 2716 | 0.2148 | 0.5610 | 0.6053 | 0.5823 | 38 | 0.8261 | 0.8636 | 0.8444 | 22 | 0.9245 | 0.98 | 0.9515 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.96 | 1.0 | 0.9796 | 48 | 0.9362 | 0.8980 | 0.9167 | 49 | 0.9130 | 0.875 | 0.8936 | 48 | 0.8182 | 0.8780 | 0.8471 | 41 | 0.9333 | 1.0 | 0.9655 | 14 | 0.8785 | 0.8983 | 0.8883 | 0.9670 |
| 0.0588 | 15.0 | 2910 | 0.2140 | 0.65 | 0.6842 | 0.6667 | 38 | 0.7241 | 0.9545 | 0.8235 | 22 | 0.9792 | 0.94 | 0.9592 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9412 | 1.0 | 0.9697 | 48 | 0.8824 | 0.9184 | 0.9 | 49 | 0.9130 | 0.875 | 0.8936 | 48 | 0.8 | 0.8780 | 0.8372 | 41 | 0.9286 | 0.9286 | 0.9286 | 14 | 0.8747 | 0.9068 | 0.8904 | 0.9690 |
| 0.0592 | 16.0 | 3104 | 0.2353 | 0.6410 | 0.6579 | 0.6494 | 38 | 0.75 | 0.9545 | 0.84 | 22 | 0.9767 | 0.84 | 0.9032 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9362 | 0.8980 | 0.9167 | 49 | 0.9574 | 0.9375 | 0.9474 | 48 | 0.8571 | 0.8780 | 0.8675 | 41 | 0.9333 | 1.0 | 0.9655 | 14 | 0.9008 | 0.8983 | 0.8996 | 0.9664 |
| 0.0461 | 17.0 | 3298 | 0.2137 | 0.5714 | 0.6316 | 0.6 | 38 | 0.7143 | 0.9091 | 0.8 | 22 | 0.9057 | 0.96 | 0.9320 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9184 | 0.9375 | 0.9278 | 48 | 0.8571 | 0.8780 | 0.8675 | 41 | 0.7368 | 1.0 | 0.8485 | 14 | 0.8663 | 0.9153 | 0.8901 | 0.9685 |
| 0.0424 | 18.0 | 3492 | 0.2057 | 0.5610 | 0.6053 | 0.5823 | 38 | 0.84 | 0.9545 | 0.8936 | 22 | 0.9423 | 0.98 | 0.9608 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9565 | 0.9167 | 0.9362 | 48 | 0.8095 | 0.8293 | 0.8193 | 41 | 0.8667 | 0.9286 | 0.8966 | 14 | 0.8867 | 0.9068 | 0.8966 | 0.9708 |
| 0.0389 | 19.0 | 3686 | 0.2400 | 0.6098 | 0.6579 | 0.6329 | 38 | 0.8 | 0.9091 | 0.8511 | 22 | 0.9574 | 0.9 | 0.9278 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.86 | 0.8776 | 0.8687 | 49 | 0.8980 | 0.9167 | 0.9072 | 48 | 0.8537 | 0.8537 | 0.8537 | 41 | 0.9167 | 0.7857 | 0.8462 | 14 | 0.8796 | 0.8870 | 0.8833 | 0.9670 |
| 0.0375 | 20.0 | 3880 | 0.2258 | 0.6341 | 0.6842 | 0.6582 | 38 | 0.8636 | 0.8636 | 0.8636 | 22 | 1.0 | 0.96 | 0.9796 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9565 | 0.9167 | 0.9362 | 48 | 0.8718 | 0.8293 | 0.8500 | 41 | 0.75 | 0.8571 | 0.8000 | 14 | 0.9065 | 0.9040 | 0.9052 | 0.9693 |
| 0.036 | 21.0 | 4074 | 0.2686 | 0.5952 | 0.6579 | 0.625 | 38 | 0.7778 | 0.9545 | 0.8571 | 22 | 1.0 | 0.96 | 0.9796 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9362 | 0.8980 | 0.9167 | 49 | 0.9565 | 0.9167 | 0.9362 | 48 | 0.8684 | 0.8049 | 0.8354 | 41 | 0.7059 | 0.8571 | 0.7742 | 14 | 0.8908 | 0.8983 | 0.8945 | 0.9644 |
| 0.0321 | 22.0 | 4268 | 0.2102 | 0.6923 | 0.7105 | 0.7013 | 38 | 0.7778 | 0.9545 | 0.8571 | 22 | 0.96 | 0.96 | 0.96 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.92 | 0.9388 | 0.9293 | 49 | 0.9362 | 0.9167 | 0.9263 | 48 | 0.8182 | 0.8780 | 0.8471 | 41 | 0.8571 | 0.8571 | 0.8571 | 14 | 0.8953 | 0.9181 | 0.9066 | 0.9722 |
| 0.0244 | 23.0 | 4462 | 0.2432 | 0.6279 | 0.7105 | 0.6667 | 38 | 0.8 | 0.9091 | 0.8511 | 22 | 1.0 | 0.86 | 0.9247 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9362 | 0.9167 | 0.9263 | 48 | 0.7955 | 0.8537 | 0.8235 | 41 | 0.9091 | 0.7143 | 0.8 | 14 | 0.8927 | 0.8927 | 0.8927 | 0.9676 |
| 0.0217 | 24.0 | 4656 | 0.2290 | 0.6923 | 0.7105 | 0.7013 | 38 | 0.8261 | 0.8636 | 0.8444 | 22 | 1.0 | 0.92 | 0.9583 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9375 | 0.9375 | 0.9375 | 48 | 0.8333 | 0.8537 | 0.8434 | 41 | 0.9167 | 0.7857 | 0.8462 | 14 | 0.9117 | 0.9040 | 0.9078 | 0.9728 |
| 0.0215 | 25.0 | 4850 | 0.2677 | 0.5897 | 0.6053 | 0.5974 | 38 | 0.7778 | 0.9545 | 0.8571 | 22 | 0.9592 | 0.94 | 0.9495 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9375 | 0.9375 | 0.9375 | 48 | 0.7955 | 0.8537 | 0.8235 | 41 | 0.7 | 1.0 | 0.8235 | 14 | 0.875 | 0.9096 | 0.8920 | 0.9676 |
| 0.0258 | 26.0 | 5044 | 0.2356 | 0.6341 | 0.6842 | 0.6582 | 38 | 0.7692 | 0.9091 | 0.8333 | 22 | 0.9796 | 0.96 | 0.9697 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.8980 | 0.9167 | 0.9072 | 48 | 0.7727 | 0.8293 | 0.8000 | 41 | 0.6923 | 0.6429 | 0.6667 | 14 | 0.8760 | 0.8983 | 0.8870 | 0.9696 |
| 0.0191 | 27.0 | 5238 | 0.2115 | 0.625 | 0.6579 | 0.6410 | 38 | 0.8696 | 0.9091 | 0.8889 | 22 | 0.9792 | 0.94 | 0.9592 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9375 | 0.9375 | 0.9375 | 48 | 0.8571 | 0.8780 | 0.8675 | 41 | 0.8 | 0.8571 | 0.8276 | 14 | 0.9020 | 0.9096 | 0.9058 | 0.9751 |
| 0.0238 | 28.0 | 5432 | 0.2104 | 0.725 | 0.7632 | 0.7436 | 38 | 0.8636 | 0.8636 | 0.8636 | 22 | 0.9787 | 0.92 | 0.9485 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9388 | 0.9583 | 0.9485 | 48 | 0.7826 | 0.8780 | 0.8276 | 41 | 0.9231 | 0.8571 | 0.8889 | 14 | 0.9078 | 0.9181 | 0.9129 | 0.9763 |
| 0.0282 | 29.0 | 5626 | 0.2352 | 0.5238 | 0.5789 | 0.5500 | 38 | 0.8696 | 0.9091 | 0.8889 | 22 | 0.9778 | 0.88 | 0.9263 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9362 | 0.9167 | 0.9263 | 48 | 0.7609 | 0.8537 | 0.8046 | 41 | 0.75 | 0.8571 | 0.8000 | 14 | 0.8722 | 0.8870 | 0.8796 | 0.9705 |
| 0.0157 | 30.0 | 5820 | 0.2614 | 0.6190 | 0.6842 | 0.6500 | 38 | 0.84 | 0.9545 | 0.8936 | 22 | 1.0 | 0.92 | 0.9583 | 50 | 1.0 | 0.9773 | 0.9885 | 44 | 0.9796 | 1.0 | 0.9897 | 48 | 0.9388 | 0.9388 | 0.9388 | 49 | 0.9583 | 0.9583 | 0.9583 | 48 | 0.72 | 0.8780 | 0.7912 | 41 | 0.7059 | 0.8571 | 0.7742 | 14 | 0.8780 | 0.9153 | 0.8963 | 0.9688 |
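
The per-entity Precision/Recall/F1/Number columns match the entity-level output of the `seqeval` library as used in standard token-classification fine-tuning scripts, with "Number" being each entity's support count; that this run used `seqeval` is an assumption. A minimal sketch of how such metrics are computed:

```python
# Sketch: entity-level metrics with seqeval (assumed, not confirmed by this card).
# The BIO tags below are hypothetical labels for two of the fields above.
from seqeval.metrics import classification_report

y_true = [["B-DOB", "I-DOB", "O", "B-NAME", "I-NAME"]]
y_pred = [["B-DOB", "I-DOB", "O", "B-NAME", "O"]]

# Prints per-entity precision/recall/F1 plus support (the "Number" column).
print(classification_report(y_true, y_pred))
```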

### Framework versions

- Transformers 4.22.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1