---
license: cc-by-nc-sa-4.0
tags:
  - generated_from_trainer
model-index:
  - name: lmv2-g-recp-992-doc-09-09
    results: []
---

# lmv2-g-recp-992-doc-09-09

This model is a fine-tuned version of [microsoft/layoutlmv2-base-uncased](https://huggingface.co/microsoft/layoutlmv2-base-uncased) on an unspecified dataset. It achieves the following results on the evaluation set (a hedged inference sketch follows the metric list):

- Loss: 0.2241
- Purchase Time Precision: 0.872
- Purchase Time Recall: 0.8516
- Purchase Time F1: 0.8617
- Purchase Time Number: 128
- Receipt Date Precision: 0.8713
- Receipt Date Recall: 0.8817
- Receipt Date F1: 0.8765
- Receipt Date Number: 169
- Sub Total Precision: 0.8211
- Sub Total Recall: 0.7091
- Sub Total F1: 0.7610
- Sub Total Number: 110
- Supplier Address Precision: 0.7009
- Supplier Address Recall: 0.7193
- Supplier Address F1: 0.7100
- Supplier Address Number: 114
- Supplier Name Precision: 0.7442
- Supplier Name Recall: 0.7191
- Supplier Name F1: 0.7314
- Supplier Name Number: 267
- Tip Amount Precision: 0.6667
- Tip Amount Recall: 1.0
- Tip Amount F1: 0.8
- Tip Amount Number: 2
- Total Precision: 0.8436
- Total Recall: 0.8251
- Total F1: 0.8343
- Total Number: 183
- Total Tax Amount Precision: 0.8361
- Total Tax Amount Recall: 0.7846
- Total Tax Amount F1: 0.8095
- Total Tax Amount Number: 65
- Overall Precision: 0.8067
- Overall Recall: 0.7842
- Overall F1: 0.7953
- Overall Accuracy: 0.9728
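
The sketch below shows one plausible way to run token-classification inference with this checkpoint. It is a minimal sketch, not the author's confirmed usage: the repository id `Sebabrata/lmv2-g-recp-992-doc-09-09` and the input path `receipt.png` are assumptions, and it assumes Tesseract (for the processor's built-in OCR) and detectron2 (for the visual backbone) are installed.

```python
# A minimal inference sketch; checkpoint id and image path are placeholders.
import torch
from PIL import Image
from transformers import LayoutLMv2ForTokenClassification, LayoutLMv2Processor

processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained("Sebabrata/lmv2-g-recp-992-doc-09-09")

image = Image.open("receipt.png").convert("RGB")  # hypothetical receipt scan
encoding = processor(image, return_tensors="pt")  # OCR + tokenization + layout boxes

with torch.no_grad():
    logits = model(**encoding).logits  # shape: (1, seq_len, num_labels)

pred_ids = logits.argmax(-1).squeeze().tolist()
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"][0])
for token, pred in zip(tokens, pred_ids):
    print(token, model.config.id2label[pred])
```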

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training; a sketch of an equivalent `TrainingArguments` configuration follows the list:

- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 30
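
As a hedged sketch (not the author's actual training script), these values map onto 🤗 `TrainingArguments` roughly as follows; `output_dir` is a placeholder and any argument not listed above keeps its library default.

```python
# A hedged sketch mapping the listed hyperparameters onto TrainingArguments;
# "output_dir" is a placeholder and unlisted arguments keep their defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lmv2-g-recp-992-doc-09-09",  # placeholder
    learning_rate=4e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=30,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```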

### Training results

| Training Loss | Epoch | Step | Validation Loss | Purchase Time Precision | Purchase Time Recall | Purchase Time F1 | Purchase Time Number | Receipt Date Precision | Receipt Date Recall | Receipt Date F1 | Receipt Date Number | Sub Total Precision | Sub Total Recall | Sub Total F1 | Sub Total Number | Supplier Address Precision | Supplier Address Recall | Supplier Address F1 | Supplier Address Number | Supplier Name Precision | Supplier Name Recall | Supplier Name F1 | Supplier Name Number | Tip Amount Precision | Tip Amount Recall | Tip Amount F1 | Tip Amount Number | Total Precision | Total Recall | Total F1 | Total Number | Total Tax Amount Precision | Total Tax Amount Recall | Total Tax Amount F1 | Total Tax Amount Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.9017 | 1.0 | 793 | 0.3748 | 0.0 | 0.0 | 0.0 | 128 | 0.5 | 0.0710 | 0.1244 | 169 | 0.0 | 0.0 | 0.0 | 110 | 0.4632 | 0.5526 | 0.504 | 114 | 0.3724 | 0.2022 | 0.2621 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.7387 | 0.4481 | 0.5578 | 183 | 0.0 | 0.0 | 0.0 | 65 | 0.4637 | 0.2033 | 0.2827 | 0.9330 |
| 0.2651 | 2.0 | 1586 | 0.2025 | 0.8 | 0.8438 | 0.8213 | 128 | 0.8274 | 0.8225 | 0.8249 | 169 | 0.4 | 0.0182 | 0.0348 | 110 | 0.5329 | 0.7105 | 0.6090 | 114 | 0.5886 | 0.6592 | 0.6219 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.5720 | 0.8470 | 0.6828 | 183 | 1.0 | 0.0308 | 0.0597 | 65 | 0.6424 | 0.6387 | 0.6406 | 0.9624 |
| 0.1403 | 3.0 | 2379 | 0.1585 | 0.8248 | 0.8828 | 0.8528 | 128 | 0.7897 | 0.9112 | 0.8462 | 169 | 0.7054 | 0.7182 | 0.7117 | 110 | 0.5931 | 0.7544 | 0.6641 | 114 | 0.6288 | 0.6217 | 0.6252 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.7877 | 0.7705 | 0.7790 | 183 | 0.8276 | 0.7385 | 0.7805 | 65 | 0.7220 | 0.7582 | 0.7397 | 0.9683 |
| 0.0935 | 4.0 | 3172 | 0.1771 | 0.7891 | 0.7891 | 0.7891 | 128 | 0.6474 | 0.7278 | 0.6852 | 169 | 0.8205 | 0.5818 | 0.6809 | 110 | 0.6074 | 0.7193 | 0.6586 | 114 | 0.6548 | 0.6891 | 0.6715 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.8476 | 0.7596 | 0.8012 | 183 | 0.75 | 0.2308 | 0.3529 | 65 | 0.7108 | 0.6821 | 0.6962 | 0.9648 |
| 0.0684 | 5.0 | 3965 | 0.1552 | 0.9237 | 0.8516 | 0.8862 | 128 | 0.8362 | 0.8757 | 0.8555 | 169 | 0.7629 | 0.6727 | 0.7150 | 110 | 0.6029 | 0.7193 | 0.6560 | 114 | 0.7167 | 0.6442 | 0.6785 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.8128 | 0.8306 | 0.8216 | 183 | 0.7937 | 0.7692 | 0.7813 | 65 | 0.7731 | 0.7582 | 0.7656 | 0.9696 |
| 0.0491 | 6.0 | 4758 | 0.1702 | 0.8760 | 0.8828 | 0.8794 | 128 | 0.8352 | 0.8698 | 0.8522 | 169 | 0.8056 | 0.7909 | 0.7982 | 110 | 0.5894 | 0.7807 | 0.6717 | 114 | 0.6844 | 0.6742 | 0.6792 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.8778 | 0.8634 | 0.8705 | 183 | 0.9074 | 0.7538 | 0.8235 | 65 | 0.7757 | 0.7929 | 0.7842 | 0.9703 |
| 0.0472 | 7.0 | 5551 | 0.2037 | 0.8952 | 0.8672 | 0.8810 | 128 | 0.8876 | 0.8876 | 0.8876 | 169 | 0.8 | 0.7273 | 0.7619 | 110 | 0.6557 | 0.7018 | 0.6780 | 114 | 0.7953 | 0.6404 | 0.7095 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.8795 | 0.7978 | 0.8367 | 183 | 0.9394 | 0.4769 | 0.6327 | 65 | 0.8278 | 0.7408 | 0.7819 | 0.9701 |
| 0.0361 | 8.0 | 6344 | 0.1862 | 0.875 | 0.8203 | 0.8468 | 128 | 0.7978 | 0.8402 | 0.8184 | 169 | 0.7739 | 0.8091 | 0.7911 | 110 | 0.6512 | 0.7368 | 0.6914 | 114 | 0.6906 | 0.6854 | 0.6880 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.8486 | 0.8579 | 0.8533 | 183 | 0.6780 | 0.6154 | 0.6452 | 65 | 0.7612 | 0.7707 | 0.7659 | 0.9701 |
| 0.0318 | 9.0 | 7137 | 0.1889 | 0.9 | 0.8438 | 0.8710 | 128 | 0.8743 | 0.8639 | 0.8690 | 169 | 0.875 | 0.6364 | 0.7368 | 110 | 0.6417 | 0.6754 | 0.6581 | 114 | 0.6914 | 0.6966 | 0.6940 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.7833 | 0.8689 | 0.8238 | 183 | 0.7797 | 0.7077 | 0.7419 | 65 | 0.7772 | 0.7630 | 0.7701 | 0.9697 |
| 0.3481 | 10.0 | 7930 | 0.7581 | 0.0 | 0.0 | 0.0 | 128 | 0.0 | 0.0 | 0.0 | 169 | 0.0 | 0.0 | 0.0 | 110 | 0.0 | 0.0 | 0.0 | 114 | 0.0 | 0.0 | 0.0 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 183 | 0.0 | 0.0 | 0.0 | 65 | 0.0 | 0.0 | 0.0 | 0.8967 |
| 0.7157 | 11.0 | 8723 | 0.7634 | 0.0 | 0.0 | 0.0 | 128 | 0.0 | 0.0 | 0.0 | 169 | 0.0 | 0.0 | 0.0 | 110 | 0.0 | 0.0 | 0.0 | 114 | 0.0 | 0.0 | 0.0 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 183 | 0.0 | 0.0 | 0.0 | 65 | 0.0 | 0.0 | 0.0 | 0.8967 |
| 0.7136 | 12.0 | 9516 | 0.7611 | 0.0 | 0.0 | 0.0 | 128 | 0.0 | 0.0 | 0.0 | 169 | 0.0 | 0.0 | 0.0 | 110 | 0.0 | 0.0 | 0.0 | 114 | 0.0 | 0.0 | 0.0 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 183 | 0.0 | 0.0 | 0.0 | 65 | 0.0 | 0.0 | 0.0 | 0.8967 |
| 0.1095 | 13.0 | 10309 | 0.1744 | 0.8284 | 0.8672 | 0.8473 | 128 | 0.8531 | 0.8935 | 0.8728 | 169 | 0.7717 | 0.6455 | 0.7030 | 110 | 0.5662 | 0.6754 | 0.6160 | 114 | 0.6424 | 0.6929 | 0.6667 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.8211 | 0.8525 | 0.8365 | 183 | 0.8214 | 0.7077 | 0.7603 | 65 | 0.7428 | 0.7678 | 0.7551 | 0.9698 |
| 0.0316 | 14.0 | 11102 | 0.1812 | 0.8943 | 0.8594 | 0.8765 | 128 | 0.8409 | 0.8757 | 0.8580 | 169 | 0.8415 | 0.6273 | 0.7188 | 110 | 0.5714 | 0.6667 | 0.6154 | 114 | 0.6279 | 0.7079 | 0.6655 | 267 | 1.0 | 0.5 | 0.6667 | 2 | 0.8256 | 0.8798 | 0.8519 | 183 | 0.8136 | 0.7385 | 0.7742 | 65 | 0.7495 | 0.7726 | 0.7609 | 0.9703 |
| 0.0226 | 15.0 | 11895 | 0.2132 | 0.8843 | 0.8359 | 0.8594 | 128 | 0.8476 | 0.8225 | 0.8348 | 169 | 0.7525 | 0.6909 | 0.7204 | 110 | 0.5804 | 0.7281 | 0.6459 | 114 | 0.6679 | 0.6929 | 0.6801 | 267 | 0.2 | 0.5 | 0.2857 | 2 | 0.8571 | 0.8525 | 0.8548 | 183 | 0.4835 | 0.6769 | 0.5641 | 65 | 0.7297 | 0.7620 | 0.7455 | 0.9672 |
| 0.0241 | 16.0 | 12688 | 0.1962 | 0.8984 | 0.8984 | 0.8984 | 128 | 0.8613 | 0.8817 | 0.8713 | 169 | 0.6615 | 0.7818 | 0.7167 | 110 | 0.6 | 0.7368 | 0.6614 | 114 | 0.6431 | 0.7154 | 0.6773 | 267 | 0.0833 | 0.5 | 0.1429 | 2 | 0.8795 | 0.7978 | 0.8367 | 183 | 0.7727 | 0.7846 | 0.7786 | 65 | 0.7401 | 0.7929 | 0.7656 | 0.9709 |
| 0.0155 | 17.0 | 13481 | 0.1995 | 0.8906 | 0.8906 | 0.8906 | 128 | 0.8678 | 0.8935 | 0.8805 | 169 | 0.7438 | 0.8182 | 0.7792 | 110 | 0.6042 | 0.7632 | 0.6744 | 114 | 0.6193 | 0.7678 | 0.6856 | 267 | 1.0 | 0.5 | 0.6667 | 2 | 0.8325 | 0.8689 | 0.8503 | 183 | 0.8644 | 0.7846 | 0.8226 | 65 | 0.7467 | 0.8266 | 0.7846 | 0.9696 |
| 0.0165 | 18.0 | 14274 | 0.2402 | 0.8966 | 0.8125 | 0.8525 | 128 | 0.8293 | 0.8047 | 0.8168 | 169 | 0.8118 | 0.6273 | 0.7077 | 110 | 0.5766 | 0.6930 | 0.6295 | 114 | 0.7220 | 0.6517 | 0.6850 | 267 | 1.0 | 1.0 | 1.0 | 2 | 0.8603 | 0.8415 | 0.8508 | 183 | 0.7826 | 0.5538 | 0.6486 | 65 | 0.7773 | 0.7264 | 0.7510 | 0.9683 |
| 0.0721 | 19.0 | 15067 | 0.2718 | 0.3506 | 0.6328 | 0.4513 | 128 | 0.7268 | 0.7870 | 0.7557 | 169 | 0.7742 | 0.4364 | 0.5581 | 110 | 0.5271 | 0.5965 | 0.5597 | 114 | 0.5294 | 0.5056 | 0.5172 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.7526 | 0.7978 | 0.7745 | 183 | 0.7414 | 0.6615 | 0.6992 | 65 | 0.5881 | 0.6301 | 0.6084 | 0.9564 |
| 0.0136 | 20.0 | 15860 | 0.2213 | 0.8651 | 0.8516 | 0.8583 | 128 | 0.8555 | 0.8757 | 0.8655 | 169 | 0.8191 | 0.7 | 0.7549 | 110 | 0.6103 | 0.7281 | 0.664 | 114 | 0.6977 | 0.6742 | 0.6857 | 267 | 1.0 | 0.5 | 0.6667 | 2 | 0.8571 | 0.8197 | 0.8380 | 183 | 0.7656 | 0.7538 | 0.7597 | 65 | 0.7760 | 0.7678 | 0.7719 | 0.9697 |
| 0.0111 | 21.0 | 16653 | 0.2241 | 0.872 | 0.8516 | 0.8617 | 128 | 0.8713 | 0.8817 | 0.8765 | 169 | 0.8211 | 0.7091 | 0.7610 | 110 | 0.7009 | 0.7193 | 0.7100 | 114 | 0.7442 | 0.7191 | 0.7314 | 267 | 0.6667 | 1.0 | 0.8 | 2 | 0.8436 | 0.8251 | 0.8343 | 183 | 0.8361 | 0.7846 | 0.8095 | 65 | 0.8067 | 0.7842 | 0.7953 | 0.9728 |
| 0.011 | 22.0 | 17446 | 0.2206 | 0.7770 | 0.8984 | 0.8333 | 128 | 0.8270 | 0.9053 | 0.8644 | 169 | 0.8586 | 0.7727 | 0.8134 | 110 | 0.5985 | 0.6930 | 0.6423 | 114 | 0.6618 | 0.6742 | 0.6679 | 267 | 0.0 | 0.0 | 0.0 | 2 | 0.8870 | 0.8579 | 0.8722 | 183 | 0.7391 | 0.7846 | 0.7612 | 65 | 0.7579 | 0.7900 | 0.7736 | 0.9697 |
| 0.0104 | 23.0 | 18239 | 0.2571 | 0.9310 | 0.8438 | 0.8852 | 128 | 0.875 | 0.8698 | 0.8724 | 169 | 0.8316 | 0.7182 | 0.7707 | 110 | 0.6417 | 0.6754 | 0.6581 | 114 | 0.7386 | 0.6667 | 0.7008 | 267 | 0.1429 | 0.5 | 0.2222 | 2 | 0.8579 | 0.8579 | 0.8579 | 183 | 0.7812 | 0.7692 | 0.7752 | 65 | 0.8018 | 0.7678 | 0.7844 | 0.9705 |
| 0.0132 | 24.0 | 19032 | 0.2252 | 0.8810 | 0.8672 | 0.8740 | 128 | 0.8297 | 0.8935 | 0.8604 | 169 | 0.7607 | 0.8091 | 0.7841 | 110 | 0.6074 | 0.7193 | 0.6586 | 114 | 0.6578 | 0.7416 | 0.6972 | 267 | 0.3333 | 1.0 | 0.5 | 2 | 0.8659 | 0.8470 | 0.8564 | 183 | 0.7966 | 0.7231 | 0.7581 | 65 | 0.7557 | 0.8044 | 0.7793 | 0.9717 |
| 0.0114 | 25.0 | 19825 | 0.2303 | 0.8917 | 0.8359 | 0.8629 | 128 | 0.8947 | 0.9053 | 0.9000 | 169 | 0.8144 | 0.7182 | 0.7633 | 110 | 0.6296 | 0.7456 | 0.6827 | 114 | 0.6937 | 0.7041 | 0.6989 | 267 | 1.0 | 0.5 | 0.6667 | 2 | 0.8533 | 0.8579 | 0.8556 | 183 | 0.8913 | 0.6308 | 0.7387 | 65 | 0.7912 | 0.7813 | 0.7862 | 0.9705 |
| 0.0121 | 26.0 | 20618 | 0.2485 | 0.8810 | 0.8672 | 0.8740 | 128 | 0.8793 | 0.9053 | 0.8921 | 169 | 0.8667 | 0.7091 | 0.7800 | 110 | 0.5926 | 0.7018 | 0.6426 | 114 | 0.7446 | 0.6442 | 0.6908 | 267 | 0.25 | 0.5 | 0.3333 | 2 | 0.8361 | 0.8361 | 0.8361 | 183 | 0.7581 | 0.7231 | 0.7402 | 65 | 0.7910 | 0.7659 | 0.7783 | 0.9705 |
| 0.0124 | 27.0 | 21411 | 0.2280 | 0.8504 | 0.8438 | 0.8471 | 128 | 0.8391 | 0.8639 | 0.8513 | 169 | 0.8119 | 0.7455 | 0.7773 | 110 | 0.6435 | 0.6491 | 0.6463 | 114 | 0.6259 | 0.6891 | 0.6560 | 267 | 0.4 | 1.0 | 0.5714 | 2 | 0.8548 | 0.8689 | 0.8618 | 183 | 0.8627 | 0.6769 | 0.7586 | 65 | 0.7588 | 0.7697 | 0.7642 | 0.9702 |
| 0.0111 | 28.0 | 22204 | 0.2728 | 0.8917 | 0.8359 | 0.8629 | 128 | 0.8704 | 0.8343 | 0.8520 | 169 | 0.9059 | 0.7 | 0.7897 | 110 | 0.5833 | 0.6754 | 0.6260 | 114 | 0.6618 | 0.6816 | 0.6716 | 267 | 1.0 | 0.5 | 0.6667 | 2 | 0.8713 | 0.8142 | 0.8418 | 183 | 0.8837 | 0.5846 | 0.7037 | 65 | 0.7806 | 0.7437 | 0.7617 | 0.9692 |
| 0.0079 | 29.0 | 22997 | 0.2596 | 0.8661 | 0.8594 | 0.8627 | 128 | 0.8817 | 0.8817 | 0.8817 | 169 | 0.7436 | 0.7909 | 0.7665 | 110 | 0.616 | 0.6754 | 0.6444 | 114 | 0.6794 | 0.6667 | 0.6730 | 267 | 1.0 | 1.0 | 1.0 | 2 | 0.8681 | 0.8634 | 0.8658 | 183 | 0.8727 | 0.7385 | 0.8 | 65 | 0.7786 | 0.7794 | 0.7790 | 0.9705 |
| 0.0076 | 30.0 | 23790 | 0.2476 | 0.8088 | 0.8594 | 0.8333 | 128 | 0.8889 | 0.8994 | 0.8941 | 169 | 0.7909 | 0.7909 | 0.7909 | 110 | 0.6397 | 0.7632 | 0.6960 | 114 | 0.6727 | 0.6929 | 0.6827 | 267 | 0.3333 | 1.0 | 0.5 | 2 | 0.8641 | 0.8689 | 0.8665 | 183 | 0.6512 | 0.8615 | 0.7417 | 65 | 0.7591 | 0.8073 | 0.7824 | 0.9705 |
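
The per-entity and overall scores in this table are the kind produced by the `seqeval` library on BIO-tagged token sequences. Below is a minimal sketch of that computation; the tag sequences and label names are invented for illustration, not taken from the training data.

```python
# A minimal sketch of entity-level scoring with seqeval; the tag sequences
# and label names below are invented examples, not the card's actual labels.
from seqeval.metrics import classification_report, f1_score

y_true = [["B-TOTAL", "I-TOTAL", "O", "B-RECEIPT_DATE", "I-RECEIPT_DATE"]]
y_pred = [["B-TOTAL", "I-TOTAL", "O", "B-RECEIPT_DATE", "O"]]

print(classification_report(y_true, y_pred))  # per-entity precision/recall/F1
print(f1_score(y_true, y_pred))               # micro-averaged overall F1
```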

### Framework versions

- Transformers 4.22.0.dev0
- PyTorch 1.12.1+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1