# lmv2-g-rai2-732-doc-10-07
This model is a fine-tuned version of microsoft/layoutlmv2-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.0222
- Dob Key Precision: 0.7810
- Dob Key Recall: 0.8359
- Dob Key F1: 0.8075
- Dob Key Number: 128
- Dob Value Precision: 0.9767
- Dob Value Recall: 0.9618
- Dob Value F1: 0.9692
- Dob Value Number: 131
- Doctor Name Key Precision: 0.5714
- Doctor Name Key Recall: 0.6197
- Doctor Name Key F1: 0.5946
- Doctor Name Key Number: 71
- Doctor Name Value Precision: 0.8861
- Doctor Name Value Recall: 0.9091
- Doctor Name Value F1: 0.8974
- Doctor Name Value Number: 77
- Patient Name Key Precision: 0.7639
- Patient Name Key Recall: 0.7639
- Patient Name Key F1: 0.7639
- Patient Name Key Number: 144
- Patient Name Value Precision: 0.9595
- Patient Name Value Recall: 0.9530
- Patient Name Value F1: 0.9562
- Patient Name Value Number: 149
- Overall Precision: 0.8389
- Overall Recall: 0.8557
- Overall F1: 0.8472
- Overall Accuracy: 0.9939
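The metric names appear to correspond to key/value fields extracted from documents (date of birth, doctor name, patient name), with "Number" giving the support (entity count) for each field. Below is a minimal inference sketch of how such a checkpoint is typically used with the standard LayoutLMv2 classes in Transformers; the checkpoint path, the input image, and the OCR dependencies (pytesseract, detectron2 for the processor/model) are assumptions, not part of this card.

```python
# Hedged inference sketch: load the fine-tuned checkpoint for token classification.
# The repo/path below is a placeholder; the processor runs OCR on the page image
# (requires pytesseract) and the model needs detectron2 for its visual backbone.
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForTokenClassification

processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained("lmv2-g-rai2-732-doc-10-07")  # local path or hub repo id

image = Image.open("document.png").convert("RGB")            # a scanned form/report page
encoding = processor(image, return_tensors="pt", truncation=True)

outputs = model(**encoding)
predictions = outputs.logits.argmax(-1).squeeze().tolist()   # one label id per token
labels = [model.config.id2label[p] for p in predictions]     # e.g. dob_key / dob_value / ...

tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
print(list(zip(tokens, labels)))
```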
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 30
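The Adam settings listed above are the Transformers defaults (adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08). As a rough sketch, these hyperparameters could be expressed as a `TrainingArguments` configuration like the following; the output directory and the per-epoch evaluation strategy are assumptions, only the values themselves come from this card.

```python
# Hedged sketch of the listed hyperparameters as a transformers TrainingArguments config.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lmv2-g-rai2-732-doc-10-07",   # assumed output directory
    learning_rate=4e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=30,
    evaluation_strategy="epoch",              # assumption: evaluated once per epoch, as in the results table
    # adam_beta1, adam_beta2 and adam_epsilon keep their defaults (0.9, 0.999, 1e-08)
)
```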
### Training results
Training Loss | Epoch | Step | Validation Loss | Dob Key Precision | Dob Key Recall | Dob Key F1 | Dob Key Number | Dob Value Precision | Dob Value Recall | Dob Value F1 | Dob Value Number | Doctor Name Key Precision | Doctor Name Key Recall | Doctor Name Key F1 | Doctor Name Key Number | Doctor Name Value Precision | Doctor Name Value Recall | Doctor Name Value F1 | Doctor Name Value Number | Patient Name Key Precision | Patient Name Key Recall | Patient Name Key F1 | Patient Name Key Number | Patient Name Value Precision | Patient Name Value Recall | Patient Name Value F1 | Patient Name Value Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.6913 | 1.0 | 585 | 0.1911 | 0.0 | 0.0 | 0.0 | 128 | 0.0 | 0.0 | 0.0 | 131 | 0.0 | 0.0 | 0.0 | 71 | 0.0 | 0.0 | 0.0 | 77 | 0.0 | 0.0 | 0.0 | 144 | 0.0 | 0.0 | 0.0 | 149 | 0.0 | 0.0 | 0.0 | 0.9671 |
0.1083 | 2.0 | 1170 | 0.0588 | 0.7692 | 0.7812 | 0.7752 | 128 | 0.9615 | 0.9542 | 0.9579 | 131 | 0.5645 | 0.4930 | 0.5263 | 71 | 0.9178 | 0.8701 | 0.8933 | 77 | 0.7397 | 0.75 | 0.7448 | 144 | 0.9346 | 0.9597 | 0.9470 | 149 | 0.8329 | 0.8257 | 0.8293 | 0.9936 |
0.0515 | 3.0 | 1755 | 0.0377 | 0.7803 | 0.8047 | 0.7923 | 128 | 0.875 | 0.9084 | 0.8914 | 131 | 0.5867 | 0.6197 | 0.6027 | 71 | 0.9231 | 0.9351 | 0.9290 | 77 | 0.6903 | 0.7431 | 0.7157 | 144 | 0.9732 | 0.9732 | 0.9732 | 149 | 0.8138 | 0.8429 | 0.8281 | 0.9942 |
0.0348 | 4.0 | 2340 | 0.0287 | 0.7710 | 0.7891 | 0.7799 | 128 | 0.9545 | 0.9618 | 0.9582 | 131 | 0.5789 | 0.6197 | 0.5986 | 71 | 0.8391 | 0.9481 | 0.8902 | 77 | 0.75 | 0.75 | 0.75 | 144 | 0.9603 | 0.9732 | 0.9667 | 149 | 0.8280 | 0.8529 | 0.8403 | 0.9940 |
0.0279 | 5.0 | 2925 | 0.0262 | 0.6541 | 0.8125 | 0.7247 | 128 | 0.9466 | 0.9466 | 0.9466 | 131 | 0.6027 | 0.6197 | 0.6111 | 71 | 0.6915 | 0.8442 | 0.7602 | 77 | 0.7397 | 0.75 | 0.7448 | 144 | 0.9062 | 0.9732 | 0.9385 | 149 | 0.7733 | 0.8429 | 0.8066 | 0.9938 |
0.0232 | 6.0 | 3510 | 0.0263 | 0.6603 | 0.8047 | 0.7254 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.6143 | 0.6056 | 0.6099 | 71 | 0.9315 | 0.8831 | 0.9067 | 77 | 0.75 | 0.75 | 0.75 | 144 | 0.9728 | 0.9597 | 0.9662 | 149 | 0.8220 | 0.8443 | 0.8330 | 0.9941 |
0.0204 | 7.0 | 4095 | 0.0220 | 0.8062 | 0.8125 | 0.8093 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.5946 | 0.6197 | 0.6069 | 71 | 0.8590 | 0.8701 | 0.8645 | 77 | 0.75 | 0.75 | 0.75 | 144 | 0.9603 | 0.9732 | 0.9667 | 149 | 0.8426 | 0.8486 | 0.8456 | 0.9945 |
0.0202 | 8.0 | 4680 | 0.0217 | 0.7591 | 0.8125 | 0.7849 | 128 | 0.9769 | 0.9695 | 0.9732 | 131 | 0.6027 | 0.6197 | 0.6111 | 71 | 0.8625 | 0.8961 | 0.8790 | 77 | 0.5989 | 0.7361 | 0.6604 | 144 | 0.9664 | 0.9664 | 0.9664 | 149 | 0.7962 | 0.8486 | 0.8216 | 0.9939 |
0.0174 | 9.0 | 5265 | 0.0219 | 0.8062 | 0.8125 | 0.8093 | 128 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.5714 | 0.6197 | 0.5946 | 71 | 0.8452 | 0.9221 | 0.8820 | 77 | 0.75 | 0.75 | 0.75 | 144 | 0.9177 | 0.9732 | 0.9446 | 149 | 0.8285 | 0.8557 | 0.8419 | 0.9938 |
0.0172 | 10.0 | 5850 | 0.0209 | 0.7803 | 0.8047 | 0.7923 | 128 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.5789 | 0.6197 | 0.5986 | 71 | 0.8675 | 0.9351 | 0.9 | 77 | 0.75 | 0.75 | 0.75 | 144 | 0.9667 | 0.9732 | 0.9699 | 149 | 0.8366 | 0.8557 | 0.8460 | 0.9941 |
0.0174 | 11.0 | 6435 | 0.0206 | 0.8062 | 0.8125 | 0.8093 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.3554 | 0.6056 | 0.4479 | 71 | 0.8889 | 0.9351 | 0.9114 | 77 | 0.75 | 0.75 | 0.75 | 144 | 0.9664 | 0.9664 | 0.9664 | 149 | 0.7928 | 0.8529 | 0.8217 | 0.9944 |
0.0156 | 12.0 | 7020 | 0.0219 | 0.7132 | 0.7578 | 0.7348 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.4884 | 0.5915 | 0.5350 | 71 | 0.9114 | 0.9351 | 0.9231 | 77 | 0.5856 | 0.7361 | 0.6523 | 144 | 0.9732 | 0.9732 | 0.9732 | 149 | 0.7737 | 0.84 | 0.8055 | 0.9936 |
0.0158 | 13.0 | 7605 | 0.0203 | 0.6190 | 0.8125 | 0.7027 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.6027 | 0.6197 | 0.6111 | 71 | 0.9114 | 0.9351 | 0.9231 | 77 | 0.6044 | 0.7639 | 0.6748 | 144 | 0.96 | 0.9664 | 0.9632 | 149 | 0.7682 | 0.8571 | 0.8103 | 0.9940 |
0.0152 | 14.0 | 8190 | 0.0195 | 0.7252 | 0.7422 | 0.7336 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.6027 | 0.6197 | 0.6111 | 71 | 0.8765 | 0.9221 | 0.8987 | 77 | 0.75 | 0.75 | 0.75 | 144 | 0.9664 | 0.9664 | 0.9664 | 149 | 0.8317 | 0.84 | 0.8358 | 0.9940 |
0.0158 | 15.0 | 8775 | 0.0204 | 0.7252 | 0.7422 | 0.7336 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.5658 | 0.6056 | 0.5850 | 71 | 0.8861 | 0.9091 | 0.8974 | 77 | 0.6424 | 0.7361 | 0.6861 | 144 | 0.9664 | 0.9664 | 0.9664 | 149 | 0.8011 | 0.8343 | 0.8174 | 0.9939 |
0.0172 | 16.0 | 9360 | 0.0218 | 0.5741 | 0.7266 | 0.6414 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.5676 | 0.5915 | 0.5793 | 71 | 0.8987 | 0.9221 | 0.9103 | 77 | 0.6145 | 0.7639 | 0.6811 | 144 | 0.9664 | 0.9664 | 0.9664 | 149 | 0.7591 | 0.8371 | 0.7962 | 0.9929 |
0.0137 | 17.0 | 9945 | 0.0220 | 0.8 | 0.8125 | 0.8062 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.5946 | 0.6197 | 0.6069 | 71 | 0.8734 | 0.8961 | 0.8846 | 77 | 0.6485 | 0.7431 | 0.6926 | 144 | 0.9536 | 0.9664 | 0.9600 | 149 | 0.8159 | 0.8486 | 0.8319 | 0.9942 |
0.0143 | 18.0 | 10530 | 0.0222 | 0.7810 | 0.8359 | 0.8075 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.5714 | 0.6197 | 0.5946 | 71 | 0.8861 | 0.9091 | 0.8974 | 77 | 0.7639 | 0.7639 | 0.7639 | 144 | 0.9595 | 0.9530 | 0.9562 | 149 | 0.8389 | 0.8557 | 0.8472 | 0.9939 |
0.0137 | 19.0 | 11115 | 0.0222 | 0.7557 | 0.7734 | 0.7645 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.5395 | 0.5775 | 0.5578 | 71 | 0.8608 | 0.8831 | 0.8718 | 77 | 0.6732 | 0.7153 | 0.6936 | 144 | 0.96 | 0.9664 | 0.9632 | 149 | 0.8092 | 0.83 | 0.8195 | 0.9935 |
0.0126 | 20.0 | 11700 | 0.0234 | 0.7907 | 0.7969 | 0.7938 | 128 | 0.9612 | 0.9466 | 0.9538 | 131 | 0.5309 | 0.6056 | 0.5658 | 71 | 0.8372 | 0.9351 | 0.8834 | 77 | 0.5917 | 0.6944 | 0.6390 | 144 | 0.96 | 0.9664 | 0.9632 | 149 | 0.7863 | 0.8357 | 0.8102 | 0.9933 |
0.0122 | 21.0 | 12285 | 0.0228 | 0.6350 | 0.6797 | 0.6566 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.5 | 0.6197 | 0.5535 | 71 | 0.8659 | 0.9221 | 0.8931 | 77 | 0.6824 | 0.7014 | 0.6918 | 144 | 0.9536 | 0.9664 | 0.9600 | 149 | 0.7796 | 0.8186 | 0.7986 | 0.9930 |
0.0114 | 22.0 | 12870 | 0.0230 | 0.7863 | 0.8047 | 0.7954 | 128 | 0.9615 | 0.9542 | 0.9579 | 131 | 0.38 | 0.5352 | 0.4444 | 71 | 0.875 | 0.9091 | 0.8917 | 77 | 0.6689 | 0.7014 | 0.6847 | 144 | 0.9533 | 0.9597 | 0.9565 | 149 | 0.7817 | 0.8286 | 0.8044 | 0.9938 |
0.0112 | 23.0 | 13455 | 0.0259 | 0.5038 | 0.5156 | 0.5097 | 128 | 0.9690 | 0.9542 | 0.9615 | 131 | 0.5811 | 0.6056 | 0.5931 | 71 | 0.8987 | 0.9221 | 0.9103 | 77 | 0.7361 | 0.7361 | 0.7361 | 144 | 0.9664 | 0.9664 | 0.9664 | 149 | 0.7861 | 0.7929 | 0.7895 | 0.9925 |
0.0108 | 24.0 | 14040 | 0.0280 | 0.6591 | 0.6797 | 0.6692 | 128 | 0.9690 | 0.9542 | 0.9615 | 131 | 0.4681 | 0.6197 | 0.5333 | 71 | 0.8481 | 0.8701 | 0.8590 | 77 | 0.7241 | 0.7292 | 0.7266 | 144 | 0.94 | 0.9463 | 0.9431 | 149 | 0.7805 | 0.8129 | 0.7964 | 0.9929 |
0.0085 | 25.0 | 14625 | 0.0277 | 0.6618 | 0.7031 | 0.6818 | 128 | 0.9615 | 0.9542 | 0.9579 | 131 | 0.4607 | 0.5775 | 0.5125 | 71 | 0.8642 | 0.9091 | 0.8861 | 77 | 0.6733 | 0.7014 | 0.6871 | 144 | 0.96 | 0.9664 | 0.9632 | 149 | 0.7758 | 0.8157 | 0.7953 | 0.9932 |
0.0081 | 26.0 | 15210 | 0.0255 | 0.7634 | 0.7812 | 0.7722 | 128 | 0.9690 | 0.9542 | 0.9615 | 131 | 0.5783 | 0.6761 | 0.6234 | 71 | 0.9125 | 0.9481 | 0.9299 | 77 | 0.6918 | 0.7014 | 0.6966 | 144 | 0.9470 | 0.9597 | 0.9533 | 149 | 0.8194 | 0.8429 | 0.8310 | 0.9940 |
0.0063 | 27.0 | 15795 | 0.0302 | 0.6977 | 0.7031 | 0.7004 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.5325 | 0.5775 | 0.5541 | 71 | 0.8519 | 0.8961 | 0.8734 | 77 | 0.7222 | 0.7222 | 0.7222 | 144 | 0.9470 | 0.9597 | 0.9533 | 149 | 0.8059 | 0.8186 | 0.8122 | 0.9933 |
0.005 | 28.0 | 16380 | 0.0306 | 0.7829 | 0.7891 | 0.7860 | 128 | 0.9767 | 0.9618 | 0.9692 | 131 | 0.5679 | 0.6479 | 0.6053 | 71 | 0.8193 | 0.8831 | 0.85 | 77 | 0.7172 | 0.7222 | 0.7197 | 144 | 0.96 | 0.9664 | 0.9632 | 149 | 0.8215 | 0.8414 | 0.8313 | 0.9942 |
0.0158 | 29.0 | 16965 | 0.2296 | 0.8 | 0.0312 | 0.0602 | 128 | 1.0 | 0.0153 | 0.0301 | 131 | 0.0 | 0.0 | 0.0 | 71 | 0.0 | 0.0 | 0.0 | 77 | 0.1724 | 0.0694 | 0.0990 | 144 | 0.0616 | 0.1477 | 0.0870 | 149 | 0.0862 | 0.0543 | 0.0666 | 0.9655 |
0.1003 | 30.0 | 17550 | 0.0310 | 0.7385 | 0.75 | 0.7442 | 128 | 0.9542 | 0.9542 | 0.9542 | 131 | 0.6184 | 0.6620 | 0.6395 | 71 | 0.8675 | 0.9351 | 0.9 | 77 | 0.75 | 0.75 | 0.75 | 144 | 0.9664 | 0.9664 | 0.9664 | 149 | 0.8303 | 0.8457 | 0.8379 | 0.9940 |
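Each F1 value is the harmonic mean of the corresponding precision and recall; for example, the overall F1 at epoch 18 is 2 × 0.8389 × 0.8557 / (0.8389 + 0.8557) ≈ 0.8472, which matches the results reported at the top of this card. Per-entity scores of this kind are typically produced by entity-level (seqeval-style) scoring over BIO-tagged predictions, although the card does not state the exact evaluation code; the sketch below only illustrates the idea with made-up tags.

```python
# Hedged illustration: entity-level scoring with seqeval over BIO tags.
# The tag names are assumptions based on the metric names in this card;
# "support" in the report corresponds to the per-field "Number" columns.
from seqeval.metrics import classification_report

y_true = [["B-DOB_KEY", "I-DOB_KEY", "O", "B-DOB_VALUE", "I-DOB_VALUE", "O"]]
y_pred = [["B-DOB_KEY", "I-DOB_KEY", "O", "B-DOB_VALUE", "O", "O"]]

print(classification_report(y_true, y_pred))
```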
### Framework versions
- Transformers 4.23.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.2.2
- Tokenizers 0.13.1