
lmv2ubiai-pan8doc-06-11

This model is a fine-tuned version of microsoft/layoutlmv2-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set (an inference sketch follows the list):

  • Loss: 0.9633
  • Dob Precision: 1.0
  • Dob Recall: 1.0
  • Dob F1: 1.0
  • Dob Number: 2
  • Fname Precision: 0.6667
  • Fname Recall: 1.0
  • Fname F1: 0.8
  • Fname Number: 2
  • Name Precision: 1.0
  • Name Recall: 1.0
  • Name F1: 1.0
  • Name Number: 2
  • Pan Precision: 1.0
  • Pan Recall: 1.0
  • Pan F1: 1.0
  • Pan Number: 2
  • Overall Precision: 0.8889
  • Overall Recall: 1.0
  • Overall F1: 0.9412
  • Overall Accuracy: 0.9821
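
As a sanity check, the overall F1 follows from the overall precision and recall as 2PR/(P + R) = 2 · 0.8889 · 1.0 / 1.8889 ≈ 0.9412. The entity types (Dob, Fname, Name, Pan) suggest PAN-document field extraction with token classification. The snippet below is a minimal inference sketch, not taken from this card: the repository id and image path are placeholders, and LayoutLMv2Processor's built-in OCR requires detectron2 and pytesseract to be installed.

```python
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForTokenClassification

# Placeholder ids: substitute the actual Hub path of this checkpoint.
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForTokenClassification.from_pretrained("<namespace>/lmv2ubiai-pan8doc-06-11")

image = Image.open("document.png").convert("RGB")  # hypothetical input image
encoding = processor(image, return_tensors="pt")   # runs OCR, builds input_ids/bbox/image tensors

outputs = model(**encoding)
pred_ids = outputs.logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[i] for i in pred_ids])
```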

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 4e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: constant
  • num_epochs: 30
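
In transformers, these settings map roughly onto the TrainingArguments sketch below. This is an illustration rather than the card's original training script: output_dir is a placeholder, and the Adam betas and epsilon listed above are the library defaults.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="lmv2ubiai-pan8doc-06-11",  # placeholder
    learning_rate=4e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=30,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults.
)
```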

Training results

| Training Loss | Epoch | Step | Validation Loss | Dob Precision | Dob Recall | Dob F1 | Dob Number | Fname Precision | Fname Recall | Fname F1 | Fname Number | Name Precision | Name Recall | Name F1 | Name Number | Pan Precision | Pan Recall | Pan F1 | Pan Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 2.1195 | 1.0 | 6 | 1.7519 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 0.7857 |
| 1.6994 | 2.0 | 12 | 1.5117 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 0.7857 |
| 1.5521 | 3.0 | 18 | 1.4130 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 0.7857 |
| 1.4726 | 4.0 | 24 | 1.3410 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 0.7857 |
| 1.395 | 5.0 | 30 | 1.2693 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 0.7857 |
| 1.3131 | 6.0 | 36 | 1.2079 | 1.0 | 1.0 | 1.0 | 2 | 0.1667 | 0.5 | 0.25 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.0 | 0.0 | 0.0 | 2 | 0.3 | 0.375 | 0.3333 | 0.8929 |
| 1.2474 | 7.0 | 42 | 1.1495 | 1.0 | 1.0 | 1.0 | 2 | 0.2 | 0.5 | 0.2857 | 2 | 0.0 | 0.0 | 0.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.4167 | 0.625 | 0.5 | 0.9286 |
| 1.1869 | 8.0 | 48 | 1.0942 | 1.0 | 1.0 | 1.0 | 2 | 0.2 | 0.5 | 0.2857 | 2 | 0.0 | 0.0 | 0.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.4167 | 0.625 | 0.5 | 0.9286 |
| 1.1369 | 9.0 | 54 | 1.0453 | 1.0 | 1.0 | 1.0 | 2 | 0.4 | 1.0 | 0.5714 | 2 | 0.0 | 0.0 | 0.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5455 | 0.75 | 0.6316 | 0.9464 |
| 1.0882 | 10.0 | 60 | 1.0054 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 1.0 | 0.6667 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.7 | 0.875 | 0.7778 | 0.9643 |
| 1.0482 | 11.0 | 66 | 0.9633 | 1.0 | 1.0 | 1.0 | 2 | 0.6667 | 1.0 | 0.8 | 2 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.8889 | 1.0 | 0.9412 | 0.9821 |
| 1.017 | 12.0 | 72 | 0.9368 | 1.0 | 1.0 | 1.0 | 2 | 0.6667 | 1.0 | 0.8 | 2 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.8889 | 1.0 | 0.9412 | 0.9643 |
| 0.9825 | 13.0 | 78 | 0.9139 | 1.0 | 1.0 | 1.0 | 2 | 0.6667 | 1.0 | 0.8 | 2 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.8889 | 1.0 | 0.9412 | 0.9821 |
| 0.9459 | 14.0 | 84 | 0.8837 | 1.0 | 1.0 | 1.0 | 2 | 0.6667 | 1.0 | 0.8 | 2 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.8889 | 1.0 | 0.9412 | 0.9643 |
| 0.9155 | 15.0 | 90 | 0.8472 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.8819 | 16.0 | 96 | 0.8231 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.8523 | 17.0 | 102 | 0.7957 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.6667 | 1.0 | 0.8 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.8889 | 1.0 | 0.9412 | 0.9821 |
| 0.8251 | 18.0 | 108 | 0.7681 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.7982 | 19.0 | 114 | 0.7533 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.7762 | 20.0 | 120 | 0.7283 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.7558 | 21.0 | 126 | 0.7114 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.7346 | 22.0 | 132 | 0.6889 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.7116 | 23.0 | 138 | 0.6697 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.6898 | 24.0 | 144 | 0.6593 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.6748 | 25.0 | 150 | 0.6356 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.6487 | 26.0 | 156 | 0.6142 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.6312 | 27.0 | 162 | 0.6008 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.6156 | 28.0 | 168 | 0.5855 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.5961 | 29.0 | 174 | 0.5625 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
| 0.5781 | 30.0 | 180 | 0.5553 | 1.0 | 1.0 | 1.0 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.5 | 0.5 | 0.5 | 2 | 1.0 | 1.0 | 1.0 | 2 | 0.875 | 0.875 | 0.875 | 0.9643 |
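
Per-entity precision, recall, and F1 of the kind tabulated above are typically computed with seqeval over IOB-tagged label sequences. The sketch below is illustrative only: the tag names are assumptions, not taken from this checkpoint's label map (the real mapping lives in model.config.id2label).

```python
from seqeval.metrics import classification_report

# Illustrative IOB sequences only; real evaluation compares per-token
# predictions against gold labels for each document.
y_true = [["B-PAN", "O", "B-NAME", "B-FNAME", "B-DOB"]]
y_pred = [["B-PAN", "O", "B-NAME", "B-FNAME", "B-DOB"]]
print(classification_report(y_true, y_pred, digits=4))
```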

Framework versions

  • Transformers 4.20.0.dev0
  • Pytorch 1.11.0+cu113
  • Datasets 2.2.2
  • Tokenizers 0.12.1