ethangclark committed
Commit 19dde24
1 Parent(s): 732066a

End of training

README.md CHANGED
@@ -15,13 +15,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.6103
- - Answer: {'precision': 0.46564885496183206, 'recall': 0.6630434782608695, 'f1': 0.547085201793722, 'number': 92}
- - Header: {'precision': 0.34146341463414637, 'recall': 0.4375, 'f1': 0.3835616438356165, 'number': 32}
- - Overall Precision: 0.4360
- - Overall Recall: 0.6048
- - Overall F1: 0.5068
- - Overall Accuracy: 0.8784
+ - Loss: 0.6097
+ - Answer: {'precision': 0.43703703703703706, 'recall': 0.6413043478260869, 'f1': 0.5198237885462555, 'number': 92}
+ - Header: {'precision': 0.2894736842105263, 'recall': 0.34375, 'f1': 0.3142857142857143, 'number': 32}
+ - Overall Precision: 0.4046
+ - Overall Recall: 0.5645
+ - Overall F1: 0.4714
+ - Overall Accuracy: 0.8656
 
 ## Model description
 
@@ -50,23 +50,23 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:----------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 1.1168 | 1.0 | 2 | 0.9441 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
- | 0.6159 | 2.0 | 4 | 0.8352 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
- | 0.495 | 3.0 | 6 | 0.7209 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
- | 0.4034 | 4.0 | 8 | 0.6673 | {'precision': 0.45454545454545453, 'recall': 0.21739130434782608, 'f1': 0.29411764705882354, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.4545 | 0.1613 | 0.2381 | 0.8412 |
- | 0.2859 | 5.0 | 10 | 0.6247 | {'precision': 0.4536082474226804, 'recall': 0.4782608695652174, 'f1': 0.4656084656084656, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.4536 | 0.3548 | 0.3982 | 0.8617 |
- | 0.2644 | 6.0 | 12 | 0.6132 | {'precision': 0.4027777777777778, 'recall': 0.6304347826086957, 'f1': 0.4915254237288136, 'number': 92} | {'precision': 0.3333333333333333, 'recall': 0.03125, 'f1': 0.05714285714285714, 'number': 32} | 0.4014 | 0.4758 | 0.4354 | 0.8668 |
- | 0.3123 | 7.0 | 14 | 0.6367 | {'precision': 0.3954802259887006, 'recall': 0.7608695652173914, 'f1': 0.5204460966542751, 'number': 92} | {'precision': 0.25, 'recall': 0.03125, 'f1': 0.05555555555555555, 'number': 32} | 0.3923 | 0.5726 | 0.4656 | 0.8592 |
- | 0.3486 | 8.0 | 16 | 0.6209 | {'precision': 0.4166666666666667, 'recall': 0.7608695652173914, 'f1': 0.5384615384615384, 'number': 92} | {'precision': 0.125, 'recall': 0.03125, 'f1': 0.05, 'number': 32} | 0.4034 | 0.5726 | 0.4733 | 0.8656 |
- | 0.3703 | 9.0 | 18 | 0.5961 | {'precision': 0.4339622641509434, 'recall': 0.75, 'f1': 0.549800796812749, 'number': 92} | {'precision': 0.5, 'recall': 0.15625, 'f1': 0.23809523809523808, 'number': 32} | 0.4379 | 0.5968 | 0.5051 | 0.8771 |
- | 0.2037 | 10.0 | 20 | 0.5895 | {'precision': 0.423841059602649, 'recall': 0.6956521739130435, 'f1': 0.5267489711934157, 'number': 92} | {'precision': 0.42857142857142855, 'recall': 0.1875, 'f1': 0.26086956521739124, 'number': 32} | 0.4242 | 0.5645 | 0.4844 | 0.8796 |
- | 0.1851 | 11.0 | 22 | 0.5810 | {'precision': 0.4461538461538462, 'recall': 0.6304347826086957, 'f1': 0.5225225225225225, 'number': 92} | {'precision': 0.38235294117647056, 'recall': 0.40625, 'f1': 0.393939393939394, 'number': 32} | 0.4329 | 0.5726 | 0.4931 | 0.8809 |
- | 0.1832 | 12.0 | 24 | 0.5900 | {'precision': 0.46153846153846156, 'recall': 0.6521739130434783, 'f1': 0.5405405405405406, 'number': 92} | {'precision': 0.35135135135135137, 'recall': 0.40625, 'f1': 0.37681159420289856, 'number': 32} | 0.4371 | 0.5887 | 0.5017 | 0.8809 |
- | 0.1588 | 13.0 | 26 | 0.6012 | {'precision': 0.46153846153846156, 'recall': 0.6521739130434783, 'f1': 0.5405405405405406, 'number': 92} | {'precision': 0.35, 'recall': 0.4375, 'f1': 0.38888888888888884, 'number': 32} | 0.4353 | 0.5968 | 0.5034 | 0.8796 |
- | 0.1622 | 14.0 | 28 | 0.6075 | {'precision': 0.46564885496183206, 'recall': 0.6630434782608695, 'f1': 0.547085201793722, 'number': 92} | {'precision': 0.34146341463414637, 'recall': 0.4375, 'f1': 0.3835616438356165, 'number': 32} | 0.4360 | 0.6048 | 0.5068 | 0.8784 |
- | 0.2028 | 15.0 | 30 | 0.6103 | {'precision': 0.46564885496183206, 'recall': 0.6630434782608695, 'f1': 0.547085201793722, 'number': 92} | {'precision': 0.34146341463414637, 'recall': 0.4375, 'f1': 0.3835616438356165, 'number': 32} | 0.4360 | 0.6048 | 0.5068 | 0.8784 |
+ | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------------------------------------------------------:|:----------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 1.4561 | 1.0 | 2 | 1.0789 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+ | 0.7649 | 2.0 | 4 | 0.9219 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+ | 0.5601 | 3.0 | 6 | 0.8338 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+ | 0.4611 | 4.0 | 8 | 0.7533 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.0 | 0.0 | 0.0 | 0.8182 |
+ | 0.3306 | 5.0 | 10 | 0.6861 | {'precision': 0.75, 'recall': 0.03260869565217391, 'f1': 0.06249999999999999, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.75 | 0.0242 | 0.0469 | 0.8207 |
+ | 0.3001 | 6.0 | 12 | 0.6509 | {'precision': 0.43243243243243246, 'recall': 0.5217391304347826, 'f1': 0.47290640394088673, 'number': 92} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 32} | 0.4324 | 0.3871 | 0.4085 | 0.8592 |
+ | 0.3436 | 7.0 | 14 | 0.6713 | {'precision': 0.33689839572192515, 'recall': 0.6847826086956522, 'f1': 0.45161290322580644, 'number': 92} | {'precision': 0.14285714285714285, 'recall': 0.03125, 'f1': 0.05128205128205128, 'number': 32} | 0.3299 | 0.5161 | 0.4025 | 0.8284 |
+ | 0.3624 | 8.0 | 16 | 0.6454 | {'precision': 0.3516483516483517, 'recall': 0.6956521739130435, 'f1': 0.46715328467153283, 'number': 92} | {'precision': 0.4, 'recall': 0.0625, 'f1': 0.10810810810810811, 'number': 32} | 0.3529 | 0.5323 | 0.4244 | 0.8387 |
+ | 0.4258 | 9.0 | 18 | 0.6192 | {'precision': 0.3668639053254438, 'recall': 0.6739130434782609, 'f1': 0.475095785440613, 'number': 92} | {'precision': 0.5555555555555556, 'recall': 0.15625, 'f1': 0.24390243902439024, 'number': 32} | 0.3764 | 0.5403 | 0.4437 | 0.8528 |
+ | 0.2221 | 10.0 | 20 | 0.6282 | {'precision': 0.36942675159235666, 'recall': 0.6304347826086957, 'f1': 0.465863453815261, 'number': 92} | {'precision': 0.3181818181818182, 'recall': 0.21875, 'f1': 0.25925925925925924, 'number': 32} | 0.3631 | 0.5242 | 0.4290 | 0.8476 |
+ | 0.2069 | 11.0 | 22 | 0.6241 | {'precision': 0.40559440559440557, 'recall': 0.6304347826086957, 'f1': 0.4936170212765958, 'number': 92} | {'precision': 0.34375, 'recall': 0.34375, 'f1': 0.34375, 'number': 32} | 0.3943 | 0.5565 | 0.4615 | 0.8592 |
+ | 0.2035 | 12.0 | 24 | 0.6218 | {'precision': 0.4084507042253521, 'recall': 0.6304347826086957, 'f1': 0.49572649572649574, 'number': 92} | {'precision': 0.3125, 'recall': 0.3125, 'f1': 0.3125, 'number': 32} | 0.3908 | 0.5484 | 0.4564 | 0.8604 |
+ | 0.1729 | 13.0 | 26 | 0.6175 | {'precision': 0.41843971631205673, 'recall': 0.6413043478260869, 'f1': 0.5064377682403434, 'number': 92} | {'precision': 0.3125, 'recall': 0.3125, 'f1': 0.3125, 'number': 32} | 0.3988 | 0.5565 | 0.4646 | 0.8643 |
+ | 0.1759 | 14.0 | 28 | 0.6127 | {'precision': 0.427536231884058, 'recall': 0.6413043478260869, 'f1': 0.5130434782608696, 'number': 92} | {'precision': 0.3142857142857143, 'recall': 0.34375, 'f1': 0.3283582089552239, 'number': 32} | 0.4046 | 0.5645 | 0.4714 | 0.8656 |
+ | 0.2299 | 15.0 | 30 | 0.6097 | {'precision': 0.43703703703703706, 'recall': 0.6413043478260869, 'f1': 0.5198237885462555, 'number': 92} | {'precision': 0.2894736842105263, 'recall': 0.34375, 'f1': 0.3142857142857143, 'number': 32} | 0.4046 | 0.5645 | 0.4714 | 0.8656 |
 
 
 ### Framework versions
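The Answer, Header, and Overall scores in the README diff above appear to be seqeval-style entity metrics, where each F1 value is the harmonic mean of the corresponding precision and recall. A minimal sketch that recomputes the updated F1 values from the precision/recall numbers quoted in the new evaluation block; the `f1` helper is illustrative only and is not part of the training code:

```python
# Recompute seqeval-style F1 values from the precision/recall figures
# reported in the updated README (values copied verbatim from the diff).

def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; 0.0 when both are zero."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(round(f1(0.43703703703703706, 0.6413043478260869), 4))  # 0.5198 -> Answer f1
print(round(f1(0.2894736842105263, 0.34375), 4))              # 0.3143 -> Header f1
print(round(f1(0.4046, 0.5645), 4))                           # 0.4714 -> Overall F1
```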
logs/events.out.tfevents.1711224357.ethanmbp.lan.39274.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:674e9bba7ff914813349a646b2d3c103d561ecfcda208fa276a99b59df9e314e
- size 5462
+ oid sha256:3774f741097178647c9658ec7f2952898e66f9979ddceeba5955f9b0f2324803
+ size 15638
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:7109b91da636314bd42f973d45d1f6db3cc228dacd3f3616a3ea417ce881cb81
+ oid sha256:fbf3481b3790e964e56b1cc461c23b3cc27859a66a18b9b3bbee2af32521321f
 size 450552060
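The updated model.safetensors pointer holds the weights of the token-classification model that the README describes as a fine-tune of microsoft/layoutlm-base-uncased. A hypothetical loading sketch with the Hugging Face transformers library: the repo id below is a placeholder (the fine-tuned model's actual id is not shown in this diff), and the all-zero bounding boxes exist only so the snippet runs without real layout coordinates.

```python
# Hypothetical usage sketch -- repo id is a placeholder, not the real
# id of this fine-tuned checkpoint.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

repo_id = "ethangclark/layoutlm-finetuned"  # placeholder; substitute the actual repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForTokenClassification.from_pretrained(repo_id)

# LayoutLM also takes one bounding box per token; zeros are used here purely
# so the example runs end to end without OCR layout information.
words = ["Invoice", "Number:", "12345"]
encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
bbox = torch.zeros(encoding["input_ids"].shape + (4,), dtype=torch.long)

with torch.no_grad():
    logits = model(**encoding, bbox=bbox).logits

predicted_ids = logits.argmax(-1).squeeze(0).tolist()
print([model.config.id2label[i] for i in predicted_ids])
```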