layoutlm-funsd

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on the FUNSD dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7064
  • Answer: precision 0.7080, recall 0.7973, F1 0.7500 (809 entities)
  • Header: precision 0.3600, recall 0.3782, F1 0.3689 (119 entities)
  • Question: precision 0.7895, recall 0.8310, F1 0.8097 (1065 entities)
  • Overall Precision: 0.7302
  • Overall Recall: 0.7903
  • Overall F1: 0.7590
  • Overall Accuracy: 0.8069
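
As a usage reference (a sketch, not part of the original card), the snippet below loads this checkpoint for token classification with the Transformers AutoClasses. The example words, bounding boxes, and page size are invented; LayoutLM additionally requires word bounding boxes normalized to a 0–1000 coordinate space, and the sketch assumes a fast tokenizer is available for the repo.

```python
# Minimal inference sketch for this checkpoint (example inputs are made up).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "coeus-ek/layoutlm-funsd"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

words = ["Date:", "01/01/2020"]                     # hypothetical OCR output
boxes = [[48, 84, 120, 100], [130, 84, 220, 100]]   # pixel boxes, one per word
width, height = 762, 1000                           # hypothetical page size

# LayoutLM expects boxes normalized to a 0-1000 coordinate space.
norm_boxes = [
    [int(1000 * x0 / width), int(1000 * y0 / height),
     int(1000 * x1 / width), int(1000 * y1 / height)]
    for x0, y0, x1, y1 in boxes
]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# Expand word-level boxes to token level; special tokens get a dummy box.
token_boxes = []
for word_idx in encoding.word_ids(0):
    token_boxes.append([0, 0, 0, 0] if word_idx is None else norm_boxes[word_idx])
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits
preds = logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[p] for p in preds])
```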

Model description

This is microsoft/layoutlm-base-uncased fine-tuned for token classification on the FUNSD form-understanding dataset, tagging tokens as answer, header, or question entities in scanned forms.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
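
For orientation, the sketch below (an assumption-labeled reconstruction, not the card's original training script) shows how these hyperparameters map onto transformers.TrainingArguments; dataset loading, the model, metrics, and the Trainer call are omitted, and output_dir is a placeholder.

```python
# Minimal sketch mapping the listed hyperparameters onto TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",     # placeholder output path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",             # AdamW defaults: betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                       # "Native AMP" mixed precision (needs a GPU)
)
```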

Training results

Per-entity columns give precision / recall / F1; entity counts are fixed across epochs (Answer: 809, Header: 119, Question: 1065).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.7527 | 1.0 | 10 | 1.5609 | 0.0277 / 0.0284 / 0.0281 | 0.0000 / 0.0000 / 0.0000 | 0.2419 / 0.2028 / 0.2206 | 0.1388 | 0.1199 | 0.1287 | 0.3775 |
| 1.4189 | 2.0 | 20 | 1.1905 | 0.2393 / 0.2287 / 0.2339 | 0.0000 / 0.0000 / 0.0000 | 0.4336 / 0.5944 / 0.5014 | 0.3663 | 0.4104 | 0.3871 | 0.6099 |
| 1.0743 | 3.0 | 30 | 0.9279 | 0.5136 / 0.5847 / 0.5468 | 0.0909 / 0.0084 / 0.0154 | 0.5574 / 0.6883 / 0.6160 | 0.5372 | 0.6056 | 0.5693 | 0.7197 |
| 0.8323 | 4.0 | 40 | 0.7781 | 0.6183 / 0.7268 / 0.6682 | 0.1481 / 0.0672 / 0.0925 | 0.6731 / 0.7502 / 0.7096 | 0.6364 | 0.6999 | 0.6667 | 0.7627 |
| 0.6658 | 5.0 | 50 | 0.7131 | 0.6371 / 0.7466 / 0.6875 | 0.2169 / 0.1513 / 0.1782 | 0.6829 / 0.7765 / 0.7267 | 0.6463 | 0.7270 | 0.6843 | 0.7812 |
| 0.5559 | 6.0 | 60 | 0.7018 | 0.6438 / 0.7886 / 0.7089 | 0.2653 / 0.2185 / 0.2396 | 0.7236 / 0.7643 / 0.7434 | 0.6676 | 0.7416 | 0.7026 | 0.7797 |
| 0.4847 | 7.0 | 70 | 0.6667 | 0.6787 / 0.7886 / 0.7296 | 0.2385 / 0.2185 / 0.2281 | 0.7451 / 0.8150 / 0.7785 | 0.6920 | 0.7687 | 0.7283 | 0.7982 |
| 0.4247 | 8.0 | 80 | 0.6833 | 0.6837 / 0.7960 / 0.7356 | 0.2578 / 0.2773 / 0.2672 | 0.7610 / 0.8282 / 0.7932 | 0.6994 | 0.7822 | 0.7385 | 0.7961 |
| 0.3796 | 9.0 | 90 | 0.6774 | 0.7043 / 0.7948 / 0.7468 | 0.2869 / 0.2941 / 0.2905 | 0.7782 / 0.8103 / 0.7939 | 0.7188 | 0.7732 | 0.7450 | 0.8022 |
| 0.3610 | 10.0 | 100 | 0.6885 | 0.7047 / 0.8084 / 0.7530 | 0.3083 / 0.3109 / 0.3096 | 0.7743 / 0.8244 / 0.7985 | 0.7191 | 0.7873 | 0.7516 | 0.8045 |
| 0.3089 | 11.0 | 110 | 0.6921 | 0.7141 / 0.8059 / 0.7573 | 0.3358 / 0.3782 / 0.3557 | 0.7930 / 0.8272 / 0.8097 | 0.7312 | 0.7918 | 0.7603 | 0.8038 |
| 0.2950 | 12.0 | 120 | 0.6928 | 0.7083 / 0.7923 / 0.7480 | 0.3307 / 0.3529 / 0.3415 | 0.7917 / 0.8282 / 0.8095 | 0.7293 | 0.7852 | 0.7562 | 0.8064 |
| 0.2780 | 13.0 | 130 | 0.7052 | 0.6988 / 0.7973 / 0.7448 | 0.3462 / 0.3782 / 0.3614 | 0.7986 / 0.8300 / 0.8140 | 0.7287 | 0.7898 | 0.7580 | 0.8048 |
| 0.2603 | 14.0 | 140 | 0.7044 | 0.7057 / 0.7973 / 0.7487 | 0.3492 / 0.3697 / 0.3592 | 0.7853 / 0.8310 / 0.8075 | 0.7263 | 0.7898 | 0.7567 | 0.8074 |
| 0.2580 | 15.0 | 150 | 0.7064 | 0.7080 / 0.7973 / 0.7500 | 0.3600 / 0.3782 / 0.3689 | 0.7895 / 0.8310 / 0.8097 | 0.7302 | 0.7903 | 0.7590 | 0.8069 |

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.5.1+cpu
  • Datasets 3.1.0
  • Tokenizers 0.20.3