---
license: apache-2.0
tags:
  - generated_from_trainer
datasets:
  - imagefolder
model-index:
  - name: fb-detr-aug-table_detection_v1.0
    results: []
---

# fb-detr-aug-table_detection_v1.0

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the imagefolder dataset. It achieves the following results on the evaluation set (an inference sketch follows the result below):

- Loss: 0.3284
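
For quick reference, here is a minimal inference sketch using the 🤗 Transformers object-detection API. The hub id `paturi1710/fb-detr-aug-table_detection_v1.0`, the image path, and the 0.7 confidence threshold are assumptions, not values stated in this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, DetrForObjectDetection

# Hypothetical hub id for this checkpoint -- substitute the actual repo.
checkpoint = "paturi1710/fb-detr-aug-table_detection_v1.0"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = DetrForObjectDetection.from_pretrained(checkpoint)
model.eval()

image = Image.open("document_page.png").convert("RGB")  # assumed input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Turn raw logits/boxes into thresholded (score, label, box) detections.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.7, target_sizes=target_sizes
)[0]
for score, label, box in zip(
    detections["scores"], detections["labels"], detections["boxes"]
):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```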

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
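
The metadata above names only the `imagefolder` builder. A minimal loading sketch with 🤗 Datasets is shown below; the directory path is an assumption, and for detection training the box annotations would typically come from an `imagefolder` metadata file, which this card does not describe.

```python
from datasets import load_dataset

# "imagefolder" builds a dataset from a directory of images; the path here is
# an assumption, not the author's actual layout.
dataset = load_dataset("imagefolder", data_dir="path/to/table_images")
print(dataset)  # inspect the generated splits and features
```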

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 300
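
A minimal sketch of how these values map onto 🤗 `TrainingArguments`; the output directory, dataset objects, and collator are assumptions and elided. The Adam betas and epsilon above match the `TrainingArguments` defaults, so they need not be set explicitly.

```python
from transformers import DetrForObjectDetection, Trainer, TrainingArguments

# Base checkpoint named in this card; label mapping and data wiring are assumptions.
model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50")

args = TrainingArguments(
    output_dir="fb-detr-aug-table_detection_v1.0",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 8 x 2 = total train batch size 16
    lr_scheduler_type="linear",
    num_train_epochs=300,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds,    # assumed dataset objects
#                   eval_dataset=eval_ds,
#                   data_collator=collate_fn)  # assumed DETR collator
# trainer.train()
```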

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.1127 | 1.21 | 20 | 1.6338 |
| 1.8818 | 2.42 | 40 | 1.0008 |
| 1.6752 | 3.64 | 60 | 1.4500 |
| 1.516 | 4.85 | 80 | 0.9846 |
| 1.328 | 6.06 | 100 | 1.0746 |
| 1.2713 | 7.27 | 120 | 1.1575 |
| 1.1762 | 8.48 | 140 | 0.7001 |
| 1.1547 | 9.7 | 160 | 1.0982 |
| 1.2178 | 10.91 | 180 | 1.2437 |
| 1.0999 | 12.12 | 200 | 0.9853 |
| 1.1452 | 13.33 | 220 | 0.8249 |
| 1.0528 | 14.55 | 240 | 0.7035 |
| 1.0157 | 15.76 | 260 | 0.7584 |
| 0.9898 | 16.97 | 280 | 0.7169 |
| 0.9011 | 18.18 | 300 | 0.9833 |
| 0.9248 | 19.39 | 320 | 0.5799 |
| 0.9295 | 20.61 | 340 | 0.7567 |
| 0.8687 | 21.82 | 360 | 0.8273 |
| 0.9934 | 23.03 | 380 | 0.7053 |
| 0.9039 | 24.24 | 400 | 0.7121 |
| 0.9244 | 25.45 | 420 | 0.7668 |
| 0.8525 | 26.67 | 440 | 0.8034 |
| 0.8996 | 27.88 | 460 | 0.7558 |
| 0.9486 | 29.09 | 480 | 0.6570 |
| 0.9838 | 30.3 | 500 | 0.6775 |
| 1.0131 | 31.52 | 520 | 0.6643 |
| 0.911 | 32.73 | 540 | 0.6673 |
| 0.9749 | 33.94 | 560 | 0.7285 |
| 0.9277 | 35.15 | 580 | 0.5660 |
| 0.885 | 36.36 | 600 | 0.6928 |
| 0.8128 | 37.58 | 620 | 0.6517 |
| 0.8082 | 38.79 | 640 | 0.6254 |
| 0.8702 | 40.0 | 660 | 0.7354 |
| 0.8563 | 41.21 | 680 | 0.6653 |
| 0.8147 | 42.42 | 700 | 0.7279 |
| 0.7741 | 43.64 | 720 | 0.8649 |
| 0.7128 | 44.85 | 740 | 0.6545 |
| 0.7806 | 46.06 | 760 | 0.6264 |
| 0.7497 | 47.27 | 780 | 0.6577 |
| 0.687 | 48.48 | 800 | 0.6218 |
| 0.761 | 49.7 | 820 | 0.8314 |
| 0.7987 | 50.91 | 840 | 0.6444 |
| 0.7357 | 52.12 | 860 | 0.6575 |
| 0.7023 | 53.33 | 880 | 0.5817 |
| 0.6802 | 54.55 | 900 | 0.6244 |
| 0.7285 | 55.76 | 920 | 0.5916 |
| 0.6959 | 56.97 | 940 | 0.5081 |
| 0.6638 | 58.18 | 960 | 0.5037 |
| 0.6957 | 59.39 | 980 | 0.5085 |
| 0.6571 | 60.61 | 1000 | 0.4837 |
| 0.6837 | 61.82 | 1020 | 0.6387 |
| 0.7012 | 63.03 | 1040 | 0.4773 |
| 0.7139 | 64.24 | 1060 | 0.5028 |
| 0.7234 | 65.45 | 1080 | 0.5678 |
| 0.7228 | 66.67 | 1100 | 0.6430 |
| 0.6973 | 67.88 | 1120 | 0.6091 |
| 0.7096 | 69.09 | 1140 | 0.4702 |
| 0.6688 | 70.3 | 1160 | 0.5281 |
| 0.6378 | 71.52 | 1180 | 0.5869 |
| 0.6533 | 72.73 | 1200 | 0.5513 |
| 0.5966 | 73.94 | 1220 | 0.5030 |
| 0.6459 | 75.15 | 1240 | 0.5056 |
| 0.6496 | 76.36 | 1260 | 0.5982 |
| 0.7562 | 77.58 | 1280 | 0.4316 |
| 0.6744 | 78.79 | 1300 | 0.5127 |
| 0.725 | 80.0 | 1320 | 0.4750 |
| 0.6317 | 81.21 | 1340 | 0.5916 |
| 0.6138 | 82.42 | 1360 | 0.5602 |
| 0.5979 | 83.64 | 1380 | 0.5578 |
| 0.6455 | 84.85 | 1400 | 0.5035 |
| 0.6428 | 86.06 | 1420 | 0.4647 |
| 0.6101 | 87.27 | 1440 | 0.5262 |
| 0.6003 | 88.48 | 1460 | 0.4931 |
| 0.6019 | 89.7 | 1480 | 0.4655 |
| 0.609 | 90.91 | 1500 | 0.5081 |
| 0.6059 | 92.12 | 1520 | 0.4959 |
| 0.5952 | 93.33 | 1540 | 0.4069 |
| 0.6115 | 94.55 | 1560 | 0.5783 |
| 0.6277 | 95.76 | 1580 | 0.5889 |
| 0.6392 | 96.97 | 1600 | 0.5349 |
| 0.6003 | 98.18 | 1620 | 0.4729 |
| 0.6195 | 99.39 | 1640 | 0.4943 |
| 0.6209 | 100.61 | 1660 | 0.5134 |
| 0.6042 | 101.82 | 1680 | 0.5111 |
| 0.5964 | 103.03 | 1700 | 0.4301 |
| 0.5716 | 104.24 | 1720 | 0.4129 |
| 0.5466 | 105.45 | 1740 | 0.5458 |
| 0.5679 | 106.67 | 1760 | 0.5224 |
| 0.5754 | 107.88 | 1780 | 0.4612 |
| 0.543 | 109.09 | 1800 | 0.4411 |
| 0.5434 | 110.3 | 1820 | 0.3614 |
| 0.5682 | 111.52 | 1840 | 0.4925 |
| 0.6027 | 112.73 | 1860 | 0.4388 |
| 0.5683 | 113.94 | 1880 | 0.4456 |
| 0.5566 | 115.15 | 1900 | 0.4899 |
| 0.5738 | 116.36 | 1920 | 0.4500 |
| 0.5494 | 117.58 | 1940 | 0.4949 |
| 0.5848 | 118.79 | 1960 | 0.4078 |
| 0.6483 | 120.0 | 1980 | 0.4234 |
| 0.5738 | 121.21 | 2000 | 0.6240 |
| 0.5656 | 122.42 | 2020 | 0.6076 |
| 0.52 | 123.64 | 2040 | 0.4267 |
| 0.5692 | 124.85 | 2060 | 0.4629 |
| 0.5728 | 126.06 | 2080 | 0.4723 |
| 0.6444 | 127.27 | 2100 | 0.4098 |
| 0.565 | 128.48 | 2120 | 0.4331 |
| 0.5484 | 129.7 | 2140 | 0.4324 |
| 0.5164 | 130.91 | 2160 | 0.4289 |
| 0.5354 | 132.12 | 2180 | 0.3927 |
| 0.5332 | 133.33 | 2200 | 0.3951 |
| 0.4956 | 134.55 | 2220 | 0.4877 |
| 0.5107 | 135.76 | 2240 | 0.5421 |
| 0.5192 | 136.97 | 2260 | 0.4340 |
| 0.4702 | 138.18 | 2280 | 0.5052 |
| 0.4863 | 139.39 | 2300 | 0.4147 |
| 0.4977 | 140.61 | 2320 | 0.4434 |
| 0.5222 | 141.82 | 2340 | 0.4550 |
| 0.5292 | 143.03 | 2360 | 0.4839 |
| 0.5376 | 144.24 | 2380 | 0.3728 |
| 0.4915 | 145.45 | 2400 | 0.4733 |
| 0.4641 | 146.67 | 2420 | 0.3470 |
| 0.5144 | 147.88 | 2440 | 0.3606 |
| 0.4891 | 149.09 | 2460 | 0.4212 |
| 0.4758 | 150.3 | 2480 | 0.6014 |
| 0.4901 | 151.52 | 2500 | 0.3525 |
| 0.4809 | 152.73 | 2520 | 0.4205 |
| 0.486 | 153.94 | 2540 | 0.3663 |
| 0.4943 | 155.15 | 2560 | 0.5401 |
| 0.4857 | 156.36 | 2580 | 0.4914 |
| 0.4898 | 157.58 | 2600 | 0.4820 |
| 0.4783 | 158.79 | 2620 | 0.4178 |
| 0.4941 | 160.0 | 2640 | 0.4133 |
| 0.4607 | 161.21 | 2660 | 0.3855 |
| 0.4797 | 162.42 | 2680 | 0.3911 |
| 0.4874 | 163.64 | 2700 | 0.3821 |
| 0.4799 | 164.85 | 2720 | 0.4532 |
| 0.4683 | 166.06 | 2740 | 0.4442 |
| 0.4843 | 167.27 | 2760 | 0.3532 |
| 0.4781 | 168.48 | 2780 | 0.5200 |
| 0.4561 | 169.7 | 2800 | 0.4211 |
| 0.4745 | 170.91 | 2820 | 0.4610 |
| 0.4872 | 172.12 | 2840 | 0.3453 |
| 0.4299 | 173.33 | 2860 | 0.4454 |
| 0.4609 | 174.55 | 2880 | 0.3775 |
| 0.4318 | 175.76 | 2900 | 0.4044 |
| 0.4429 | 176.97 | 2920 | 0.5326 |
| 0.4521 | 178.18 | 2940 | 0.3521 |
| 0.46 | 179.39 | 2960 | 0.4162 |
| 0.4858 | 180.61 | 2980 | 0.4760 |
| 0.4483 | 181.82 | 3000 | 0.3208 |
| 0.4553 | 183.03 | 3020 | 0.3736 |
| 0.4497 | 184.24 | 3040 | 0.3852 |
| 0.4487 | 185.45 | 3060 | 0.4270 |
| 0.4646 | 186.67 | 3080 | 0.4376 |
| 0.4538 | 187.88 | 3100 | 0.4299 |
| 0.4915 | 189.09 | 3120 | 0.2842 |
| 0.4194 | 190.3 | 3140 | 0.4162 |
| 0.4571 | 191.52 | 3160 | 0.4434 |
| 0.4228 | 192.73 | 3180 | 0.6554 |
| 0.4345 | 193.94 | 3200 | 0.2984 |
| 0.4424 | 195.15 | 3220 | 0.3035 |
| 0.4259 | 196.36 | 3240 | 0.4230 |
| 0.4161 | 197.58 | 3260 | 0.2558 |
| 0.405 | 198.79 | 3280 | 0.3711 |
| 0.4385 | 200.0 | 3300 | 0.2988 |
| 0.4034 | 201.21 | 3320 | 0.4759 |
| 0.4203 | 202.42 | 3340 | 0.3641 |
| 0.4559 | 203.64 | 3360 | 0.3186 |
| 0.4457 | 204.85 | 3380 | 0.3593 |
| 0.4072 | 206.06 | 3400 | 0.3301 |
| 0.4254 | 207.27 | 3420 | 0.2779 |
| 0.4153 | 208.48 | 3440 | 0.3963 |
| 0.4259 | 209.7 | 3460 | 0.3817 |
| 0.4273 | 210.91 | 3480 | 0.3069 |
| 0.3945 | 212.12 | 3500 | 0.3477 |
| 0.3849 | 213.33 | 3520 | 0.3495 |
| 0.3944 | 214.55 | 3540 | 0.4825 |
| 0.3881 | 215.76 | 3560 | 0.3790 |
| 0.3856 | 216.97 | 3580 | 0.2898 |
| 0.4108 | 218.18 | 3600 | 0.3521 |
| 0.4194 | 219.39 | 3620 | 0.2938 |
| 0.3683 | 220.61 | 3640 | 0.2290 |
| 0.4111 | 221.82 | 3660 | 0.3704 |
| 0.4078 | 223.03 | 3680 | 0.3231 |
| 0.3852 | 224.24 | 3700 | 0.2568 |
| 0.407 | 225.45 | 3720 | 0.4309 |
| 0.3753 | 226.67 | 3740 | 0.3829 |
| 0.3963 | 227.88 | 3760 | 0.3988 |
| 0.3683 | 229.09 | 3780 | 0.3014 |
| 0.3786 | 230.3 | 3800 | 0.2988 |
| 0.3705 | 231.52 | 3820 | 0.3167 |
| 0.3822 | 232.73 | 3840 | 0.3800 |
| 0.3496 | 233.94 | 3860 | 0.3660 |
| 0.407 | 235.15 | 3880 | 0.3476 |
| 0.3938 | 236.36 | 3900 | 0.3337 |
| 0.3526 | 237.58 | 3920 | 0.3130 |
| 0.3815 | 238.79 | 3940 | 0.2702 |
| 0.3677 | 240.0 | 3960 | 0.3134 |
| 0.4319 | 241.21 | 3980 | 0.3871 |
| 0.401 | 242.42 | 4000 | 0.4471 |
| 0.3538 | 243.64 | 4020 | 0.3134 |
| 0.3605 | 244.85 | 4040 | 0.2553 |
| 0.3585 | 246.06 | 4060 | 0.2506 |
| 0.3879 | 247.27 | 4080 | 0.3194 |
| 0.3638 | 248.48 | 4100 | 0.4381 |
| 0.3649 | 249.7 | 4120 | 0.3818 |
| 0.3529 | 250.91 | 4140 | 0.2432 |
| 0.3841 | 252.12 | 4160 | 0.2769 |
| 0.3755 | 253.33 | 4180 | 0.3376 |
| 0.3504 | 254.55 | 4200 | 0.2689 |
| 0.3653 | 255.76 | 4220 | 0.2874 |
| 0.3614 | 256.97 | 4240 | 0.4095 |
| 0.3909 | 258.18 | 4260 | 0.2556 |
| 0.3547 | 259.39 | 4280 | 0.4043 |
| 0.3613 | 260.61 | 4300 | 0.2781 |
| 0.3268 | 261.82 | 4320 | 0.2558 |
| 0.367 | 263.03 | 4340 | 0.3386 |
| 0.3317 | 264.24 | 4360 | 0.2605 |
| 0.3733 | 265.45 | 4380 | 0.2535 |
| 0.3878 | 266.67 | 4400 | 0.2325 |
| 0.3596 | 267.88 | 4420 | 0.2849 |
| 0.3482 | 269.09 | 4440 | 0.2811 |
| 0.3609 | 270.3 | 4460 | 0.3282 |
| 0.373 | 271.52 | 4480 | 0.4058 |
| 0.3792 | 272.73 | 4500 | 0.2404 |
| 0.3563 | 273.94 | 4520 | 0.3351 |
| 0.3215 | 275.15 | 4540 | 0.4536 |
| 0.3389 | 276.36 | 4560 | 0.4224 |
| 0.354 | 277.58 | 4580 | 0.3298 |
| 0.3616 | 278.79 | 4600 | 0.3443 |
| 0.3629 | 280.0 | 4620 | 0.3889 |
| 0.3443 | 281.21 | 4640 | 0.3653 |
| 0.3407 | 282.42 | 4660 | 0.2257 |
| 0.3178 | 283.64 | 4680 | 0.3924 |
| 0.3364 | 284.85 | 4700 | 0.3184 |
| 0.3356 | 286.06 | 4720 | 0.3177 |
| 0.3711 | 287.27 | 4740 | 0.3729 |
| 0.3422 | 288.48 | 4760 | 0.2495 |
| 0.3375 | 289.7 | 4780 | 0.2142 |
| 0.3271 | 290.91 | 4800 | 0.3284 |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.0
- Tokenizers 0.11.0