3una committed on
Commit 582d532
1 Parent(s): c1bfd15

Model save

README.md CHANGED
@@ -22,7 +22,7 @@ model-index:
  metrics:
  - name: Accuracy
  type: accuracy
- value: 0.7011494252873564
  ---
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -32,8 +32,8 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the imagefolder dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.8313
- - Accuracy: 0.7011
 
  ## Model description
 
@@ -61,52 +61,112 @@ The following hyperparameters were used during training:
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_ratio: 0.1
- - num_epochs: 40
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|
- | 1.7483 | 1.0 | 202 | 1.7005 | 0.3386 |
- | 1.4419 | 2.0 | 404 | 1.3213 | 0.5315 |
- | 1.2917 | 3.0 | 606 | 1.1559 | 0.5785 |
- | 1.2437 | 4.0 | 808 | 1.0729 | 0.6162 |
- | 1.1635 | 5.0 | 1010 | 1.0161 | 0.6311 |
- | 1.1087 | 6.0 | 1212 | 0.9862 | 0.6465 |
- | 1.0964 | 7.0 | 1414 | 0.9901 | 0.6440 |
- | 1.0895 | 8.0 | 1616 | 0.9410 | 0.6555 |
- | 1.0384 | 9.0 | 1818 | 0.9221 | 0.6628 |
- | 1.0333 | 10.0 | 2020 | 0.9142 | 0.6681 |
- | 1.0016 | 11.0 | 2222 | 0.9081 | 0.6681 |
- | 0.9503 | 12.0 | 2424 | 0.9013 | 0.6712 |
- | 0.9804 | 13.0 | 2626 | 0.8937 | 0.6771 |
- | 0.9712 | 14.0 | 2828 | 0.8809 | 0.6830 |
- | 1.0151 | 15.0 | 3030 | 0.8704 | 0.6855 |
- | 0.9739 | 16.0 | 3232 | 0.8886 | 0.6775 |
- | 0.9267 | 17.0 | 3434 | 0.8653 | 0.6855 |
- | 0.9428 | 18.0 | 3636 | 0.8633 | 0.6848 |
- | 0.9654 | 19.0 | 3838 | 0.8697 | 0.6809 |
- | 0.9256 | 20.0 | 4040 | 0.8559 | 0.6855 |
- | 0.9345 | 21.0 | 4242 | 0.8533 | 0.6883 |
- | 0.9479 | 22.0 | 4444 | 0.8548 | 0.6907 |
- | 0.8829 | 23.0 | 4646 | 0.8461 | 0.6851 |
- | 0.8999 | 24.0 | 4848 | 0.8399 | 0.6883 |
- | 0.9047 | 25.0 | 5050 | 0.8403 | 0.6973 |
- | 0.9415 | 26.0 | 5252 | 0.8437 | 0.6952 |
- | 0.937 | 27.0 | 5454 | 0.8393 | 0.6931 |
- | 0.8692 | 28.0 | 5656 | 0.8331 | 0.6977 |
- | 0.9396 | 29.0 | 5858 | 0.8418 | 0.6973 |
- | 0.8712 | 30.0 | 6060 | 0.8392 | 0.6921 |
- | 0.9426 | 31.0 | 6262 | 0.8324 | 0.7011 |
- | 0.884 | 32.0 | 6464 | 0.8325 | 0.6959 |
- | 0.8433 | 33.0 | 6666 | 0.8300 | 0.6987 |
- | 0.8869 | 34.0 | 6868 | 0.8328 | 0.6963 |
- | 0.89 | 35.0 | 7070 | 0.8324 | 0.6973 |
- | 0.8639 | 36.0 | 7272 | 0.8317 | 0.6956 |
- | 0.8844 | 37.0 | 7474 | 0.8315 | 0.6970 |
- | 0.8621 | 38.0 | 7676 | 0.8334 | 0.6991 |
- | 0.8942 | 39.0 | 7878 | 0.8350 | 0.6998 |
- | 0.8609 | 40.0 | 8080 | 0.8313 | 0.7011 |
 
 
  ### Framework versions
 
  metrics:
  - name: Accuracy
  type: accuracy
+ value: 0.7081156391501219
  ---
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 
 
  This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the imagefolder dataset.
  It achieves the following results on the evaluation set:
+ - Loss: 0.8366
+ - Accuracy: 0.7081
 
  ## Model description
 
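As an aside on consuming a checkpoint like the one described above: a minimal inference sketch using the standard transformers image-classification API. The repository id is a placeholder (this commit page does not name the target repo), and `example.jpg` stands in for any input image.

```python
# Sketch only: load a fine-tuned BEiT classifier and run one prediction.
# "<namespace>/<repo>" is a placeholder for the actual fine-tuned checkpoint.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "<namespace>/<repo>"  # placeholder, not named on this page
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")  # any RGB image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])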
 
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_ratio: 0.1
+ - num_epochs: 100
 
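For orientation, a hedged sketch of how the hyperparameters listed above might map onto `TrainingArguments`. The learning rate and batch sizes are not visible in this hunk, so those values are placeholders rather than the ones actually used.

```python
# Sketch only: approximate TrainingArguments matching the card's hyperparameters.
# learning_rate and batch sizes are placeholders; they are not shown in this diff.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="beit-finetuned",          # hypothetical output directory
    learning_rate=5e-5,                   # placeholder
    per_device_train_batch_size=8,        # placeholder
    num_train_epochs=100,                 # was 40 in the previous revision
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",          # consistent with the per-epoch table below
)
```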
  ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
+ |:-------------:|:-----:|:-----:|:---------------:|:--------:|
+ | 1.8119 | 1.0 | 202 | 1.7993 | 0.3079 |
+ | 1.6155 | 2.0 | 404 | 1.5446 | 0.4302 |
+ | 1.4279 | 3.0 | 606 | 1.3084 | 0.5301 |
+ | 1.3222 | 4.0 | 808 | 1.1817 | 0.5590 |
+ | 1.2532 | 5.0 | 1010 | 1.1026 | 0.5789 |
+ | 1.2019 | 6.0 | 1212 | 1.0432 | 0.5998 |
+ | 1.2037 | 7.0 | 1414 | 1.0030 | 0.6137 |
+ | 1.1757 | 8.0 | 1616 | 0.9873 | 0.6235 |
+ | 1.1359 | 9.0 | 1818 | 0.9377 | 0.6423 |
+ | 1.1282 | 10.0 | 2020 | 0.9231 | 0.6486 |
+ | 1.1019 | 11.0 | 2222 | 0.9011 | 0.6562 |
+ | 1.0494 | 12.0 | 2424 | 0.8968 | 0.6545 |
+ | 0.9951 | 13.0 | 2626 | 0.8876 | 0.6607 |
+ | 1.0121 | 14.0 | 2828 | 0.8720 | 0.6695 |
+ | 1.0571 | 15.0 | 3030 | 0.8776 | 0.6691 |
+ | 1.0049 | 16.0 | 3232 | 0.8627 | 0.6733 |
+ | 0.988 | 17.0 | 3434 | 0.8639 | 0.6719 |
+ | 0.9955 | 18.0 | 3636 | 0.8397 | 0.6806 |
+ | 0.9381 | 19.0 | 3838 | 0.8430 | 0.6820 |
+ | 0.9911 | 20.0 | 4040 | 0.8370 | 0.6837 |
+ | 0.9305 | 21.0 | 4242 | 0.8373 | 0.6837 |
+ | 0.9653 | 22.0 | 4444 | 0.8283 | 0.6883 |
+ | 0.9134 | 23.0 | 4646 | 0.8289 | 0.6879 |
+ | 0.9098 | 24.0 | 4848 | 0.8365 | 0.6837 |
+ | 0.8761 | 25.0 | 5050 | 0.8190 | 0.6869 |
+ | 0.9067 | 26.0 | 5252 | 0.8303 | 0.6876 |
+ | 0.8765 | 27.0 | 5454 | 0.8188 | 0.6942 |
+ | 0.8486 | 28.0 | 5656 | 0.8142 | 0.6959 |
+ | 0.9357 | 29.0 | 5858 | 0.8114 | 0.6984 |
+ | 0.9037 | 30.0 | 6060 | 0.8150 | 0.6917 |
+ | 0.8758 | 31.0 | 6262 | 0.8165 | 0.6931 |
+ | 0.8688 | 32.0 | 6464 | 0.8061 | 0.6994 |
+ | 0.8736 | 33.0 | 6666 | 0.8056 | 0.6994 |
+ | 0.8785 | 34.0 | 6868 | 0.8045 | 0.6991 |
+ | 0.8292 | 35.0 | 7070 | 0.8095 | 0.6987 |
+ | 0.8407 | 36.0 | 7272 | 0.8096 | 0.6956 |
+ | 0.8609 | 37.0 | 7474 | 0.8137 | 0.6984 |
+ | 0.9055 | 38.0 | 7676 | 0.8054 | 0.7018 |
+ | 0.8355 | 39.0 | 7878 | 0.8080 | 0.6980 |
+ | 0.8391 | 40.0 | 8080 | 0.8087 | 0.6966 |
+ | 0.7987 | 41.0 | 8282 | 0.8041 | 0.6998 |
+ | 0.818 | 42.0 | 8484 | 0.8070 | 0.7039 |
+ | 0.7836 | 43.0 | 8686 | 0.8091 | 0.7025 |
+ | 0.8348 | 44.0 | 8888 | 0.8047 | 0.7025 |
+ | 0.8205 | 45.0 | 9090 | 0.8076 | 0.7025 |
+ | 0.8023 | 46.0 | 9292 | 0.8056 | 0.7053 |
+ | 0.8241 | 47.0 | 9494 | 0.8022 | 0.7039 |
+ | 0.763 | 48.0 | 9696 | 0.8079 | 0.6994 |
+ | 0.7422 | 49.0 | 9898 | 0.8062 | 0.7039 |
+ | 0.7762 | 50.0 | 10100 | 0.8090 | 0.6998 |
+ | 0.7786 | 51.0 | 10302 | 0.8122 | 0.6994 |
+ | 0.8027 | 52.0 | 10504 | 0.8129 | 0.7043 |
+ | 0.7966 | 53.0 | 10706 | 0.8094 | 0.7039 |
+ | 0.8103 | 54.0 | 10908 | 0.8107 | 0.7039 |
+ | 0.7827 | 55.0 | 11110 | 0.8126 | 0.7057 |
+ | 0.7949 | 56.0 | 11312 | 0.8104 | 0.7119 |
+ | 0.7511 | 57.0 | 11514 | 0.8122 | 0.7050 |
+ | 0.7727 | 58.0 | 11716 | 0.8123 | 0.7078 |
+ | 0.7723 | 59.0 | 11918 | 0.8194 | 0.7015 |
+ | 0.7796 | 60.0 | 12120 | 0.8193 | 0.7053 |
+ | 0.7768 | 61.0 | 12322 | 0.8159 | 0.7029 |
+ | 0.7604 | 62.0 | 12524 | 0.8081 | 0.7085 |
+ | 0.7784 | 63.0 | 12726 | 0.8169 | 0.7106 |
+ | 0.7235 | 64.0 | 12928 | 0.8131 | 0.7015 |
+ | 0.7384 | 65.0 | 13130 | 0.8149 | 0.7085 |
+ | 0.6638 | 66.0 | 13332 | 0.8192 | 0.7078 |
+ | 0.6998 | 67.0 | 13534 | 0.8243 | 0.7113 |
+ | 0.7249 | 68.0 | 13736 | 0.8200 | 0.7015 |
+ | 0.6809 | 69.0 | 13938 | 0.8140 | 0.7081 |
+ | 0.701 | 70.0 | 14140 | 0.8177 | 0.7095 |
+ | 0.7122 | 71.0 | 14342 | 0.8245 | 0.7053 |
+ | 0.7269 | 72.0 | 14544 | 0.8245 | 0.7050 |
+ | 0.6973 | 73.0 | 14746 | 0.8207 | 0.7095 |
+ | 0.7241 | 74.0 | 14948 | 0.8210 | 0.7057 |
+ | 0.7397 | 75.0 | 15150 | 0.8230 | 0.7060 |
+ | 0.6832 | 76.0 | 15352 | 0.8308 | 0.7057 |
+ | 0.7213 | 77.0 | 15554 | 0.8256 | 0.7025 |
+ | 0.7115 | 78.0 | 15756 | 0.8291 | 0.7057 |
+ | 0.688 | 79.0 | 15958 | 0.8337 | 0.7088 |
+ | 0.6997 | 80.0 | 16160 | 0.8312 | 0.7060 |
+ | 0.6924 | 81.0 | 16362 | 0.8321 | 0.7053 |
+ | 0.7382 | 82.0 | 16564 | 0.8340 | 0.7050 |
+ | 0.7513 | 83.0 | 16766 | 0.8320 | 0.7015 |
+ | 0.656 | 84.0 | 16968 | 0.8389 | 0.7053 |
+ | 0.6503 | 85.0 | 17170 | 0.8321 | 0.7085 |
+ | 0.6661 | 86.0 | 17372 | 0.8355 | 0.7092 |
+ | 0.7026 | 87.0 | 17574 | 0.8339 | 0.7088 |
+ | 0.76 | 88.0 | 17776 | 0.8361 | 0.7092 |
+ | 0.696 | 89.0 | 17978 | 0.8343 | 0.7106 |
+ | 0.6713 | 90.0 | 18180 | 0.8337 | 0.7106 |
+ | 0.6621 | 91.0 | 18382 | 0.8349 | 0.7057 |
+ | 0.7042 | 92.0 | 18584 | 0.8360 | 0.7085 |
+ | 0.7087 | 93.0 | 18786 | 0.8353 | 0.7085 |
+ | 0.64 | 94.0 | 18988 | 0.8371 | 0.7088 |
+ | 0.659 | 95.0 | 19190 | 0.8376 | 0.7071 |
+ | 0.6246 | 96.0 | 19392 | 0.8376 | 0.7088 |
+ | 0.6797 | 97.0 | 19594 | 0.8368 | 0.7092 |
+ | 0.6652 | 98.0 | 19796 | 0.8376 | 0.7092 |
+ | 0.629 | 99.0 | 19998 | 0.8370 | 0.7088 |
+ | 0.6762 | 100.0 | 20200 | 0.8366 | 0.7081 |
 
 
  ### Framework versions
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:8ef20d7b8b425023ddee5a9784a1728301986054062613d1bd9a61b0f4db9ebf
  size 343095708
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:48f476b1c60411d917ef6e42bfda7c2c2a25680b3a93da26a4259f7955058ae7
  size 343095708
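The two blocks above are Git LFS pointer files: the weights themselves are addressed by the `oid sha256` and `size` fields. If useful, a small sketch for checking a downloaded `model.safetensors` against the new pointer; the local path is an assumption.

```python
# Sketch: verify a local model.safetensors against the LFS pointer's oid/size.
import hashlib
from pathlib import Path

path = Path("model.safetensors")  # assumed local download location
expected_oid = "48f476b1c60411d917ef6e42bfda7c2c2a25680b3a93da26a4259f7955058ae7"
expected_size = 343095708

sha = hashlib.sha256()
with path.open("rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha.update(chunk)

assert path.stat().st_size == expected_size, "size mismatch"
assert sha.hexdigest() == expected_oid, "sha256 mismatch"
print("file matches the LFS pointer")
```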
runs/Jan05_01-53-19_2bac6663ad64/events.out.tfevents.1704419624.2bac6663ad64.2551.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:a6550116901433b00aa56962522b21caacbe8f46a570d7ba5bc51b225652646c
- size 348400
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:7ca0748262730a11f48508a8881b9dbf9713a264c0e2d017ee50b3233e2a472e
+ size 355978
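The updated TensorBoard event file records the curves summarized in the README table. A minimal sketch for reading scalars back out of it with the `tensorboard` package; the local log directory and the `eval/accuracy` tag name are assumptions (the Trainer's TensorBoard callback typically logs under that tag).

```python
# Sketch: list and read scalar tags from the run's tfevents log.
# The local path and the "eval/accuracy" tag name are assumptions.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

ea = EventAccumulator("runs/Jan05_01-53-19_2bac6663ad64")  # directory containing the event file
ea.Reload()
print(ea.Tags()["scalars"])              # see which scalar tags were actually logged
for event in ea.Scalars("eval/accuracy"):
    print(event.step, event.value)
```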