davanstrien committed
Commit a079407
1 Parent(s): dc563b1

Model save

README.md CHANGED
@@ -3,8 +3,6 @@ library_name: transformers
 license: apache-2.0
 base_model: timm/mobilenetv3_large_100.miil_in21k
 tags:
-- image-classification
-- vision
 - generated_from_trainer
 metrics:
 - accuracy
@@ -18,10 +16,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 # test-timm
 
-This model is a fine-tuned version of [timm/mobilenetv3_large_100.miil_in21k](https://huggingface.co/timm/mobilenetv3_large_100.miil_in21k) on the davanstrien/zenodo-presentations-open-labels dataset.
+This model is a fine-tuned version of [timm/mobilenetv3_large_100.miil_in21k](https://huggingface.co/timm/mobilenetv3_large_100.miil_in21k) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.4904
-- Accuracy: 0.7874
+- Loss: 0.5182
+- Accuracy: 0.7992
 
 ## Model description
 
@@ -41,67 +39,217 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size: 64
-- eval_batch_size: 64
+- train_batch_size: 128
+- eval_batch_size: 128
 - seed: 1337
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
-- num_epochs: 50.0
+- num_epochs: 200.0
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|
 | 0.6794 | 1.0 | 23 | 0.6560 | 0.6063 |
 | 0.6215 | 2.0 | 46 | 0.5833 | 0.7362 |
 | 0.5784 | 3.0 | 69 | 0.5490 | 0.7598 |
 | 0.5347 | 4.0 | 92 | 0.5306 | 0.7638 |
 | 0.5307 | 5.0 | 115 | 0.5235 | 0.7638 |
 | 0.5391 | 6.0 | 138 | 0.5090 | 0.7677 |
 | 0.48 | 7.0 | 161 | 0.5108 | 0.7717 |
 | 0.473 | 8.0 | 184 | 0.5028 | 0.7756 |
 | 0.5014 | 9.0 | 207 | 0.5054 | 0.7717 |
 | 0.496 | 10.0 | 230 | 0.5040 | 0.7717 |
 | 0.4688 | 11.0 | 253 | 0.4972 | 0.7677 |
 | 0.4943 | 12.0 | 276 | 0.4977 | 0.7638 |
 | 0.5012 | 13.0 | 299 | 0.5057 | 0.7717 |
 | 0.4639 | 14.0 | 322 | 0.5010 | 0.7717 |
 | 0.4709 | 15.0 | 345 | 0.4949 | 0.7795 |
 | 0.4888 | 16.0 | 368 | 0.4955 | 0.7835 |
 | 0.4594 | 17.0 | 391 | 0.4986 | 0.7717 |
 | 0.4745 | 18.0 | 414 | 0.5011 | 0.7677 |
 | 0.4667 | 19.0 | 437 | 0.4928 | 0.7756 |
 | 0.4551 | 20.0 | 460 | 0.5055 | 0.7795 |
 | 0.4657 | 21.0 | 483 | 0.4928 | 0.7756 |
 | 0.4818 | 22.0 | 506 | 0.5002 | 0.7756 |
 | 0.4633 | 23.0 | 529 | 0.4946 | 0.7835 |
 | 0.4779 | 24.0 | 552 | 0.4942 | 0.7795 |
 | 0.4718 | 25.0 | 575 | 0.4963 | 0.7835 |
 | 0.4511 | 26.0 | 598 | 0.5011 | 0.7717 |
 | 0.4798 | 27.0 | 621 | 0.4904 | 0.7874 |
 | 0.4868 | 28.0 | 644 | 0.4982 | 0.7835 |
 | 0.4653 | 29.0 | 667 | 0.4988 | 0.7874 |
 | 0.4613 | 30.0 | 690 | 0.4985 | 0.7795 |
 | 0.4675 | 31.0 | 713 | 0.5060 | 0.7717 |
 | 0.4587 | 32.0 | 736 | 0.5059 | 0.7717 |
 | 0.464 | 33.0 | 759 | 0.5042 | 0.7795 |
 | 0.4374 | 34.0 | 782 | 0.5063 | 0.7677 |
 | 0.4864 | 35.0 | 805 | 0.5040 | 0.7677 |
 | 0.4354 | 36.0 | 828 | 0.5109 | 0.7717 |
 | 0.4655 | 37.0 | 851 | 0.5107 | 0.7717 |
 | 0.4691 | 38.0 | 874 | 0.5093 | 0.7677 |
 | 0.4826 | 39.0 | 897 | 0.5044 | 0.7717 |
 | 0.4577 | 40.0 | 920 | 0.5000 | 0.7795 |
 | 0.4636 | 41.0 | 943 | 0.4963 | 0.7717 |
 | 0.4361 | 42.0 | 966 | 0.4958 | 0.7717 |
 | 0.4534 | 43.0 | 989 | 0.5008 | 0.7795 |
 | 0.4559 | 44.0 | 1012 | 0.5025 | 0.7795 |
 | 0.4189 | 45.0 | 1035 | 0.5014 | 0.7756 |
 | 0.4861 | 46.0 | 1058 | 0.5004 | 0.7677 |
 | 0.4709 | 47.0 | 1081 | 0.5005 | 0.7795 |
 | 0.4726 | 48.0 | 1104 | 0.5008 | 0.7717 |
 | 0.4441 | 49.0 | 1127 | 0.4988 | 0.7756 |
 | 0.4579 | 50.0 | 1150 | 0.5000 | 0.7756 |
+| 0.4366 | 51.0 | 1173 | 0.4980 | 0.7756 |
+| 0.4467 | 52.0 | 1196 | 0.4947 | 0.7795 |
+| 0.4797 | 53.0 | 1219 | 0.4950 | 0.7756 |
+| 0.4544 | 54.0 | 1242 | 0.4998 | 0.7717 |
+| 0.4466 | 55.0 | 1265 | 0.4980 | 0.7795 |
+| 0.4599 | 56.0 | 1288 | 0.4963 | 0.7835 |
+| 0.4458 | 57.0 | 1311 | 0.4956 | 0.7874 |
+| 0.4296 | 58.0 | 1334 | 0.4994 | 0.7874 |
+| 0.4415 | 59.0 | 1357 | 0.4998 | 0.7835 |
+| 0.4036 | 60.0 | 1380 | 0.4996 | 0.7795 |
+| 0.4406 | 61.0 | 1403 | 0.5022 | 0.7913 |
+| 0.4235 | 62.0 | 1426 | 0.5018 | 0.7913 |
+| 0.4492 | 63.0 | 1449 | 0.4964 | 0.8031 |
+| 0.4065 | 64.0 | 1472 | 0.4953 | 0.7874 |
+| 0.4474 | 65.0 | 1495 | 0.4897 | 0.7913 |
+| 0.4605 | 66.0 | 1518 | 0.5039 | 0.7795 |
+| 0.436 | 67.0 | 1541 | 0.5024 | 0.7756 |
+| 0.4746 | 68.0 | 1564 | 0.5007 | 0.7874 |
+| 0.4555 | 69.0 | 1587 | 0.5054 | 0.7874 |
+| 0.433 | 70.0 | 1610 | 0.4974 | 0.7874 |
+| 0.4503 | 71.0 | 1633 | 0.5096 | 0.7795 |
+| 0.4424 | 72.0 | 1656 | 0.5040 | 0.7756 |
+| 0.4331 | 73.0 | 1679 | 0.5056 | 0.7913 |
+| 0.4263 | 74.0 | 1702 | 0.5026 | 0.7874 |
+| 0.4305 | 75.0 | 1725 | 0.5033 | 0.7835 |
+| 0.4271 | 76.0 | 1748 | 0.5015 | 0.7874 |
+| 0.4635 | 77.0 | 1771 | 0.4988 | 0.7913 |
+| 0.4212 | 78.0 | 1794 | 0.4994 | 0.7913 |
+| 0.4154 | 79.0 | 1817 | 0.5044 | 0.7874 |
+| 0.4288 | 80.0 | 1840 | 0.5033 | 0.7913 |
+| 0.4211 | 81.0 | 1863 | 0.5050 | 0.7835 |
+| 0.4022 | 82.0 | 1886 | 0.5021 | 0.7835 |
+| 0.4477 | 83.0 | 1909 | 0.5096 | 0.7756 |
+| 0.4091 | 84.0 | 1932 | 0.5017 | 0.7913 |
+| 0.4284 | 85.0 | 1955 | 0.5094 | 0.7795 |
+| 0.4317 | 86.0 | 1978 | 0.5056 | 0.7874 |
+| 0.4011 | 87.0 | 2001 | 0.4992 | 0.7953 |
+| 0.4043 | 88.0 | 2024 | 0.5106 | 0.7874 |
+| 0.4233 | 89.0 | 2047 | 0.5083 | 0.7835 |
+| 0.4383 | 90.0 | 2070 | 0.5016 | 0.7913 |
+| 0.4328 | 91.0 | 2093 | 0.5062 | 0.7874 |
+| 0.3978 | 92.0 | 2116 | 0.5026 | 0.7874 |
+| 0.4052 | 93.0 | 2139 | 0.4964 | 0.7913 |
+| 0.3938 | 94.0 | 2162 | 0.5036 | 0.7874 |
+| 0.393 | 95.0 | 2185 | 0.5102 | 0.7835 |
+| 0.4294 | 96.0 | 2208 | 0.5003 | 0.7874 |
+| 0.4122 | 97.0 | 2231 | 0.5013 | 0.7913 |
+| 0.4207 | 98.0 | 2254 | 0.5076 | 0.7874 |
+| 0.4127 | 99.0 | 2277 | 0.5040 | 0.7835 |
+| 0.441 | 100.0 | 2300 | 0.5022 | 0.7835 |
+| 0.3938 | 101.0 | 2323 | 0.4975 | 0.7992 |
+| 0.4109 | 102.0 | 2346 | 0.5019 | 0.7913 |
+| 0.4299 | 103.0 | 2369 | 0.5060 | 0.7874 |
+| 0.4148 | 104.0 | 2392 | 0.5038 | 0.7874 |
+| 0.4179 | 105.0 | 2415 | 0.5064 | 0.7835 |
+| 0.4352 | 106.0 | 2438 | 0.5059 | 0.7874 |
+| 0.4027 | 107.0 | 2461 | 0.5025 | 0.7953 |
+| 0.4002 | 108.0 | 2484 | 0.5020 | 0.7874 |
+| 0.3988 | 109.0 | 2507 | 0.5063 | 0.7874 |
+| 0.4095 | 110.0 | 2530 | 0.5034 | 0.7913 |
+| 0.4001 | 111.0 | 2553 | 0.5054 | 0.7874 |
+| 0.4201 | 112.0 | 2576 | 0.5076 | 0.7992 |
+| 0.4134 | 113.0 | 2599 | 0.5070 | 0.7953 |
+| 0.3614 | 114.0 | 2622 | 0.5033 | 0.7835 |
+| 0.3928 | 115.0 | 2645 | 0.5043 | 0.7874 |
+| 0.435 | 116.0 | 2668 | 0.4999 | 0.7874 |
+| 0.4162 | 117.0 | 2691 | 0.5132 | 0.7874 |
+| 0.4078 | 118.0 | 2714 | 0.5088 | 0.7795 |
+| 0.4025 | 119.0 | 2737 | 0.5075 | 0.7835 |
+| 0.4096 | 120.0 | 2760 | 0.5023 | 0.7835 |
+| 0.3879 | 121.0 | 2783 | 0.5063 | 0.7835 |
+| 0.4033 | 122.0 | 2806 | 0.5001 | 0.7874 |
+| 0.3927 | 123.0 | 2829 | 0.5087 | 0.7795 |
+| 0.3803 | 124.0 | 2852 | 0.5150 | 0.7913 |
+| 0.4248 | 125.0 | 2875 | 0.5150 | 0.7835 |
+| 0.3874 | 126.0 | 2898 | 0.5158 | 0.7874 |
+| 0.3646 | 127.0 | 2921 | 0.4980 | 0.8031 |
+| 0.4115 | 128.0 | 2944 | 0.5077 | 0.7913 |
+| 0.385 | 129.0 | 2967 | 0.5153 | 0.7913 |
+| 0.4064 | 130.0 | 2990 | 0.5114 | 0.7953 |
+| 0.4168 | 131.0 | 3013 | 0.5057 | 0.7992 |
+| 0.4319 | 132.0 | 3036 | 0.5041 | 0.7953 |
+| 0.4234 | 133.0 | 3059 | 0.5119 | 0.7992 |
+| 0.3721 | 134.0 | 3082 | 0.5118 | 0.7874 |
+| 0.3709 | 135.0 | 3105 | 0.5078 | 0.7913 |
+| 0.4149 | 136.0 | 3128 | 0.5164 | 0.7795 |
+| 0.416 | 137.0 | 3151 | 0.5123 | 0.7835 |
+| 0.406 | 138.0 | 3174 | 0.5116 | 0.7913 |
+| 0.3613 | 139.0 | 3197 | 0.5170 | 0.7913 |
+| 0.3786 | 140.0 | 3220 | 0.5099 | 0.8031 |
+| 0.3976 | 141.0 | 3243 | 0.5111 | 0.7913 |
+| 0.371 | 142.0 | 3266 | 0.5081 | 0.7953 |
+| 0.4056 | 143.0 | 3289 | 0.5098 | 0.7913 |
+| 0.4214 | 144.0 | 3312 | 0.5085 | 0.7953 |
+| 0.3832 | 145.0 | 3335 | 0.5084 | 0.7953 |
+| 0.3762 | 146.0 | 3358 | 0.5061 | 0.7913 |
+| 0.4118 | 147.0 | 3381 | 0.5111 | 0.7992 |
+| 0.3866 | 148.0 | 3404 | 0.5092 | 0.8071 |
+| 0.3869 | 149.0 | 3427 | 0.5122 | 0.7953 |
+| 0.3734 | 150.0 | 3450 | 0.5117 | 0.7953 |
+| 0.4061 | 151.0 | 3473 | 0.5095 | 0.7913 |
+| 0.3705 | 152.0 | 3496 | 0.5171 | 0.7953 |
+| 0.3873 | 153.0 | 3519 | 0.5179 | 0.7953 |
+| 0.3927 | 154.0 | 3542 | 0.5117 | 0.7992 |
+| 0.3807 | 155.0 | 3565 | 0.5133 | 0.7953 |
+| 0.3761 | 156.0 | 3588 | 0.5140 | 0.7913 |
+| 0.3964 | 157.0 | 3611 | 0.5118 | 0.7953 |
+| 0.39 | 158.0 | 3634 | 0.5122 | 0.8031 |
+| 0.3943 | 159.0 | 3657 | 0.5126 | 0.8031 |
+| 0.3417 | 160.0 | 3680 | 0.5097 | 0.7992 |
+| 0.3996 | 161.0 | 3703 | 0.5048 | 0.7913 |
+| 0.4 | 162.0 | 3726 | 0.5148 | 0.7953 |
+| 0.4051 | 163.0 | 3749 | 0.5150 | 0.7874 |
+| 0.3973 | 164.0 | 3772 | 0.5037 | 0.8031 |
+| 0.3963 | 165.0 | 3795 | 0.5048 | 0.7953 |
+| 0.3568 | 166.0 | 3818 | 0.5168 | 0.7913 |
+| 0.3995 | 167.0 | 3841 | 0.5096 | 0.7913 |
+| 0.3628 | 168.0 | 3864 | 0.5102 | 0.7953 |
+| 0.3836 | 169.0 | 3887 | 0.5133 | 0.7953 |
+| 0.3646 | 170.0 | 3910 | 0.5099 | 0.8031 |
+| 0.3789 | 171.0 | 3933 | 0.5151 | 0.7874 |
+| 0.3832 | 172.0 | 3956 | 0.5149 | 0.8031 |
+| 0.3476 | 173.0 | 3979 | 0.5178 | 0.7835 |
+| 0.3806 | 174.0 | 4002 | 0.5081 | 0.7992 |
+| 0.4053 | 175.0 | 4025 | 0.5100 | 0.7874 |
+| 0.3986 | 176.0 | 4048 | 0.5189 | 0.7992 |
+| 0.3827 | 177.0 | 4071 | 0.5129 | 0.7992 |
+| 0.3892 | 178.0 | 4094 | 0.5099 | 0.7874 |
+| 0.3955 | 179.0 | 4117 | 0.5212 | 0.7992 |
+| 0.4077 | 180.0 | 4140 | 0.5102 | 0.7953 |
+| 0.3579 | 181.0 | 4163 | 0.5100 | 0.7953 |
+| 0.3666 | 182.0 | 4186 | 0.5248 | 0.7835 |
+| 0.3746 | 183.0 | 4209 | 0.5220 | 0.7874 |
+| 0.3867 | 184.0 | 4232 | 0.5173 | 0.7913 |
+| 0.4024 | 185.0 | 4255 | 0.5248 | 0.7874 |
+| 0.4014 | 186.0 | 4278 | 0.5085 | 0.7913 |
+| 0.3445 | 187.0 | 4301 | 0.5137 | 0.8031 |
+| 0.382 | 188.0 | 4324 | 0.5213 | 0.7913 |
+| 0.3673 | 189.0 | 4347 | 0.5242 | 0.7913 |
+| 0.3631 | 190.0 | 4370 | 0.5146 | 0.7913 |
+| 0.393 | 191.0 | 4393 | 0.5098 | 0.7835 |
+| 0.3806 | 192.0 | 4416 | 0.5134 | 0.7992 |
+| 0.3789 | 193.0 | 4439 | 0.5127 | 0.7992 |
+| 0.3717 | 194.0 | 4462 | 0.5184 | 0.7913 |
+| 0.361 | 195.0 | 4485 | 0.5186 | 0.7835 |
+| 0.3722 | 196.0 | 4508 | 0.5107 | 0.7953 |
+| 0.3551 | 197.0 | 4531 | 0.5175 | 0.7953 |
+| 0.3649 | 198.0 | 4554 | 0.5136 | 0.7992 |
+| 0.3749 | 199.0 | 4577 | 0.5193 | 0.7913 |
+| 0.3782 | 200.0 | 4600 | 0.5182 | 0.7992 |
 
 
 ### Framework versions
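The card lists a linear scheduler with a base learning rate of 2e-05 and, per the Step column, 23 optimizer steps per epoch, so training spans 200 × 23 = 4600 steps. A minimal sketch of that decay, assuming no warmup (the actual Trainer scheduler may include warmup steps not recorded in this card):

```python
def linear_lr(step: int, base_lr: float = 2e-05, total_steps: int = 200 * 23) -> float:
    """Linearly decay base_lr to 0 over total_steps (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # full rate at the first step: 2e-05
print(linear_lr(2300))  # halfway through training: 1e-05
print(linear_lr(4600))  # fully decayed at the final logged step: 0.0
```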
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:198a6a9d5d77fe2218f6f877d6f3922bcc8218b0218192f47692be9a74266997
+oid sha256:4a0ba9260d9bdf2d9fcd9708b5ef55285b53c65a664da2fb328907ed7c1a5f96
 size 16947712
runs/Oct11_10-24-55_ed9849b3ed7e/events.out.tfevents.1728642300.ed9849b3ed7e.26007.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:269bdde91b4e088a50f8114b998ec50deca08cbaabb37fe08f1b6a6ff5118581
-size 126325
+oid sha256:fc4d8aab80a4a83feaae1578f468e94784f1827d29af80b2f0441254c64f05de
+size 127002
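The `model.safetensors` and event-file diffs above change git-lfs pointer files (spec v1): three `key value` lines giving the spec URL, the sha256 object id, and the blob size in bytes. A minimal parser sketch (the helper name `parse_lfs_pointer` is illustrative, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line of a git-lfs pointer file into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The updated model.safetensors pointer from this commit.
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:4a0ba9260d9bdf2d9fcd9708b5ef55285b53c65a664da2fb328907ed7c1a5f96\n"
    "size 16947712\n"
)
info = parse_lfs_pointer(pointer)
print(info["oid"])   # sha256:4a0ba926...
print(info["size"])  # 16947712
```

Note that the safetensors blob size is unchanged (16947712 bytes) even though its oid differs: the weights were rewritten with the same shape and dtype.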