---
library_name: transformers
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: detraaa_finetuned_cppe5
  results: []
---

# detraaa_finetuned_cppe5

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8720
- Map: 0.0239
- Map 50: 0.0667
- Map 75: 0.0105
- Map Small: 0.0079
- Map Medium: 0.0505
- Map Large: 0.0493
- Mar 1: 0.067
- Mar 10: 0.1645
- Mar 100: 0.2185
- Mar Small: 0.1159
- Mar Medium: 0.2718
- Mar Large: 0.2179
- Map Bone-fracture: -1.0
- Mar 100 Bone-fracture: -1.0
- Map Angle: 0.0372
- Mar 100 Angle: 0.1583
- Map Fracture: 0.0068
- Mar 100 Fracture: 0.2237
- Map Line: 0.0042
- Mar 100 Line: 0.1829
- Map Messed Up Angle: 0.0473
- Mar 100 Messed Up Angle: 0.3091

## Model description

More information needed

## Intended uses & limitations

More information needed
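No usage example was generated for this checkpoint. As a starting point, it can be loaded through the standard 🤗 Transformers object-detection classes; the sketch below is a minimal example in which the checkpoint id `detraaa_finetuned_cppe5` and the image path `example.jpg` are placeholders to adjust for your setup.

```python
# Minimal inference sketch. Assumptions: the fine-tuned weights are available under
# the placeholder id "detraaa_finetuned_cppe5" (a local output dir or Hub repo id)
# and "example.jpg" is any RGB image on disk.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "detraaa_finetuned_cppe5"  # placeholder
image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")  # placeholder image
inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Turn raw logits/boxes into (score, label, box) detections above a confidence threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```

The printed class names come from `model.config.id2label`, which is stored in the checkpoint's configuration.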
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
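For reference, these settings correspond roughly to the `TrainingArguments` sketched below. This is a reconstruction from the values listed above, not the original training script; options that are not listed (output directory, evaluation and logging strategy, and so on) are placeholders and may have differed in the actual run.

```python
# Sketch of TrainingArguments matching the listed hyperparameters.
# Reconstruction only: output_dir and any unlisted options are placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detraaa_finetuned_cppe5",   # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                    # AdamW, PyTorch implementation
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```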
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Bone-fracture | Mar 100 Bone-fracture | Map Angle | Mar 100 Angle | Map Fracture | Mar 100 Fracture | Map Line | Mar 100 Line | Map Messed Up Angle | Mar 100 Messed Up Angle |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:-----------------:|:---------------------:|:---------:|:-------------:|:------------:|:----------------:|:--------:|:------------:|:-------------------:|:-----------------------:|
| No log | 1.0 | 41 | 3.8122 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 82 | 2.9585 | 0.0001 | 0.0005 | 0.0 | 0.0 | 0.0002 | 0.0 | 0.0 | 0.003 | 0.0051 | 0.0 | 0.0127 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0085 | 0.0 | 0.0029 | 0.0002 | 0.0091 |
| No log | 3.0 | 123 | 2.7378 | 0.0001 | 0.0005 | 0.0 | 0.0001 | 0.0002 | 0.0 | 0.0 | 0.0045 | 0.0096 | 0.0111 | 0.0117 | 0.0 | 0.0 | 0.0 | 0.0002 | 0.0356 | 0.0 | 0.0029 | 0.0 | 0.0 |
| No log | 4.0 | 164 | 2.5667 | 0.0001 | 0.0004 | 0.0 | 0.0001 | 0.0001 | 0.0 | 0.0 | 0.005 | 0.0224 | 0.0152 | 0.0362 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0458 | 0.0002 | 0.0257 | 0.0 | 0.0182 |
| No log | 5.0 | 205 | 2.3972 | 0.0004 | 0.0019 | 0.0 | 0.0002 | 0.001 | 0.0 | 0.0007 | 0.0171 | 0.0465 | 0.0242 | 0.0815 | 0.0 | 0.0 | 0.0 | 0.0004 | 0.0898 | 0.0005 | 0.06 | 0.0005 | 0.0364 |
| No log | 6.0 | 246 | 2.4845 | 0.0001 | 0.0004 | 0.0 | 0.0002 | 0.0001 | 0.0 | 0.0 | 0.0051 | 0.0234 | 0.0305 | 0.0283 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0001 | 0.0593 | 0.0001 | 0.0343 | 0.0 | 0.0 |
| No log | 7.0 | 287 | 2.2863 | 0.0001 | 0.0005 | 0.0 | 0.0002 | 0.0001 | 0.0003 | 0.0 | 0.0043 | 0.0262 | 0.0354 | 0.025 | 0.025 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0003 | 0.0847 | 0.0001 | 0.02 | 0.0 | 0.0 |
| No log | 8.0 | 328 | 2.2480 | 0.0002 | 0.0011 | 0.0 | 0.0005 | 0.0003 | 0.0 | 0.003 | 0.0077 | 0.0338 | 0.0339 | 0.0408 | 0.0 | 0.0 | 0.0 | 0.0005 | 0.0864 | 0.0004 | 0.0486 | 0.0 | 0.0 |
| No log | 9.0 | 369 | 2.1551 | 0.0008 | 0.0037 | 0.0001 | 0.0024 | 0.0006 | 0.0 | 0.0024 | 0.0273 | 0.0544 | 0.0425 | 0.084 | 0.0 | 0.0 | 0.0 | 0.0028 | 0.1424 | 0.0004 | 0.0571 | 0.0001 | 0.0182 |
| No log | 10.0 | 410 | 2.1461 | 0.0005 | 0.0019 | 0.0 | 0.0014 | 0.0005 | 0.0 | 0.0078 | 0.0208 | 0.0513 | 0.0485 | 0.065 | 0.0 | 0.0 | 0.0 | 0.0014 | 0.1424 | 0.0005 | 0.0629 | 0.0 | 0.0 |
| No log | 11.0 | 451 | 2.1594 | 0.0013 | 0.0053 | 0.0 | 0.0032 | 0.0012 | 0.0009 | 0.0114 | 0.0346 | 0.0684 | 0.0577 | 0.0975 | 0.0625 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0042 | 0.1475 | 0.0002 | 0.0714 | 0.0006 | 0.0545 |
| No log | 12.0 | 492 | 2.0050 | 0.0014 | 0.0066 | 0.0 | 0.0037 | 0.0007 | 0.0028 | 0.0045 | 0.0435 | 0.0702 | 0.078 | 0.0742 | 0.0875 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0048 | 0.1661 | 0.0007 | 0.1057 | 0.0001 | 0.0091 |
| 4.8365 | 13.0 | 533 | 2.0575 | 0.0022 | 0.0087 | 0.0002 | 0.002 | 0.0086 | 0.0064 | 0.0083 | 0.054 | 0.1006 | 0.0781 | 0.1375 | 0.1232 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0023 | 0.161 | 0.0012 | 0.1143 | 0.0051 | 0.1273 |
| 4.8365 | 14.0 | 574 | 1.9988 | 0.0071 | 0.0222 | 0.0032 | 0.0025 | 0.0151 | 0.0061 | 0.014 | 0.0919 | 0.1275 | 0.0879 | 0.1797 | 0.0679 | -1.0 | -1.0 | 0.004 | 0.0333 | 0.0023 | 0.1864 | 0.0016 | 0.1629 | 0.0205 | 0.1273 |
| 4.8365 | 15.0 | 615 | 1.9491 | 0.0039 | 0.0146 | 0.0008 | 0.0034 | 0.0095 | 0.0033 | 0.0171 | 0.0772 | 0.1198 | 0.0751 | 0.1762 | 0.0929 | 0.0048 | 0.0417 | 0.0038 | 0.1949 | 0.0008 | 0.0971 | 0.006 | 0.1455 |
| 4.8365 | 16.0 | 656 | 1.9354 | 0.0048 | 0.0245 | 0.0005 | 0.0052 | 0.0106 | 0.0087 | 0.013 | 0.1011 | 0.1515 | 0.102 | 0.1874 | 0.0679 | -1.0 | -1.0 | 0.0087 | 0.0833 | 0.0051 | 0.1966 | 0.0011 | 0.1171 | 0.0042 | 0.2091 |
| 4.8365 | 17.0 | 697 | 1.9822 | 0.0042 | 0.0174 | 0.0013 | 0.0039 | 0.0075 | 0.0195 | 0.0231 | 0.0862 | 0.1299 | 0.1128 | 0.1565 | 0.0786 | -1.0 | -1.0 | 0.0018 | 0.0083 | 0.0048 | 0.1932 | 0.002 | 0.1543 | 0.0084 | 0.1636 |
| 4.8365 | 18.0 | 738 | 1.9518 | 0.0135 | 0.0331 | 0.0024 | 0.0027 | 0.0299 | 0.0263 | 0.0362 | 0.1359 | 0.1954 | 0.0739 | 0.2899 | 0.1839 | -1.0 | -1.0 | 0.0198 | 0.15 | 0.0037 | 0.1932 | 0.0021 | 0.12 | 0.0283 | 0.3182 |
| 4.8365 | 19.0 | 779 | 1.9752 | 0.008 | 0.042 | 0.0019 | 0.0034 | 0.0202 | 0.0085 | 0.0251 | 0.1405 | 0.173 | 0.1069 | 0.2424 | 0.0732 | -1.0 | -1.0 | 0.0098 | 0.15 | 0.0033 | 0.1424 | 0.0022 | 0.1543 | 0.0166 | 0.2455 |
| 4.8365 | 20.0 | 820 | 1.9388 | 0.0155 | 0.0519 | 0.0013 | 0.0047 | 0.034 | 0.0288 | 0.0459 | 0.164 | 0.2209 | 0.135 | 0.2656 | 0.1857 | -1.0 | -1.0 | 0.0176 | 0.1 | 0.0044 | 0.1763 | 0.0034 | 0.18 | 0.0367 | 0.4273 |
| 4.8365 | 21.0 | 861 | 1.9225 | 0.0265 | 0.0666 | 0.0035 | 0.0045 | 0.0545 | 0.0311 | 0.0765 | 0.163 | 0.2146 | 0.1159 | 0.2651 | 0.1625 | -1.0 | -1.0 | 0.0395 | 0.15 | 0.0048 | 0.2 | 0.0028 | 0.1629 | 0.0588 | 0.3455 |
| 4.8365 | 22.0 | 902 | 1.9158 | 0.0151 | 0.0549 | 0.0072 | 0.006 | 0.0398 | 0.0326 | 0.0542 | 0.1634 | 0.1978 | 0.1108 | 0.2455 | 0.1143 | -1.0 | -1.0 | 0.0272 | 0.15 | 0.0044 | 0.1932 | 0.0033 | 0.1571 | 0.0256 | 0.2909 |
| 4.8365 | 23.0 | 943 | 1.8807 | 0.0203 | 0.0553 | 0.0033 | 0.0078 | 0.0464 | 0.0478 | 0.055 | 0.1664 | 0.2098 | 0.1313 | 0.2568 | 0.1768 | -1.0 | -1.0 | 0.0263 | 0.15 | 0.0069 | 0.2 | 0.0034 | 0.18 | 0.0447 | 0.3091 |
| 4.8365 | 24.0 | 984 | 1.8769 | 0.0265 | 0.0645 | 0.0091 | 0.0075 | 0.0546 | 0.0487 | 0.0709 | 0.1737 | 0.221 | 0.1134 | 0.275 | 0.2125 | -1.0 | -1.0 | 0.0379 | 0.1667 | 0.0065 | 0.2068 | 0.0043 | 0.1743 | 0.0574 | 0.3364 |
| 1.6844 | 25.0 | 1025 | 1.8751 | 0.0261 | 0.0661 | 0.0091 | 0.0078 | 0.0532 | 0.0568 | 0.0707 | 0.173 | 0.2203 | 0.1155 | 0.2708 | 0.2429 | -1.0 | -1.0 | 0.0371 | 0.175 | 0.0066 | 0.2169 | 0.0041 | 0.18 | 0.0565 | 0.3091 |
| 1.6844 | 26.0 | 1066 | 1.8840 | 0.0223 | 0.062 | 0.0068 | 0.0078 | 0.0453 | 0.0272 | 0.0578 | 0.1576 | 0.2145 | 0.1277 | 0.2713 | 0.1768 | -1.0 | -1.0 | 0.0332 | 0.1417 | 0.007 | 0.2237 | 0.0037 | 0.1743 | 0.0454 | 0.3182 |
| 1.6844 | 27.0 | 1107 | 1.8819 | 0.0273 | 0.0703 | 0.0115 | 0.0077 | 0.056 | 0.0447 | 0.0707 | 0.1692 | 0.2239 | 0.1149 | 0.2657 | 0.2321 | -1.0 | -1.0 | 0.0419 | 0.1583 | 0.0068 | 0.2203 | 0.0039 | 0.1714 | 0.0565 | 0.3455 |
| 1.6844 | 28.0 | 1148 | 1.8702 | 0.024 | 0.0669 | 0.0105 | 0.0079 | 0.0483 | 0.0487 | 0.067 | 0.1658 | 0.2173 | 0.1169 | 0.2666 | 0.2179 | -1.0 | -1.0 | 0.0381 | 0.1667 | 0.0069 | 0.2254 | 0.0041 | 0.1771 | 0.047 | 0.3 |
| 1.6844 | 29.0 | 1189 | 1.8712 | 0.024 | 0.0667 | 0.0105 | 0.0079 | 0.0485 | 0.0493 | 0.067 | 0.1666 | 0.2206 | 0.1159 | 0.2754 | 0.2179 | -1.0 | -1.0 | 0.0378 | 0.1667 | 0.0068 | 0.2237 | 0.0041 | 0.1829 | 0.0473 | 0.3091 |
| 1.6844 | 30.0 | 1230 | 1.8720 | 0.0239 | 0.0667 | 0.0105 | 0.0079 | 0.0505 | 0.0493 | 0.067 | 0.1645 | 0.2185 | 0.1159 | 0.2718 | 0.2179 | -1.0 | -1.0 | 0.0372 | 0.1583 | 0.0068 | 0.2237 | 0.0042 | 0.1829 | 0.0473 | 0.3091 |

### Framework versions

- Transformers 4.46.3
- Pytorch 2.5.1+cpu
- Datasets 3.1.0
- Tokenizers 0.20.3