# license_plate_recognizer
This model is a fine-tuned version of [microsoft/trocr-small-printed](https://huggingface.co/microsoft/trocr-small-printed) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.3905
- Cer: 0.0305
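
A minimal inference sketch is shown below. It assumes the checkpoint and its processor are both published under the Hub id `PawanKrGunjan/license_plate_recognizer` and that a cropped plate image is available locally; adjust the id and path as needed.

```python
from PIL import Image
from transformers import TrOCRProcessor, VisionEncoderDecoderModel

# Assumed Hub id; replace with a local path if the checkpoint is stored elsewhere.
model_id = "PawanKrGunjan/license_plate_recognizer"
processor = TrOCRProcessor.from_pretrained(model_id)
model = VisionEncoderDecoderModel.from_pretrained(model_id)

# "plate.jpg" is a placeholder for a cropped license-plate image.
image = Image.open("plate.jpg").convert("RGB")
pixel_values = processor(images=image, return_tensors="pt").pixel_values

# Autoregressively decode the plate text.
generated_ids = model.generate(pixel_values)
plate_text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(plate_text)
```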
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
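
As a rough illustration, the hyperparameters above correspond to a `Seq2SeqTrainingArguments` configuration along these lines. This is a sketch, not the exact training script; dataset preparation, model setup, and the `Seq2SeqTrainer` call are omitted, and the output directory name is hypothetical.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of training arguments matching the values listed above.
# Adam betas (0.9, 0.999) and epsilon 1e-08 are the optimizer defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="license_plate_recognizer",  # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,                   # Native AMP mixed-precision training
    eval_strategy="epoch",       # evaluation after each epoch, as in the table below
    predict_with_generate=True,  # generate text so CER can be computed
)
```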
### Training results
Training Loss | Epoch | Step | Validation Loss | Cer |
---|---|---|---|---|
0.8435 | 1.0 | 128 | 0.5212 | 0.0798 |
0.326 | 2.0 | 256 | 0.4098 | 0.0609 |
0.2681 | 3.0 | 384 | 0.5235 | 0.0793 |
0.2984 | 4.0 | 512 | 0.3912 | 0.0590 |
0.1671 | 5.0 | 640 | 0.3756 | 0.0461 |
0.1827 | 6.0 | 768 | 0.3875 | 0.0518 |
0.1368 | 7.0 | 896 | 0.3851 | 0.0566 |
0.1304 | 8.0 | 1024 | 0.3917 | 0.0540 |
0.148 | 9.0 | 1152 | 0.4448 | 0.0648 |
0.108 | 10.0 | 1280 | 0.3771 | 0.0441 |
0.0756 | 11.0 | 1408 | 0.3486 | 0.0428 |
0.0891 | 12.0 | 1536 | 0.3960 | 0.0504 |
0.0832 | 13.0 | 1664 | 0.3739 | 0.0438 |
0.0685 | 14.0 | 1792 | 0.3805 | 0.0428 |
0.0642 | 15.0 | 1920 | 0.3541 | 0.0416 |
0.0621 | 16.0 | 2048 | 0.3981 | 0.0457 |
0.0335 | 17.0 | 2176 | 0.4031 | 0.0447 |
0.0376 | 18.0 | 2304 | 0.4305 | 0.0520 |
0.0534 | 19.0 | 2432 | 0.4086 | 0.0432 |
0.0314 | 20.0 | 2560 | 0.4166 | 0.0408 |
0.0219 | 21.0 | 2688 | 0.4393 | 0.0409 |
0.0425 | 22.0 | 2816 | 0.4522 | 0.0476 |
0.0186 | 23.0 | 2944 | 0.4166 | 0.0395 |
0.0311 | 24.0 | 3072 | 0.3867 | 0.0373 |
0.0191 | 25.0 | 3200 | 0.3832 | 0.0411 |
0.0143 | 26.0 | 3328 | 0.3954 | 0.0382 |
0.0201 | 27.0 | 3456 | 0.3959 | 0.0388 |
0.0196 | 28.0 | 3584 | 0.4099 | 0.0371 |
0.014 | 29.0 | 3712 | 0.4205 | 0.0388 |
0.0079 | 30.0 | 3840 | 0.4231 | 0.0381 |
0.0082 | 31.0 | 3968 | 0.4497 | 0.0400 |
0.0067 | 32.0 | 4096 | 0.4340 | 0.0361 |
0.0191 | 33.0 | 4224 | 0.4181 | 0.0352 |
0.0059 | 34.0 | 4352 | 0.4159 | 0.0351 |
0.0049 | 35.0 | 4480 | 0.4076 | 0.0328 |
0.0073 | 36.0 | 4608 | 0.4093 | 0.0333 |
0.0025 | 37.0 | 4736 | 0.3903 | 0.0335 |
0.0056 | 38.0 | 4864 | 0.4182 | 0.0370 |
0.0029 | 39.0 | 4992 | 0.3996 | 0.0332 |
0.0012 | 40.0 | 5120 | 0.3948 | 0.0325 |
0.0041 | 41.0 | 5248 | 0.4099 | 0.0315 |
0.0048 | 42.0 | 5376 | 0.4057 | 0.0319 |
0.0016 | 43.0 | 5504 | 0.4029 | 0.0316 |
0.0012 | 44.0 | 5632 | 0.3905 | 0.0305 |
0.0015 | 45.0 | 5760 | 0.4043 | 0.0310 |
0.0014 | 46.0 | 5888 | 0.3977 | 0.0310 |
0.0028 | 47.0 | 6016 | 0.4055 | 0.0317 |
0.0012 | 48.0 | 6144 | 0.4048 | 0.0311 |
0.0048 | 49.0 | 6272 | 0.4018 | 0.0305 |
0.0009 | 50.0 | 6400 | 0.4035 | 0.0306 |
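
The Cer column reports the character error rate (lower is better). Below is a small sketch of how such a score can be computed with the `evaluate` library (the `cer` metric requires the `jiwer` package); the plate strings are purely illustrative and not taken from the evaluation set.

```python
import evaluate

cer_metric = evaluate.load("cer")  # character error rate metric; requires jiwer

# Illustrative predicted vs. reference plate strings.
predictions = ["KA01AB1234", "MH12DE1433"]
references = ["KA01AB1234", "MH12DE1443"]

# One substituted character out of 20 reference characters -> CER = 0.05.
print(cer_metric.compute(predictions=predictions, references=references))
```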
### Framework versions
- Transformers 4.44.2
- Pytorch 2.6.0+cu124
- Datasets 3.0.1
- Tokenizers 0.19.1