# SmolVLM-Base-smolvlmf_car_quality_10e_l8
This model is a fine-tuned version of HuggingFaceTB/SmolVLM-Base on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0080
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 10
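The hyperparameters above can be captured as a plain dictionary; the effective batch size is the per-device train batch size times the gradient accumulation steps, which reproduces the listed total of 16:

```python
# The training hyperparameters from the card, as a plain dict
# (no library dependency; a TrainingArguments object would take the
# same field names in the Transformers Trainer API).
config = {
    "learning_rate": 1e-4,
    "train_batch_size": 8,          # per device
    "eval_batch_size": 4,
    "seed": 42,
    "gradient_accumulation_steps": 2,
    "lr_scheduler_type": "linear",
    "lr_scheduler_warmup_steps": 100,
    "num_epochs": 10,
}

# Effective (total) train batch size = per-device batch * accumulation steps.
total_train_batch_size = (
    config["train_batch_size"] * config["gradient_accumulation_steps"]
)
print(total_train_batch_size)  # 16
```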
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.6493 | 0.1 | 30 | 0.4807 |
| 0.0258 | 0.2 | 60 | 0.0101 |
| 0.0064 | 0.3 | 90 | 0.0066 |
| 0.0046 | 0.4 | 120 | 0.0048 |
| 0.0041 | 0.5 | 150 | 0.0040 |
| 0.0038 | 0.6 | 180 | 0.0043 |
| 0.0049 | 0.7 | 210 | 0.0041 |
| 0.0029 | 0.8 | 240 | 0.0035 |
| 0.0032 | 0.9 | 270 | 0.0038 |
| 0.004 | 1.0 | 300 | 0.0035 |
| 0.0033 | 1.1 | 330 | 0.0046 |
| 0.003 | 1.2 | 360 | 0.0052 |
| 0.0033 | 1.3 | 390 | 0.0034 |
| 0.0026 | 1.4 | 420 | 0.0034 |
| 0.0032 | 1.5 | 450 | 0.0032 |
| 0.0031 | 1.6 | 480 | 0.0033 |
| 0.003 | 1.7 | 510 | 0.0030 |
| 0.0034 | 1.8 | 540 | 0.0033 |
| 0.003 | 1.9 | 570 | 0.0032 |
| 0.0032 | 2.0 | 600 | 0.0039 |
| 0.0027 | 2.1 | 630 | 0.0035 |
| 0.0026 | 2.2 | 660 | 0.0036 |
| 0.0026 | 2.3 | 690 | 0.0031 |
| 0.0033 | 2.4 | 720 | 0.0030 |
| 0.003 | 2.5 | 750 | 0.0031 |
| 0.0031 | 2.6 | 780 | 0.0031 |
| 0.0023 | 2.7 | 810 | 0.0031 |
| 0.0028 | 2.8 | 840 | 0.0032 |
| 0.0022 | 2.9 | 870 | 0.0040 |
| 0.0026 | 3.0 | 900 | 0.0034 |
| 0.0021 | 3.1 | 930 | 0.0035 |
| 0.0025 | 3.2 | 960 | 0.0033 |
| 0.0021 | 3.3 | 990 | 0.0045 |
| 0.0025 | 3.4 | 1020 | 0.0031 |
| 0.0022 | 3.5 | 1050 | 0.0037 |
| 0.0021 | 3.6 | 1080 | 0.0043 |
| 0.0025 | 3.7 | 1110 | 0.0032 |
| 0.0027 | 3.8 | 1140 | 0.0039 |
| 0.0023 | 3.9 | 1170 | 0.0043 |
| 0.0023 | 4.0 | 1200 | 0.0032 |
| 0.0017 | 4.1 | 1230 | 0.0039 |
| 0.0018 | 4.2 | 1260 | 0.0032 |
| 0.002 | 4.3 | 1290 | 0.0031 |
| 0.002 | 4.4 | 1320 | 0.0032 |
| 0.0019 | 4.5 | 1350 | 0.0036 |
| 0.0019 | 4.6 | 1380 | 0.0035 |
| 0.0019 | 4.7 | 1410 | 0.0040 |
| 0.0022 | 4.8 | 1440 | 0.0037 |
| 0.0019 | 4.9 | 1470 | 0.0044 |
| 0.0015 | 5.0 | 1500 | 0.0039 |
| 0.001 | 5.1 | 1530 | 0.0034 |
| 0.0013 | 5.2 | 1560 | 0.0052 |
| 0.0015 | 5.3 | 1590 | 0.0032 |
| 0.0012 | 5.4 | 1620 | 0.0042 |
| 0.002 | 5.5 | 1650 | 0.0037 |
| 0.0017 | 5.6 | 1680 | 0.0041 |
| 0.0018 | 5.7 | 1710 | 0.0038 |
| 0.0017 | 5.8 | 1740 | 0.0054 |
| 0.0014 | 5.9 | 1770 | 0.0042 |
| 0.0017 | 6.0 | 1800 | 0.0041 |
| 0.0011 | 6.1 | 1830 | 0.0038 |
| 0.0006 | 6.2 | 1860 | 0.0042 |
| 0.0009 | 6.3 | 1890 | 0.0048 |
| 0.0011 | 6.4 | 1920 | 0.0051 |
| 0.0007 | 6.5 | 1950 | 0.0047 |
| 0.0011 | 6.6 | 1980 | 0.0057 |
| 0.001 | 6.7 | 2010 | 0.0049 |
| 0.0008 | 6.8 | 2040 | 0.0045 |
| 0.0011 | 6.9 | 2070 | 0.0052 |
| 0.0014 | 7.0 | 2100 | 0.0055 |
| 0.0006 | 7.1 | 2130 | 0.0058 |
| 0.0006 | 7.2 | 2160 | 0.0054 |
| 0.0003 | 7.3 | 2190 | 0.0056 |
| 0.0005 | 7.4 | 2220 | 0.0067 |
| 0.0006 | 7.5 | 2250 | 0.0051 |
| 0.0003 | 7.6 | 2280 | 0.0058 |
| 0.0006 | 7.7 | 2310 | 0.0065 |
| 0.0004 | 7.8 | 2340 | 0.0056 |
| 0.0004 | 7.9 | 2370 | 0.0061 |
| 0.0004 | 8.0 | 2400 | 0.0060 |
| 0.0002 | 8.1 | 2430 | 0.0067 |
| 0.0002 | 8.2 | 2460 | 0.0071 |
| 0.0001 | 8.3 | 2490 | 0.0078 |
| 0.0001 | 8.4 | 2520 | 0.0069 |
| 0.0001 | 8.5 | 2550 | 0.0077 |
| 0.0004 | 8.6 | 2580 | 0.0073 |
| 0.0003 | 8.7 | 2610 | 0.0070 |
| 0.0001 | 8.8 | 2640 | 0.0074 |
| 0.0001 | 8.9 | 2670 | 0.0075 |
| 0.0001 | 9.0 | 2700 | 0.0075 |
| 0.0001 | 9.1 | 2730 | 0.0075 |
| 0.0 | 9.2 | 2760 | 0.0078 |
| 0.0 | 9.3 | 2790 | 0.0078 |
| 0.0001 | 9.4 | 2820 | 0.0080 |
| 0.0001 | 9.5 | 2850 | 0.0079 |
| 0.0001 | 9.6 | 2880 | 0.0080 |
| 0.0 | 9.7 | 2910 | 0.0080 |
| 0.0001 | 9.8 | 2940 | 0.0080 |
| 0.0 | 9.9 | 2970 | 0.0082 |
| 0.0 | 10.0 | 3000 | 0.0080 |
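Note that the validation loss bottoms out early (around 0.0030 near epochs 1.7 to 2.5) and climbs steadily afterwards while the training loss approaches zero, which suggests the later checkpoints overfit. A quick sketch, using a few (epoch, validation loss) pairs copied from the table, to locate the best logged checkpoint:

```python
# A subset of (epoch -> validation loss) pairs from the table above.
val_loss = {
    0.1: 0.4807,
    1.0: 0.0035,
    1.7: 0.0030,
    5.0: 0.0039,
    8.0: 0.0060,
    10.0: 0.0080,
}

# Pick the epoch with the lowest validation loss.
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # 1.7 0.003
```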
### Framework versions
- PEFT 0.15.2
- Transformers 4.52.3
- PyTorch 2.6.0
- Datasets 3.6.0
- Tokenizers 0.21.1
## Model tree for DamianBoborzi/SmolVLM-Base-smolvlmf_car_quality_10e_l8

- Base model: HuggingFaceTB/SmolLM2-1.7B
- Quantized: HuggingFaceTB/SmolLM2-1.7B-Instruct
- Finetuned: HuggingFaceTB/SmolVLM-Base