# zephyr-7b-gemma-sft
This model is a fine-tuned version of [google/gemma-7b](https://huggingface.co/google/gemma-7b) on the [HuggingFaceH4/deita-10k-v0-sft](https://huggingface.co/datasets/HuggingFaceH4/deita-10k-v0-sft) dataset. It achieves the following results on the evaluation set:
- Loss: 1.1043
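As a quick usage check, the model can be loaded with the standard `transformers` text-generation pipeline. The following is a minimal sketch: the dtype, device placement, and generation settings are illustrative assumptions, and it presumes the tokenizer ships a chat template.

```python
# Minimal inference sketch; dtype, device placement, and generation
# settings are illustrative assumptions, not part of the training recipe.
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="li-muyang/zephyr-7b-gemma-sft",
    torch_dtype=torch.bfloat16,  # assumption: bf16 to fit the 7B model comfortably
    device_map="auto",
)

# Format the prompt with the tokenizer's chat template (assumed to be present).
messages = [{"role": "user", "content": "Explain supervised fine-tuning in two sentences."}]
prompt = pipe.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

out = pipe(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
print(out[0]["generated_text"])
```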
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
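For reference, a minimal sketch of the corresponding `transformers.TrainingArguments` is below. The output directory is an assumption, and the 8-GPU data-parallel launch (e.g. via `accelerate` or `torchrun`) is what yields the effective batch sizes of 128 (train) and 32 (eval).

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# Assumes an 8-GPU data-parallel launch, giving an effective train batch of
# 4 per device x 8 GPUs x 4 accumulation steps = 128 (and 4 x 8 = 32 for eval).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="zephyr-7b-gemma-sft",  # assumption: output path
    learning_rate=2e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,
    num_train_epochs=3,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=42,
    adam_beta1=0.9,     # Adam settings as listed above (library defaults, shown explicitly)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```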
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.5388 | 0.0835 | 25 | 1.5511 |
| 1.0986 | 0.1669 | 50 | 1.1441 |
| 1.0678 | 0.2504 | 75 | 1.0955 |
| 1.0528 | 0.3339 | 100 | 1.0925 |
| 1.0934 | 0.4174 | 125 | 1.1267 |
| 1.0648 | 0.5008 | 150 | 1.1338 |
| 1.0634 | 0.5843 | 175 | 1.1430 |
| 1.0406 | 0.6678 | 200 | 1.1117 |
| 1.0121 | 0.7513 | 225 | 1.0639 |
| 0.9433 | 0.8347 | 250 | 1.0389 |
| 0.9858 | 0.9182 | 275 | 1.0390 |
| 0.9786 | 1.0017 | 300 | 1.0929 |
| 0.7899 | 1.0851 | 325 | 1.1224 |
| 0.7882 | 1.1686 | 350 | 1.0742 |
| 0.7399 | 1.2521 | 375 | 1.0683 |
| 0.7628 | 1.3356 | 400 | 1.0628 |
| 0.7569 | 1.4190 | 425 | 1.0546 |
| 0.7535 | 1.5025 | 450 | 1.0615 |
| 0.7363 | 1.5860 | 475 | 1.0576 |
| 0.7552 | 1.6694 | 500 | 1.0607 |
| 0.7437 | 1.7529 | 525 | 1.0607 |
| 0.7519 | 1.8364 | 550 | 1.0658 |
| 0.7625 | 1.9199 | 575 | 1.0657 |
| 0.6916 | 2.0033 | 600 | 1.0868 |
| 0.4481 | 2.0868 | 625 | 1.1412 |
| 0.4438 | 2.1703 | 650 | 1.1363 |
| 0.4339 | 2.2538 | 675 | 1.1245 |
| 0.4356 | 2.3372 | 700 | 1.1193 |
| 0.4268 | 2.4207 | 725 | 1.1166 |
| 0.4259 | 2.5042 | 750 | 1.1127 |
| 0.4069 | 2.5876 | 775 | 1.1079 |
| 0.4018 | 2.6711 | 800 | 1.1046 |
| 0.4181 | 2.7546 | 825 | 1.1036 |
| 0.4007 | 2.8381 | 850 | 1.1050 |
| 0.3953 | 2.9215 | 875 | 1.1041 |
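Reading the table: validation loss reaches its minimum of 1.0389 at step 250 (epoch ≈ 0.83) and drifts back above 1.10 while training loss keeps falling through epochs 2 and 3, a pattern consistent with mild overfitting; the 1.1043 reported above corresponds to the end-of-training evaluation rather than the best checkpoint.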
### Framework versions
- Transformers 4.45.2
- PyTorch 2.5.1+rocm6.2
- Datasets 3.2.0
- Tokenizers 0.20.3