Mistral_Sparse_refined_web_50p_2024-03-29
This model is a fine-tuned version of mistralai/Mistral-7B-v0.1. The training dataset is not specified in this card (the auto-generated entry reads "None"). It achieves the following results on the evaluation set:
- Loss: 2.3556
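
For context, assuming the standard causal language modeling cross-entropy objective (loss measured in nats), this corresponds to an evaluation perplexity of exp(2.3556) ≈ 10.5.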
Model description
More information needed
Intended uses & limitations
More information needed
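
No usage guidance is published for this checkpoint. As a minimal, untested sketch, loading should follow the standard transformers pattern. The repository id below is taken from this card, and trust_remote_code=True is an assumption based on the hosted inference note that this model requires custom code execution:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "thrunlab/Mistral_Sparse_refined_web_50p_2024-03-29"

# trust_remote_code=True is assumed: the model page reports that this
# checkpoint requires custom code execution, so it likely ships its own
# modeling file for the sparse architecture.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype="auto",   # load weights in the checkpoint's native dtype
    device_map="auto",    # requires accelerate; places weights automatically
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```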
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (an illustrative configuration sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 0
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- total_eval_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1600
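
The training script itself is not published. As a hedged sketch, the settings above map onto the standard transformers TrainingArguments roughly as follows; the output_dir and eval cadence are assumptions, and the 4-GPU launch would be handled by torchrun or accelerate rather than by these arguments:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the reported settings; the actual training
# configuration for this checkpoint is not published.
training_args = TrainingArguments(
    output_dir="Mistral_Sparse_refined_web_50p_2024-03-29",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=1,   # train_batch_size: 1
    per_device_eval_batch_size=1,    # eval_batch_size: 1
    gradient_accumulation_steps=4,
    max_steps=1600,                  # training_steps: 1600
    lr_scheduler_type="linear",
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=0,
    evaluation_strategy="steps",
    eval_steps=25,                   # matches the 25-step eval cadence below
)

# Effective batch size: 1 (per device) x 4 (devices) x 4 (grad accumulation)
# = 16, matching the reported total_train_batch_size.
```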
Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 2.4102 | 0.0 | 25 | 2.6367 |
| 2.4844 | 0.01 | 50 | 2.5737 |
| 2.3601 | 0.01 | 75 | 2.5236 |
| 2.4524 | 0.02 | 100 | 2.4934 |
| 2.3586 | 0.02 | 125 | 2.4753 |
| 2.3004 | 0.03 | 150 | 2.4581 |
| 2.2308 | 0.03 | 175 | 2.4502 |
| 2.3163 | 0.04 | 200 | 2.4406 |
| 2.2026 | 0.04 | 225 | 2.4344 |
| 2.226 | 0.05 | 250 | 2.4264 |
| 2.3057 | 0.05 | 275 | 2.4255 |
| 2.288 | 0.06 | 300 | 2.4206 |
| 2.2401 | 0.06 | 325 | 2.4145 |
| 2.2386 | 0.07 | 350 | 2.4100 |
| 2.2824 | 0.07 | 375 | 2.4115 |
| 2.2264 | 0.08 | 400 | 2.4107 |
| 2.1135 | 0.08 | 425 | 2.4099 |
| 2.4512 | 0.09 | 450 | 2.4079 |
| 2.2329 | 0.09 | 475 | 2.4077 |
| 2.2044 | 0.09 | 500 | 2.4042 |
| 2.2461 | 0.1 | 525 | 2.4053 |
| 2.2781 | 0.1 | 550 | 2.4035 |
| 2.3529 | 0.11 | 575 | 2.4003 |
| 2.3395 | 0.11 | 600 | 2.4021 |
| 2.212 | 0.12 | 625 | 2.4006 |
| 2.344 | 0.12 | 650 | 2.4004 |
| 2.1826 | 0.13 | 675 | 2.4015 |
| 2.1783 | 0.13 | 700 | 2.3991 |
| 2.1863 | 0.14 | 725 | 2.3952 |
| 2.2943 | 0.14 | 750 | 2.3952 |
| 2.2353 | 0.15 | 775 | 2.3940 |
| 2.3216 | 0.15 | 800 | 2.3959 |
| 2.2007 | 0.16 | 825 | 2.3935 |
| 2.1674 | 0.16 | 850 | 2.3950 |
| 2.2554 | 0.17 | 875 | 2.3952 |
| 2.2562 | 0.17 | 900 | 2.3932 |
| 2.2707 | 0.17 | 925 | 2.3939 |
| 2.3025 | 0.18 | 950 | 2.3895 |
| 2.3242 | 0.18 | 975 | 2.3899 |
| 2.2643 | 0.19 | 1000 | 2.3903 |
| 2.2352 | 0.19 | 1025 | 2.3895 |
| 2.2249 | 0.2 | 1050 | 2.3883 |
| 2.1267 | 0.2 | 1075 | 2.3878 |
| 2.2937 | 0.21 | 1100 | 2.3880 |
| 2.2461 | 0.21 | 1125 | 2.3870 |
| 2.2747 | 0.22 | 1150 | 2.3891 |
| 2.1852 | 0.22 | 1175 | 2.3873 |
| 2.2889 | 0.23 | 1200 | 2.3866 |
| 2.1805 | 0.23 | 1225 | 2.3863 |
| 2.1848 | 0.24 | 1250 | 2.3885 |
| 2.2034 | 0.24 | 1275 | 2.3875 |
| 2.2227 | 0.25 | 1300 | 2.3875 |
| 2.3218 | 0.25 | 1325 | 2.3852 |
| 2.3232 | 0.26 | 1350 | 2.3842 |
| 2.2248 | 0.26 | 1375 | 2.3874 |
| 2.3093 | 0.26 | 1400 | 2.3821 |
| 2.3094 | 0.27 | 1425 | 2.3849 |
| 2.2672 | 0.27 | 1450 | 2.3820 |
| 2.2513 | 0.28 | 1475 | 2.3801 |
| 2.3726 | 0.28 | 1500 | 2.3799 |
| 2.1227 | 0.29 | 1525 | 2.3807 |
| 2.3133 | 0.29 | 1550 | 2.3807 |
| 2.3258 | 0.3 | 1575 | 2.3802 |
| 2.23 | 0.3 | 1600 | 2.3829 |
Framework versions
- Transformers 4.36.2
- PyTorch 2.1.2+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
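
For reproducibility, a minimal environment sketch pinning the versions above (the CUDA 12.1 build of PyTorch would come from the appropriate wheel index for your platform):

```bash
pip install "transformers==4.36.2" "torch==2.1.2" "datasets==2.15.0" "tokenizers==0.15.0"
```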