
v0_mistral_lora_last_n_train_3

This model is a fine-tuned version of peiyi9979/math-shepherd-mistral-7b-prm on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2838
  • Accuracy: 0.8991
  • Precision: 0.8701
  • Recall: 0.67
  • F1: 0.7571
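
Since usage is not yet documented on this card, the following is a minimal loading sketch only: it assumes the adapter is applied on top of the base checkpoint with PEFT and that the base loads as a causal LM (the exact head class used for fine-tuning is not stated here). The repo ids are taken from this card; dtype and device settings are illustrative assumptions.

```python
# Hedged sketch: load the LoRA adapter on top of the base PRM checkpoint.
# Repo ids come from this card; AutoModelForCausalLM, dtype and device_map
# are assumptions, not documented choices.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "peiyi9979/math-shepherd-mistral-7b-prm"
adapter_id = "mtzig/v0_mistral_lora_last_n_train_3"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()
```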

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 4
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • total_eval_batch_size: 32
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 1
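
As a sketch only, the settings listed above roughly correspond to the `TrainingArguments` below. The multi-GPU setup (4 devices) comes from the launcher (e.g. torchrun or accelerate), not from these arguments, and anything not listed above (output directory, mixed precision, LoRA rank, eval/save cadence) is an assumption.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above.
# Only values named in the list are taken from this card; output_dir and
# bf16 are illustrative assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="v0_mistral_lora_last_n_train_3",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=8,  # x 4 GPUs x 2 accumulation = 64 total
    per_device_eval_batch_size=8,   # x 4 GPUs = 32 total
    gradient_accumulation_steps=2,
    num_train_epochs=1,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,  # assumption: mixed precision is not stated on the card
)
```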

Training results

Training Loss Epoch Step Validation Loss Accuracy Precision Recall F1
0.8403 0.0054 5 0.6496 0.6315 0.3113 0.47 0.3745
0.7673 0.0109 10 0.6456 0.6385 0.3176 0.47 0.3790
0.8495 0.0163 15 0.6429 0.6432 0.3194 0.46 0.3770
0.7675 0.0217 20 0.6363 0.6549 0.3333 0.47 0.3900
0.7347 0.0271 25 0.6235 0.6831 0.3529 0.42 0.3836
0.7087 0.0326 30 0.6053 0.7160 0.3846 0.35 0.3665
0.7302 0.0380 35 0.5893 0.7394 0.4247 0.31 0.3584
0.6417 0.0434 40 0.5743 0.7465 0.4259 0.23 0.2987
0.6635 0.0488 45 0.5677 0.7535 0.4419 0.19 0.2657
0.5966 0.0543 50 0.5625 0.7512 0.4167 0.15 0.2206
0.5228 0.0597 55 0.5590 0.7535 0.4286 0.15 0.2222
0.5335 0.0651 60 0.5517 0.7629 0.4815 0.13 0.2047
0.539 0.0705 65 0.5417 0.7653 0.5 0.13 0.2063
0.5672 0.0760 70 0.5224 0.7676 0.5161 0.16 0.2443
0.5947 0.0814 75 0.5087 0.7746 0.5556 0.2 0.2941
0.5674 0.0868 80 0.4976 0.7817 0.5854 0.24 0.3404
0.4785 0.0922 85 0.4872 0.7934 0.625 0.3 0.4054
0.4498 0.0977 90 0.4765 0.8075 0.6957 0.32 0.4384
0.5334 0.1031 95 0.4615 0.8099 0.7436 0.29 0.4173
0.5016 0.1085 100 0.4501 0.8216 0.8333 0.3 0.4412
0.5298 0.1139 105 0.4345 0.8357 0.8125 0.39 0.5270
0.4018 0.1194 110 0.4256 0.8498 0.8 0.48 0.6
0.4767 0.1248 115 0.4180 0.8638 0.7692 0.6 0.6742
0.3461 0.1302 120 0.4038 0.8615 0.7662 0.59 0.6667
0.4099 0.1356 125 0.3915 0.8638 0.7838 0.58 0.6667
0.4437 0.1411 130 0.3839 0.8615 0.7662 0.59 0.6667
0.5666 0.1465 135 0.3759 0.8638 0.7692 0.6 0.6742
0.4014 0.1519 140 0.3721 0.8615 0.7412 0.63 0.6811
0.481 0.1574 145 0.3649 0.8709 0.8 0.6 0.6857
0.4848 0.1628 150 0.3642 0.8662 0.7945 0.58 0.6705
0.3549 0.1682 155 0.3610 0.8685 0.7895 0.6 0.6818
0.4809 0.1736 160 0.3606 0.8638 0.7692 0.6 0.6742
0.4665 0.1791 165 0.3536 0.8615 0.7662 0.59 0.6667
0.4774 0.1845 170 0.3541 0.8568 0.7468 0.59 0.6592
0.3454 0.1899 175 0.3516 0.8592 0.7439 0.61 0.6703
0.3102 0.1953 180 0.3500 0.8638 0.7692 0.6 0.6742
0.2953 0.2008 185 0.3473 0.8662 0.8028 0.57 0.6667
0.4097 0.2062 190 0.3501 0.8709 0.8169 0.58 0.6784
0.3379 0.2116 195 0.3466 0.8638 0.7625 0.61 0.6778
0.3081 0.2170 200 0.3466 0.8662 0.7792 0.6 0.6780
0.3262 0.2225 205 0.3493 0.8685 0.7821 0.61 0.6854
0.3463 0.2279 210 0.3461 0.8662 0.7529 0.64 0.6919
0.402 0.2333 215 0.3423 0.8662 0.7416 0.66 0.6984
0.3456 0.2387 220 0.3401 0.8709 0.7778 0.63 0.6961
0.3949 0.2442 225 0.3384 0.8826 0.8378 0.62 0.7126
0.4872 0.2496 230 0.3386 0.8826 0.8205 0.64 0.7191
0.3817 0.2550 235 0.3359 0.8803 0.8101 0.64 0.7151
0.3366 0.2604 240 0.3328 0.8803 0.8101 0.64 0.7151
0.3824 0.2659 245 0.3286 0.8803 0.8182 0.63 0.7119
0.325 0.2713 250 0.3290 0.8732 0.7875 0.63 0.7
0.2943 0.2767 255 0.3264 0.8685 0.7821 0.61 0.6854
0.3434 0.2821 260 0.3275 0.8732 0.7875 0.63 0.7
0.3224 0.2876 265 0.3280 0.8779 0.7857 0.66 0.7174
0.3386 0.2930 270 0.3257 0.8826 0.8289 0.63 0.7159
0.304 0.2984 275 0.3238 0.8826 0.8125 0.65 0.7222
0.4793 0.3039 280 0.3250 0.8850 0.8228 0.65 0.7263
0.4138 0.3093 285 0.3238 0.8850 0.8312 0.64 0.7232
0.3852 0.3147 290 0.3237 0.8897 0.8272 0.67 0.7403
0.3196 0.3201 295 0.3240 0.8850 0.8 0.68 0.7351
0.4023 0.3256 300 0.3214 0.8897 0.8533 0.64 0.7314
0.3974 0.3310 305 0.3214 0.8944 0.8667 0.65 0.7429
0.2933 0.3364 310 0.3207 0.8967 0.85 0.68 0.7556
0.3391 0.3418 315 0.3151 0.8991 0.8608 0.68 0.7598
0.2682 0.3473 320 0.3155 0.9014 0.8452 0.71 0.7717
0.4006 0.3527 325 0.3179 0.8944 0.8090 0.72 0.7619
0.3129 0.3581 330 0.3144 0.8967 0.8415 0.69 0.7582
0.3386 0.3635 335 0.3114 0.8920 0.8553 0.65 0.7386
0.3413 0.3690 340 0.3116 0.8920 0.8649 0.64 0.7356
0.3122 0.3744 345 0.3118 0.8920 0.8553 0.65 0.7386
0.3871 0.3798 350 0.3110 0.8920 0.8375 0.67 0.7444
0.2153 0.3852 355 0.3101 0.8944 0.8481 0.67 0.7486
0.3732 0.3907 360 0.3098 0.8920 0.8553 0.65 0.7386
0.3858 0.3961 365 0.3082 0.8944 0.8395 0.68 0.7514
0.2806 0.4015 370 0.3079 0.8944 0.8767 0.64 0.7399
0.3634 0.4069 375 0.3081 0.8944 0.8571 0.66 0.7458
0.3554 0.4124 380 0.3134 0.8920 0.8375 0.67 0.7444
0.35 0.4178 385 0.3092 0.8944 0.8481 0.67 0.7486
0.369 0.4232 390 0.3090 0.8897 0.8841 0.61 0.7219
0.3476 0.4286 395 0.3047 0.8897 0.8630 0.63 0.7283
0.3525 0.4341 400 0.3073 0.8850 0.7802 0.71 0.7435
0.3312 0.4395 405 0.3042 0.8826 0.7907 0.68 0.7312
0.3379 0.4449 410 0.2998 0.8920 0.8375 0.67 0.7444
0.3113 0.4504 415 0.2999 0.9014 0.8816 0.67 0.7614
0.3772 0.4558 420 0.2991 0.8967 0.85 0.68 0.7556
0.3598 0.4612 425 0.3028 0.8873 0.8023 0.69 0.7419
0.3031 0.4666 430 0.2989 0.8967 0.8590 0.67 0.7528
0.3537 0.4721 435 0.2986 0.8991 0.8904 0.65 0.7514
0.3188 0.4775 440 0.2993 0.8991 0.8904 0.65 0.7514
0.2694 0.4829 445 0.2999 0.8991 0.88 0.66 0.7543
0.2925 0.4883 450 0.3000 0.8920 0.8857 0.62 0.7294
0.2575 0.4938 455 0.3008 0.8873 0.8714 0.61 0.7176
0.2553 0.4992 460 0.2954 0.8897 0.8533 0.64 0.7314
0.2947 0.5046 465 0.2991 0.8944 0.8235 0.7 0.7568
0.1847 0.5100 470 0.2949 0.8944 0.8667 0.65 0.7429
0.3747 0.5155 475 0.2963 0.8920 0.8649 0.64 0.7356
0.2153 0.5209 480 0.2954 0.8944 0.8667 0.65 0.7429
0.283 0.5263 485 0.2911 0.8967 0.85 0.68 0.7556
0.255 0.5317 490 0.2924 0.8944 0.8313 0.69 0.7541
0.3864 0.5372 495 0.2920 0.9014 0.8625 0.69 0.7667
0.3019 0.5426 500 0.2953 0.8920 0.8214 0.69 0.75
0.3855 0.5480 505 0.2941 0.8991 0.8608 0.68 0.7598
0.2762 0.5534 510 0.2938 0.8967 0.8784 0.65 0.7471
0.3806 0.5589 515 0.2970 0.8944 0.8767 0.64 0.7399
0.2644 0.5643 520 0.2951 0.8991 0.8904 0.65 0.7514
0.4142 0.5697 525 0.2952 0.9014 0.8625 0.69 0.7667
0.2897 0.5751 530 0.3044 0.8756 0.7474 0.71 0.7282
0.2989 0.5806 535 0.2940 0.8991 0.8701 0.67 0.7571
0.3252 0.5860 540 0.2947 0.9014 0.8919 0.66 0.7586
0.3944 0.5914 545 0.2938 0.8967 0.8590 0.67 0.7528
0.2853 0.5969 550 0.2991 0.8920 0.8293 0.68 0.7473
0.2416 0.6023 555 0.2970 0.8991 0.8701 0.67 0.7571
0.4176 0.6077 560 0.3034 0.8944 0.8767 0.64 0.7399
0.3137 0.6131 565 0.2977 0.8967 0.8784 0.65 0.7471
0.3712 0.6186 570 0.2952 0.9014 0.8625 0.69 0.7667
0.2587 0.6240 575 0.2931 0.8991 0.8701 0.67 0.7571
0.3369 0.6294 580 0.2975 0.8967 0.8889 0.64 0.7442
0.3566 0.6348 585 0.2960 0.8991 0.8904 0.65 0.7514
0.316 0.6403 590 0.2939 0.8967 0.8684 0.66 0.75
0.321 0.6457 595 0.2914 0.9014 0.8718 0.68 0.7640
0.3272 0.6511 600 0.2931 0.8991 0.8701 0.67 0.7571
0.4241 0.6565 605 0.2937 0.9014 0.8718 0.68 0.7640
0.2708 0.6620 610 0.2928 0.9014 0.8625 0.69 0.7667
0.3422 0.6674 615 0.2930 0.9038 0.8734 0.69 0.7709
0.2884 0.6728 620 0.2945 0.8967 0.8684 0.66 0.75
0.247 0.6782 625 0.2931 0.8967 0.8684 0.66 0.75
0.3017 0.6837 630 0.2915 0.8967 0.8684 0.66 0.75
0.301 0.6891 635 0.2890 0.8967 0.8684 0.66 0.75
0.4131 0.6945 640 0.2909 0.8967 0.8415 0.69 0.7582
0.3617 0.6999 645 0.2936 0.8991 0.8353 0.71 0.7676
0.3389 0.7054 650 0.2929 0.8967 0.8182 0.72 0.7660
0.3274 0.7108 655 0.2903 0.9061 0.8659 0.71 0.7802
0.295 0.7162 660 0.2898 0.8967 0.8684 0.66 0.75
0.1267 0.7216 665 0.2917 0.9014 0.8919 0.66 0.7586
0.3636 0.7271 670 0.2919 0.9038 0.9041 0.66 0.7630
0.3415 0.7325 675 0.2926 0.9038 0.9041 0.66 0.7630
0.3356 0.7379 680 0.2914 0.9014 0.8919 0.66 0.7586
0.2784 0.7434 685 0.2861 0.9014 0.8718 0.68 0.7640
0.3187 0.7488 690 0.2856 0.9014 0.8537 0.7 0.7692
0.249 0.7542 695 0.2854 0.9038 0.8554 0.71 0.7760
0.3047 0.7596 700 0.2856 0.9061 0.8571 0.72 0.7826
0.358 0.7651 705 0.2851 0.9014 0.8537 0.7 0.7692
0.3681 0.7705 710 0.2846 0.8991 0.8608 0.68 0.7598
0.2785 0.7759 715 0.2854 0.8944 0.8571 0.66 0.7458
0.2938 0.7813 720 0.2862 0.8967 0.8784 0.65 0.7471
0.322 0.7868 725 0.2866 0.8967 0.8784 0.65 0.7471
0.3773 0.7922 730 0.2845 0.8944 0.8571 0.66 0.7458
0.3124 0.7976 735 0.2857 0.8991 0.8608 0.68 0.7598
0.1688 0.8030 740 0.2866 0.8991 0.8519 0.69 0.7624
0.389 0.8085 745 0.2853 0.8944 0.8481 0.67 0.7486
0.2116 0.8139 750 0.2861 0.8967 0.8590 0.67 0.7528
0.2834 0.8193 755 0.2864 0.8967 0.8590 0.67 0.7528
0.2635 0.8247 760 0.2852 0.8991 0.8701 0.67 0.7571
0.3773 0.8302 765 0.2856 0.9014 0.8816 0.67 0.7614
0.2997 0.8356 770 0.2855 0.8991 0.8701 0.67 0.7571
0.3151 0.8410 775 0.2861 0.8991 0.8701 0.67 0.7571
0.3008 0.8464 780 0.2863 0.8991 0.88 0.66 0.7543
0.3162 0.8519 785 0.2852 0.8991 0.8701 0.67 0.7571
0.2638 0.8573 790 0.2848 0.9014 0.8816 0.67 0.7614
0.4227 0.8627 795 0.2849 0.9014 0.8816 0.67 0.7614
0.2635 0.8681 800 0.2856 0.8991 0.8701 0.67 0.7571
0.2781 0.8736 805 0.2841 0.8991 0.8701 0.67 0.7571
0.42 0.8790 810 0.2845 0.8991 0.8701 0.67 0.7571
0.4196 0.8844 815 0.2849 0.8991 0.8701 0.67 0.7571
0.2923 0.8899 820 0.2830 0.8991 0.8701 0.67 0.7571
0.2545 0.8953 825 0.2843 0.8967 0.8590 0.67 0.7528
0.4537 0.9007 830 0.2848 0.8991 0.8701 0.67 0.7571
0.2664 0.9061 835 0.2847 0.8991 0.8701 0.67 0.7571
0.292 0.9116 840 0.2842 0.8991 0.8701 0.67 0.7571
0.453 0.9170 845 0.2847 0.8967 0.8590 0.67 0.7528
0.3671 0.9224 850 0.2833 0.8991 0.8701 0.67 0.7571
0.2909 0.9278 855 0.2847 0.8967 0.8590 0.67 0.7528
0.3266 0.9333 860 0.2837 0.8944 0.8481 0.67 0.7486
0.2051 0.9387 865 0.2838 0.8991 0.8701 0.67 0.7571
0.2648 0.9441 870 0.2839 0.8991 0.8701 0.67 0.7571
0.3606 0.9495 875 0.2837 0.8991 0.8701 0.67 0.7571
0.2971 0.9550 880 0.2833 0.8991 0.8701 0.67 0.7571
0.282 0.9604 885 0.2826 0.8967 0.8590 0.67 0.7528
0.3994 0.9658 890 0.2838 0.8967 0.8590 0.67 0.7528
0.2317 0.9712 895 0.2845 0.8967 0.8590 0.67 0.7528
0.2784 0.9767 900 0.2836 0.8967 0.8590 0.67 0.7528
0.3679 0.9821 905 0.2828 0.8991 0.8701 0.67 0.7571
0.2623 0.9875 910 0.2839 0.8967 0.8590 0.67 0.7528
0.2232 0.9929 915 0.2838 0.8991 0.8701 0.67 0.7571
0.3868 0.9984 920 0.2838 0.8991 0.8701 0.67 0.7571
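
For readers checking the numbers, the reported F1 is the harmonic mean of precision and recall; for example, the final-row values above reproduce the headline F1:

```python
# F1 as the harmonic mean of precision and recall, using the final-row values.
precision, recall = 0.8701, 0.67
f1 = 2 * precision * recall / (precision + recall)
print(f"{f1:.4f}")  # 0.7571, matching the reported F1
```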

Framework versions

  • PEFT 0.12.0
  • Transformers 4.46.0
  • Pytorch 2.4.0+cu118
  • Datasets 3.0.0
  • Tokenizers 0.20.1