Column                  | Dtype   | Values / range
eval_name               | string  | length 12 – 111
Precision               | string  | 3 classes
Type                    | string  | 6 classes
T                       | string  | 6 classes
Weight type             | string  | 2 classes
Architecture            | string  | 48 classes
Model                   | string  | length 355 – 650
fullname                | string  | length 4 – 102
Model sha               | string  | length 0 – 40
Average ⬆️              | float64 | 1.41 – 51.2
Hub License             | string  | 25 classes
Hub ❤️                  | int64   | 0 – 5.84k
#Params (B)             | int64   | -1 – 140
Available on the hub    | bool    | 2 classes
Not_Merged              | bool    | 2 classes
MoE                     | bool    | 2 classes
Flagged                 | bool    | 1 class
Chat Template           | bool    | 2 classes
CO₂ cost (kg)           | float64 | 0.04 – 107
IFEval Raw              | float64 | 0 – 0.87
IFEval                  | float64 | 0 – 86.7
BBH Raw                 | float64 | 0.28 – 0.75
BBH                     | float64 | 0.81 – 63.5
MATH Lvl 5 Raw          | float64 | 0 – 0.51
MATH Lvl 5              | float64 | 0 – 50.7
GPQA Raw                | float64 | 0.22 – 0.44
GPQA                    | float64 | 0 – 24.9
MUSR Raw                | float64 | 0.29 – 0.59
MUSR                    | float64 | 0 – 36.4
MMLU-PRO Raw            | float64 | 0.1 – 0.7
MMLU-PRO                | float64 | 0 – 66.8
Maintainer's Highlight  | bool    | 2 classes
Upload To Hub Date      | string  | length 0 – 10
Submission Date         | string  | 151 classes
Generation              | int64   | 0 – 6
Base Model              | string  | length 4 – 102
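Below the schema, each record flattens one leaderboard row in the column order above; fields without a value (for example Hub License or Submission Date on some entries) appear to be omitted rather than printed as empty. A minimal sketch of how rows with this schema can be loaded and inspected with the `datasets` library, assuming the dump corresponds to the Open LLM Leaderboard contents dataset (the repo id `open-llm-leaderboard/contents` and the `train` split are assumptions; substitute the actual source if it differs):

```python
# Minimal sketch: load the leaderboard table and check it against the schema above.
# Assumption: the records come from the "open-llm-leaderboard/contents" dataset;
# replace the repo id if the dump was taken from a different source.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")

# Column names and dtypes should line up with the schema table above.
print(ds.features)

# Peek at a single record by column name.
row = ds[0]
print(row["fullname"], row["Architecture"], row["#Params (B)"], row["Average ⬆️"])
```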
LeroyDyer__Spydaz_Web_AI_12_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/_Spydaz_Web_AI_12" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/_Spydaz_Web_AI_12</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer___Spydaz_Web_AI_12-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/_Spydaz_Web_AI_12
675cf7fbfa36974b2eb5aef53afdf56a65ecfcfd
6.451513
0
7
false
true
true
false
false
0.648525
0.276499
27.649858
0.31634
4.495994
0.003776
0.377644
0.268456
2.46085
0.358156
2.202865
0.113697
1.521868
false
2024-09-19
0
Removed
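Under the same contents-dataset assumption as above, a short sketch recovers a tabular view of records like the one above and ranks models by the Average ⬆️ column:

```python
# Sketch: tabular view of the flattened records, ranked by aggregate score.
# Assumes the same "open-llm-leaderboard/contents" source as the previous example.
from datasets import load_dataset

# to_pandas() returns a regular pandas DataFrame with the columns listed in the schema.
df = load_dataset("open-llm-leaderboard/contents", split="train").to_pandas()

# Keep rows still available on the Hub, then rank by the aggregate score.
available = df[df["Available on the hub"]]
top = available.sort_values("Average ⬆️", ascending=False)
print(top[["fullname", "Architecture", "#Params (B)", "Average ⬆️"]].head(10))
```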
LeroyDyer__Spydaz_Web_AI_14_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/_Spydaz_Web_AI_14" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/_Spydaz_Web_AI_14</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer___Spydaz_Web_AI_14-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/_Spydaz_Web_AI_14
53e73726a0a780db48303f4befbf7574e5c04984
4.202131
0
7
false
true
true
false
true
0.621351
0.181177
18.117705
0.298885
2.1624
0.001511
0.151057
0.26594
2.12528
0.339521
1.106771
0.113946
1.549572
false
2024-09-23
0
Removed
LeroyDyer__Spydaz_Web_AI_BIBLE_002_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/_Spydaz_Web_AI_BIBLE_002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/_Spydaz_Web_AI_BIBLE_002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer___Spydaz_Web_AI_BIBLE_002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/_Spydaz_Web_AI_BIBLE_002
f47113e6352f4df8c50e9e571fc85cd7a154a07f
6.760197
apache-2.0
2
7
true
true
true
false
false
0.65325
0.219495
21.949538
0.328907
6.34958
0.011329
1.132931
0.284396
4.58613
0.340698
2.453906
0.136802
4.089096
false
2024-09-10
2024-09-14
0
LeroyDyer/_Spydaz_Web_AI_BIBLE_002
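In these records the Average ⬆️ value matches the arithmetic mean of the six normalized benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO); for the LeroyDyer/_Spydaz_Web_AI_BIBLE_002 record above, (21.949538 + 6.34958 + 1.132931 + 4.58613 + 2.453906 + 4.089096) / 6 ≈ 6.760197. A small sketch that recomputes it:

```python
# Sketch: recompute Average ⬆️ as the mean of the six normalized benchmark
# scores, using the LeroyDyer/_Spydaz_Web_AI_BIBLE_002 record above.
scores = {
    "IFEval": 21.949538,
    "BBH": 6.34958,
    "MATH Lvl 5": 1.132931,
    "GPQA": 4.58613,
    "MUSR": 2.453906,
    "MMLU-PRO": 4.089096,
}
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 6.760197, matching the record's Average ⬆️ field
```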
LeroyDyer__Spydaz_Web_AI_ChatML_002_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/_Spydaz_Web_AI_ChatML_002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/_Spydaz_Web_AI_ChatML_002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer___Spydaz_Web_AI_ChatML_002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/_Spydaz_Web_AI_ChatML_002
9475af8113cf4027839974283b702d6be502f7fa
5.526904
0
7
false
true
true
false
true
0.640805
0.241228
24.122772
0.310638
4.191974
0
0
0.25755
1.006711
0.362313
2.789063
0.109458
1.050901
false
2024-09-01
0
Removed
LeroyDyer__Spydaz_Web_AI_ChatQA_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/_Spydaz_Web_AI_ChatQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/_Spydaz_Web_AI_ChatQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer___Spydaz_Web_AI_ChatQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/_Spydaz_Web_AI_ChatQA
9f86dd12d4c75e0290aa3084a44cf111bc975144
4.951488
0
7
false
true
true
false
false
0.582079
0.141459
14.145911
0.323595
5.599562
0
0
0.26594
2.12528
0.344729
2.557813
0.147523
5.280363
false
2024-08-06
0
Removed
LeroyDyer__Spydaz_Web_AI_ChatQA_003_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/_Spydaz_Web_AI_ChatQA_003" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/_Spydaz_Web_AI_ChatQA_003</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer___Spydaz_Web_AI_ChatQA_003-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/_Spydaz_Web_AI_ChatQA_003
6.131679
0
7
false
true
true
false
false
0.629813
0.220919
22.091938
0.317181
4.293436
0.003021
0.302115
0.270973
2.796421
0.381844
5.830469
0.113281
1.475694
false
2024-09-14
0
Removed
Lil-R_2_PRYMMAL-ECE-2B-SLERP-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lil-R/2_PRYMMAL-ECE-2B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lil-R/2_PRYMMAL-ECE-2B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lil-R__2_PRYMMAL-ECE-2B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lil-R/2_PRYMMAL-ECE-2B-SLERP-V1
fda2d7dd2d797726ebd34cee88095e0ae6b0b093
21.173037
apache-2.0
0
2
true
false
true
false
false
2.42416
0.582346
58.234595
0.428707
19.534911
0.092145
9.214502
0.306208
7.494407
0.437469
13.916927
0.267786
18.642878
false
2024-11-05
2024-11-07
1
Lil-R/2_PRYMMAL-ECE-2B-SLERP-V1 (Merge)
Lil-R_2_PRYMMAL-ECE-2B-SLERP-V2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lil-R/2_PRYMMAL-ECE-2B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lil-R/2_PRYMMAL-ECE-2B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lil-R__2_PRYMMAL-ECE-2B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lil-R/2_PRYMMAL-ECE-2B-SLERP-V2
7e55d63df09ec396f39adcc426a91f2e74606bd0
21.036189
apache-2.0
0
2
true
false
true
false
false
1.289832
0.554269
55.426934
0.437647
20.197377
0.092145
9.214502
0.297819
6.375839
0.448167
15.620833
0.274435
19.381649
false
2024-11-05
2024-11-05
1
Lil-R/2_PRYMMAL-ECE-2B-SLERP-V2 (Merge)
Lil-R_2_PRYMMAL-ECE-7B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lil-R/2_PRYMMAL-ECE-7B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lil-R/2_PRYMMAL-ECE-7B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lil-R__2_PRYMMAL-ECE-7B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lil-R/2_PRYMMAL-ECE-7B-SLERP
fdc2ac0da72ad62ecc9677cdac32dd097bc99c3a
30.798005
apache-2.0
0
7
true
false
true
false
false
0.685049
0.559265
55.926497
0.555664
36.48257
0.318731
31.873112
0.310403
8.053691
0.439604
13.483854
0.450715
38.968307
false
2024-11-04
2024-11-04
1
Lil-R/2_PRYMMAL-ECE-7B-SLERP (Merge)
Lil-R_2_PRYMMAL-ECE-7B-SLERP-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lil-R/2_PRYMMAL-ECE-7B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lil-R/2_PRYMMAL-ECE-7B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lil-R__2_PRYMMAL-ECE-7B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lil-R/2_PRYMMAL-ECE-7B-SLERP-V1
1f9b9683053a13b9f5c3863a6de53d9e14a2e6c5
3.720413
apache-2.0
0
7
true
true
true
false
false
1.2496
0.107337
10.733742
0.305258
2.784018
0
0
0.250839
0.111857
0.391083
7.31875
0.112367
1.374113
false
2024-10-28
2024-10-28
0
Lil-R/2_PRYMMAL-ECE-7B-SLERP-V1
Lil-R_2_PRYMMAL-ECE-7B-SLERP-V2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lil-R/2_PRYMMAL-ECE-7B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lil-R/2_PRYMMAL-ECE-7B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lil-R__2_PRYMMAL-ECE-7B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lil-R/2_PRYMMAL-ECE-7B-SLERP-V2
d633d064bcd8723da5b2337048cee1079e745766
3.720413
apache-2.0
0
7
true
true
true
false
false
1.275211
0.107337
10.733742
0.305258
2.784018
0
0
0.250839
0.111857
0.391083
7.31875
0.112367
1.374113
false
2024-10-28
2024-10-28
0
Lil-R/2_PRYMMAL-ECE-7B-SLERP-V2
Lil-R_2_PRYMMAL-ECE-7B-SLERP-V3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lil-R/2_PRYMMAL-ECE-7B-SLERP-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lil-R/2_PRYMMAL-ECE-7B-SLERP-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lil-R__2_PRYMMAL-ECE-7B-SLERP-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lil-R/2_PRYMMAL-ECE-7B-SLERP-V3
691d98e52b8136355cf3884a4c29968bf0fc6dcf
8.778022
0
7
false
true
true
false
false
1.285801
0.223467
22.346707
0.35784
10.612229
0
0
0.256711
0.894855
0.410708
9.738542
0.181682
9.075798
false
2024-10-28
2024-10-28
1
Lil-R/2_PRYMMAL-ECE-7B-SLERP-V3 (Merge)
Lil-R_PRYMMAL-ECE-1B-SLERP-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lil-R/PRYMMAL-ECE-1B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lil-R/PRYMMAL-ECE-1B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lil-R__PRYMMAL-ECE-1B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lil-R/PRYMMAL-ECE-1B-SLERP-V1
5770824fbfc2f9df22f6a1442e1392b029e333ec
14.404948
apache-2.0
0
1
true
false
true
false
false
0.647309
0.28744
28.743955
0.419045
17.999676
0.074773
7.477341
0.276007
3.467562
0.397437
7.346354
0.292553
21.394799
false
2024-10-29
2024-10-29
1
Lil-R/PRYMMAL-ECE-1B-SLERP-V1 (Merge)
Lil-R_PRYMMAL-ECE-7B-SLERP-V8_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lil-R/PRYMMAL-ECE-7B-SLERP-V8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lil-R/PRYMMAL-ECE-7B-SLERP-V8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lil-R__PRYMMAL-ECE-7B-SLERP-V8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lil-R/PRYMMAL-ECE-7B-SLERP-V8
19fa915c941013075673c2943e2d06d131afcfef
3.222584
apache-2.0
0
7
true
true
true
false
false
1.278111
0.125847
12.58472
0.295509
2.270601
0
0
0.25
0
0.363146
3.059896
0.112783
1.420287
false
2024-10-28
2024-10-28
0
Lil-R/PRYMMAL-ECE-7B-SLERP-V8
LilRg_10PRYMMAL-3B-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/LilRg/10PRYMMAL-3B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/10PRYMMAL-3B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__10PRYMMAL-3B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/10PRYMMAL-3B-slerp
3e0a12c2ec82e18136fc1cf1609c66154cff8a6e
20.382961
apache-2.0
0
3
true
false
true
false
false
2.742907
0.19459
19.459035
0.532038
34.877918
0.107251
10.725076
0.321309
9.50783
0.452906
15.713281
0.388132
32.014628
false
2024-09-23
2024-09-24
1
LilRg/10PRYMMAL-3B-slerp (Merge)
LilRg_ECE-1B-merge-PRYMMAL_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/LilRg/ECE-1B-merge-PRYMMAL" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/ECE-1B-merge-PRYMMAL</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__ECE-1B-merge-PRYMMAL-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/ECE-1B-merge-PRYMMAL
009c75039786c38e2a6168cf93c9a46a4d111fb9
14.346595
apache-2.0
0
1
true
false
true
false
false
0.690357
0.271228
27.122812
0.423456
19.141465
0.092145
9.214502
0.28104
4.138702
0.380104
5.279687
0.290642
21.182402
false
2024-10-07
2024-10-07
1
LilRg/ECE-1B-merge-PRYMMAL (Merge)
LilRg_ECE_Finetunning_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/LilRg/ECE_Finetunning" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/ECE_Finetunning</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__ECE_Finetunning-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/ECE_Finetunning
8d10549bcf802355f2d6203a33ed27e81b15b9e5
11.911504
apache-2.0
0
16
true
true
true
false
false
1.770188
0.044538
4.453849
0.473216
26.530835
0.040785
4.07855
0.282718
4.362416
0.383948
7.69349
0.319149
24.349882
false
2024-09-28
2024-09-28
3
meta-llama/Meta-Llama-3.1-8B
LilRg_PRYMMAL-6B-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LilRg/PRYMMAL-6B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/PRYMMAL-6B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__PRYMMAL-6B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/PRYMMAL-6B-slerp
1ce0f5fdaae6a7866eda77df18378e9b5621af65
3.232706
apache-2.0
0
3
true
false
true
false
false
0.348082
0.115331
11.533066
0.286762
2.212431
0
0
0.245805
0
0.36975
4.452083
0.110788
1.198655
false
2024-09-24
2024-09-24
1
LilRg/PRYMMAL-6B-slerp (Merge)
LilRg_PRYMMAL-ECE-7B-SLERP-V3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/LilRg/PRYMMAL-ECE-7B-SLERP-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/PRYMMAL-ECE-7B-SLERP-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__PRYMMAL-ECE-7B-SLERP-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/PRYMMAL-ECE-7B-SLERP-V3
742eee22ab39880acb8650b7290d420065d0514b
3.370301
0
7
false
true
true
false
false
1.281127
0.124323
12.432346
0.295724
2.290323
0
0
0.256711
0.894855
0.367146
3.193229
0.112699
1.411052
false
2024-10-26
2024-10-26
1
LilRg/PRYMMAL-ECE-7B-SLERP-V3 (Merge)
LilRg_PRYMMAL-ECE-7B-SLERP-V4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/LilRg/PRYMMAL-ECE-7B-SLERP-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/PRYMMAL-ECE-7B-SLERP-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__PRYMMAL-ECE-7B-SLERP-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/PRYMMAL-ECE-7B-SLERP-V4
5a45274282197dcce0f22b442f65df14ac75f507
3.380293
0
7
false
true
true
false
false
1.30571
0.124923
12.492298
0.295724
2.290323
0
0
0.256711
0.894855
0.367146
3.193229
0.112699
1.411052
false
2024-10-26
2024-10-26
1
LilRg/PRYMMAL-ECE-7B-SLERP-V4 (Merge)
LilRg_PRYMMAL-ECE-7B-SLERP-V5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/LilRg/PRYMMAL-ECE-7B-SLERP-V5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/PRYMMAL-ECE-7B-SLERP-V5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__PRYMMAL-ECE-7B-SLERP-V5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/PRYMMAL-ECE-7B-SLERP-V5
63f1ed2c963e3cb78d6c6a89836e0712aa7c3a6f
3.380293
0
7
false
true
true
false
false
1.286425
0.124923
12.492298
0.295724
2.290323
0
0
0.256711
0.894855
0.367146
3.193229
0.112699
1.411052
false
2024-10-26
2024-10-26
1
LilRg/PRYMMAL-ECE-7B-SLERP-V5 (Merge)
LilRg_PRYMMAL-ECE-7B-SLERP-V6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/LilRg/PRYMMAL-ECE-7B-SLERP-V6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/PRYMMAL-ECE-7B-SLERP-V6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__PRYMMAL-ECE-7B-SLERP-V6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/PRYMMAL-ECE-7B-SLERP-V6
92a8c865ef44974d0bafd22c1f991afe7889717b
3.370301
0
7
false
true
true
false
false
1.247841
0.124323
12.432346
0.295724
2.290323
0
0
0.256711
0.894855
0.367146
3.193229
0.112699
1.411052
false
2024-10-26
2024-10-26
1
LilRg/PRYMMAL-ECE-7B-SLERP-V6 (Merge)
LilRg_PRYMMAL-ECE-7B-SLERP-V7_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/LilRg/PRYMMAL-ECE-7B-SLERP-V7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/PRYMMAL-ECE-7B-SLERP-V7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__PRYMMAL-ECE-7B-SLERP-V7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/PRYMMAL-ECE-7B-SLERP-V7
834363d4b420f85ff1af920a68149240c580726c
3.380293
0
7
false
true
true
false
false
1.27462
0.124923
12.492298
0.295724
2.290323
0
0
0.256711
0.894855
0.367146
3.193229
0.112699
1.411052
false
2024-10-26
2024-10-26
1
LilRg/PRYMMAL-ECE-7B-SLERP-V7 (Merge)
LilRg_PRYMMAL-slerp-Merge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/LilRg/PRYMMAL-slerp-Merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LilRg/PRYMMAL-slerp-Merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LilRg__PRYMMAL-slerp-Merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LilRg/PRYMMAL-slerp-Merge
e5597549ceb5afe56428097cb297326537d07c3e
22.382841
apache-2.0
0
3
true
false
true
false
false
1.417514
0.3044
30.44001
0.536416
35.553776
0.098943
9.89426
0.32047
9.395973
0.463479
17.201563
0.386303
31.811466
false
2024-09-24
2024-09-24
1
LilRg/PRYMMAL-slerp-Merge (Merge)
LimYeri_CodeMind-Llama3-8B-unsloth_v2-merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LimYeri/CodeMind-Llama3-8B-unsloth_v2-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LimYeri/CodeMind-Llama3-8B-unsloth_v2-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LimYeri__CodeMind-Llama3-8B-unsloth_v2-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LimYeri/CodeMind-Llama3-8B-unsloth_v2-merged
d4ec745f8279e3ac6d41709153c21cc077e66385
22.409967
apache-2.0
0
8
true
true
true
false
true
0.826625
0.694628
69.462803
0.486009
26.655629
0.062689
6.268882
0.265101
2.013423
0.331615
2.21849
0.350565
27.840573
false
2024-06-04
2024-08-28
1
unsloth/llama-3-8b-Instruct-bnb-4bit
LimYeri_CodeMind-Llama3-8B-unsloth_v3-merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LimYeri/CodeMind-Llama3-8B-unsloth_v3-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LimYeri/CodeMind-Llama3-8B-unsloth_v3-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LimYeri__CodeMind-Llama3-8B-unsloth_v3-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LimYeri/CodeMind-Llama3-8B-unsloth_v3-merged
548a221b00d8056fe7090f5e6e0af58ee7c62563
21.845773
0
8
false
true
true
false
true
0.836281
0.676293
67.629335
0.490816
27.025216
0.064199
6.41994
0.258389
1.118568
0.335615
1.151823
0.349568
27.729758
false
2024-08-28
0
Removed
LimYeri_CodeMind-Llama3-8B-unsloth_v4-one-DPO-merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-DPO-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-DPO-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LimYeri__CodeMind-Llama3-8B-unsloth_v4-one-DPO-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-DPO-merged
e21e4932c56cebd3f9816bf083c1792cdccbe7a7
21.749254
apache-2.0
0
8
true
true
true
false
true
0.732312
0.649241
64.924068
0.485266
26.370177
0.070242
7.024169
0.268456
2.46085
0.360792
3.565625
0.335356
26.150635
false
2024-06-07
2024-08-28
1
unsloth/llama-3-8b-Instruct-bnb-4bit
LimYeri_CodeMind-Llama3-8B-unsloth_v4-one-merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LimYeri__CodeMind-Llama3-8B-unsloth_v4-one-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LimYeri/CodeMind-Llama3-8B-unsloth_v4-one-merged
9c8939ccdc10beee56462eadbc16e28359a6d4c4
17.613246
apache-2.0
0
8
true
true
true
false
false
0.887142
0.321087
32.108694
0.473876
24.574735
0.055136
5.513595
0.309564
7.941834
0.406927
9.399219
0.335273
26.141401
false
2024-06-06
2024-08-28
1
unsloth/llama-3-8b-Instruct-bnb-4bit
LimYeri_CodeMind-Llama3.1-8B-unsloth-merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LimYeri/CodeMind-Llama3.1-8B-unsloth-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LimYeri/CodeMind-Llama3.1-8B-unsloth-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LimYeri__CodeMind-Llama3.1-8B-unsloth-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LimYeri/CodeMind-Llama3.1-8B-unsloth-merged
911ffe6614d23bfc9cb7ece0cd3afd861a65d7f0
22.254755
mit
0
8
true
true
true
false
true
0.833687
0.649016
64.901572
0.469478
24.185739
0.104985
10.498489
0.264262
1.901566
0.37524
6.038281
0.334026
26.002881
false
2024-08-31
2024-08-31
2
unsloth/Meta-Llama-3.1-8B
Locutusque_Hercules-6.0-Llama-3.1-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Locutusque/Hercules-6.0-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Locutusque/Hercules-6.0-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Locutusque__Hercules-6.0-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Locutusque/Hercules-6.0-Llama-3.1-8B
f35a95aeabf9f82bbd64bfc6fd0d857df750ee83
23.483546
llama3.1
8
8
true
true
true
false
true
0.885143
0.663004
66.300416
0.48133
26.639652
0.14577
14.577039
0.264262
1.901566
0.362125
2.432292
0.361453
29.05031
false
2024-09-25
2024-09-26
0
Locutusque/Hercules-6.0-Llama-3.1-8B
Locutusque_Hercules-6.1-Llama-3.1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Locutusque/Hercules-6.1-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Locutusque/Hercules-6.1-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Locutusque__Hercules-6.1-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Locutusque/Hercules-6.1-Llama-3.1-8B
f4abf4385111b4acbea8bee2c6636ef84b2dac43
22.609956
llama3.1
7
8
true
true
true
false
true
0.956803
0.600681
60.068064
0.465624
24.151873
0.169184
16.918429
0.260906
1.454139
0.355333
3.416667
0.366855
29.650561
false
2024-09-30
2024-10-01
0
Locutusque/Hercules-6.1-Llama-3.1-8B
Locutusque_Llama-3-NeuralHercules-5.0-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Locutusque/Llama-3-NeuralHercules-5.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Locutusque/Llama-3-NeuralHercules-5.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Locutusque__Llama-3-NeuralHercules-5.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Locutusque/Llama-3-NeuralHercules-5.0-8B
2bbb675e592a1772f2389fe2d58a5b610d479d94
15.992123
llama3
2
8
true
true
true
false
true
0.890715
0.448931
44.893106
0.394047
16.342072
0.04003
4.003021
0.268456
2.46085
0.388073
6.775781
0.293301
21.477911
false
2024-05-28
2024-06-26
0
Locutusque/Llama-3-NeuralHercules-5.0-8B
Locutusque_Llama-3-Yggdrasil-2.0-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Locutusque/Llama-3-Yggdrasil-2.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Locutusque/Llama-3-Yggdrasil-2.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Locutusque__Llama-3-Yggdrasil-2.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Locutusque/Llama-3-Yggdrasil-2.0-8B
ec2329946ccc81a7c1ae36210728f717bc4f01d8
20.359493
1
8
false
true
true
false
true
0.814941
0.537058
53.705834
0.477246
26.922801
0.077039
7.703927
0.262584
1.677852
0.397656
8.073698
0.316656
24.072843
false
2024-06-05
2024-06-26
1
Locutusque/Llama-3-Yggdrasil-2.0-8B (Merge)
Locutusque_TinyMistral-248M-v2.5_float16
float16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Locutusque/TinyMistral-248M-v2.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Locutusque/TinyMistral-248M-v2.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Locutusque__TinyMistral-248M-v2.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Locutusque/TinyMistral-248M-v2.5
214e48aabc01235e25c67477898756f1bebef215
3.871794
apache-2.0
26
0
true
false
true
false
true
0.242214
0.133641
13.364096
0.303858
3.181881
0
0
0.250839
0.111857
0.378156
5.069531
0.113531
1.503398
false
2024-01-24
2024-09-17
0
Locutusque/TinyMistral-248M-v2.5
Luni_StarDust-12b-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Luni/StarDust-12b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Luni/StarDust-12b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Luni__StarDust-12b-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Luni/StarDust-12b-v1
91976b0c71dce1310f4a6139552e10a6149bdc31
23.183629
apache-2.0
14
12
true
true
true
false
true
1.452633
0.545926
54.592592
0.536614
34.446276
0.060423
6.042296
0.276007
3.467562
0.432448
13.75599
0.341174
26.79706
false
2024-08-29
2024-09-03
1
Luni/StarDust-12b-v1 (Merge)
Luni_StarDust-12b-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Luni/StarDust-12b-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Luni/StarDust-12b-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Luni__StarDust-12b-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Luni/StarDust-12b-v2
75bffd7b86f37c2cebc4fdf83fbc3ab33d6c6e05
24.089195
apache-2.0
31
12
true
true
true
false
true
1.532525
0.562862
56.286209
0.541948
34.952884
0.061178
6.117825
0.293624
5.816555
0.433813
14.259896
0.343916
27.101803
false
2024-09-01
2024-09-03
1
Luni/StarDust-12b-v2 (Merge)
Lyte_Llama-3.1-8B-Instruct-Reasoner-1o1_v0.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Lyte/Llama-3.1-8B-Instruct-Reasoner-1o1_v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lyte/Llama-3.1-8B-Instruct-Reasoner-1o1_v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lyte__Llama-3.1-8B-Instruct-Reasoner-1o1_v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lyte/Llama-3.1-8B-Instruct-Reasoner-1o1_v0.3
35ab483f04afa763f36f978408f4f82e0379ee25
25.262525
apache-2.0
7
8
true
true
true
false
true
0.915737
0.709816
70.981551
0.494952
27.835212
0.160876
16.087613
0.270134
2.684564
0.346125
4.898958
0.361785
29.087249
false
2024-09-17
2024-09-17
2
unsloth/Meta-Llama-3.1-8B
Lyte_Llama-3.2-1B-Instruct-COT-RL-Expriement1-EP04_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Lyte/Llama-3.2-1B-Instruct-COT-RL-Expriement1-EP04" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lyte/Llama-3.2-1B-Instruct-COT-RL-Expriement1-EP04</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lyte__Llama-3.2-1B-Instruct-COT-RL-Expriement1-EP04-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lyte/Llama-3.2-1B-Instruct-COT-RL-Expriement1-EP04
59d93307c6f2cb7a29c593cbc7393122d502d1b1
14.546603
0
1
false
true
true
false
true
0.450133
0.57735
57.735032
0.351504
8.894409
0.074018
7.401813
0.260067
1.342282
0.323552
2.54401
0.184259
9.362072
false
2024-09-26
0
Removed
Lyte_Llama-3.2-3B-Overthinker_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Lyte/Llama-3.2-3B-Overthinker" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lyte/Llama-3.2-3B-Overthinker</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lyte__Llama-3.2-3B-Overthinker-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lyte/Llama-3.2-3B-Overthinker
0e7af37fb3381365905fc2df24811c0e6d2ba5b2
19.077846
apache-2.0
19
3
true
true
true
false
true
0.73364
0.640798
64.079753
0.432009
20.095582
0.030967
3.096677
0.259228
1.230425
0.341906
3.904948
0.298537
22.059693
false
2024-10-17
2024-10-18
2
meta-llama/Llama-3.2-3B-Instruct
M4-ai_TinyMistral-248M-v3_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/M4-ai/TinyMistral-248M-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">M4-ai/TinyMistral-248M-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/M4-ai__TinyMistral-248M-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
M4-ai/TinyMistral-248M-v3
fa23fe617768c671f0bbbff1edf4556cfe844167
4.130108
apache-2.0
4
0
true
true
true
false
false
0.234184
0.163866
16.386632
0.288455
1.777554
0
0
0.240772
0
0.379333
5.15
0.113198
1.46646
false
2024-02-05
2024-10-18
0
M4-ai/TinyMistral-248M-v3
MEscriva_ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/MEscriva/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MEscriva/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MEscriva__ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MEscriva/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis
7a9d848188a674302d64a865786d4508be19571a
3.818034
0
0
false
true
true
false
false
1.051556
0.086629
8.662903
0.305729
3.237774
0.004532
0.453172
0.251678
0.223714
0.401719
8.614844
0.115442
1.715795
false
2024-11-12
2024-11-19
0
MEscriva/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis
MLP-KTLim_llama-3-Korean-Bllossom-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MLP-KTLim/llama-3-Korean-Bllossom-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MLP-KTLim/llama-3-Korean-Bllossom-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MLP-KTLim__llama-3-Korean-Bllossom-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MLP-KTLim/llama-3-Korean-Bllossom-8B
8a738f9f622ffc2b0a4a6b81dabbca80406248bf
20.333976
llama3
281
8
true
true
true
false
true
0.774721
0.51128
51.128007
0.490046
26.927528
0.098187
9.818731
0.262584
1.677852
0.367458
3.632292
0.359375
28.819444
false
2024-04-25
2024-07-09
1
MLP-KTLim/llama-3-Korean-Bllossom-8B (Merge)
MTSAIR_MultiVerse_70B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MTSAIR/MultiVerse_70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MTSAIR/MultiVerse_70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MTSAIR__MultiVerse_70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MTSAIR/MultiVerse_70B
063430cdc4d972a0884e3e3e3d45ea4afbdf71a2
32.00519
other
39
72
true
true
true
false
false
13.601817
0.524918
52.491833
0.618313
46.135899
0.178248
17.824773
0.354027
13.870246
0.47399
18.815365
0.486037
42.893026
false
2024-03-25
2024-06-29
0
MTSAIR/MultiVerse_70B
Magpie-Align_Llama-3-8B-Magpie-Align-SFT-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3-8B-Magpie-Align-SFT-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.1
1ed587f54f70334f495efb9c027acb03e96fe24f
15.928911
llama3
4
8
true
true
true
false
true
0.833569
0.436142
43.614166
0.46151
23.990124
0.055891
5.589124
0.262584
1.677852
0.32774
0
0.28632
20.702202
false
2024-06-06
2024-09-17
1
meta-llama/Meta-Llama-3-8B
Magpie-Align_Llama-3-8B-Magpie-Align-SFT-v0.3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3-8B-Magpie-Align-SFT-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.3
d2578eb754d1c20efe604749296580f680950917
17.490285
llama3
3
8
true
true
true
false
true
0.89542
0.506359
50.635868
0.457158
23.698816
0.069486
6.94864
0.26594
2.12528
0.342375
0.396875
0.290226
21.136229
false
2024-07-13
2024-08-06
1
meta-llama/Meta-Llama-3-8B
Magpie-Align_Llama-3-8B-Magpie-Align-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3-8B-Magpie-Align-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3-8B-Magpie-Align-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3-8B-Magpie-Align-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/Llama-3-8B-Magpie-Align-v0.1
a83ddac146fb2da1dd1bfa4069e336074d1439a8
16.473094
llama3
10
8
true
true
true
false
true
0.906849
0.411812
41.181177
0.481144
26.691761
0.033988
3.398792
0.275168
3.355705
0.304698
1.920573
0.300615
22.290559
false
2024-06-29
2024-07-03
2
meta-llama/Meta-Llama-3-8B
Magpie-Align_Llama-3-8B-Magpie-Align-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3-8B-Magpie-Align-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3-8B-Magpie-Align-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3-8B-Magpie-Align-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/Llama-3-8B-Magpie-Align-v0.1
a83ddac146fb2da1dd1bfa4069e336074d1439a8
16.307771
llama3
10
8
true
true
true
false
true
1.849536
0.402719
40.271923
0.478941
26.289712
0.035498
3.549849
0.276846
3.579418
0.308698
1.920573
0.300116
22.235151
false
2024-06-29
2024-07-03
2
meta-llama/Meta-Llama-3-8B
Magpie-Align_Llama-3-8B-Magpie-Align-v0.3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3-8B-Magpie-Align-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3-8B-Magpie-Align-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3-8B-Magpie-Align-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/Llama-3-8B-Magpie-Align-v0.3
7e420ddd6ff48bf213dcab2a9ddb7845b80dd1aa
16.911558
llama3
3
8
true
true
true
false
true
0.74229
0.449706
44.970567
0.456961
24.311447
0.02719
2.719033
0.265101
2.013423
0.340604
3.742188
0.313414
23.712692
false
2024-07-15
2024-08-06
2
meta-llama/Meta-Llama-3-8B
Magpie-Align_Llama-3.1-8B-Magpie-Align-SFT-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3.1-8B-Magpie-Align-SFT-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3.1-8B-Magpie-Align-SFT-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3.1-8B-Magpie-Align-SFT-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/Llama-3.1-8B-Magpie-Align-SFT-v0.1
b191916912f0e76b2bdc93c46c0af590cc87e7ae
17.975799
llama3.1
2
8
true
true
true
false
true
1.82999
0.478207
47.820671
0.476416
26.136677
0.089879
8.987915
0.260906
1.454139
0.33974
1.866667
0.294299
21.588726
false
2024-07-23
2024-09-17
1
meta-llama/Meta-Llama-3.1-8B
Magpie-Align_Llama-3.1-8B-Magpie-Align-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3.1-8B-Magpie-Align-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3.1-8B-Magpie-Align-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3.1-8B-Magpie-Align-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/Llama-3.1-8B-Magpie-Align-v0.1
dd34258a5f2bf7630b5a8e5662b050c60a088927
16.439101
llama3.1
2
8
true
true
true
false
true
0.708041
0.445784
44.578385
0.46224
24.040537
0
0
0.263423
1.789709
0.314062
3.091146
0.326213
25.134826
false
2024-07-24
2024-09-17
2
meta-llama/Meta-Llama-3.1-8B
Magpie-Align_MagpieLM-8B-Chat-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/MagpieLM-8B-Chat-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/MagpieLM-8B-Chat-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__MagpieLM-8B-Chat-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/MagpieLM-8B-Chat-v0.1
0b30eabc82a01fb42f44ba62c2dc81e1bd09cc04
14.006707
llama3.1
20
8
true
true
true
false
true
0.736877
0.370071
37.007141
0.417234
18.255805
0
0
0.261745
1.565996
0.350063
2.824479
0.319481
24.38682
false
2024-09-15
2024-09-19
2
meta-llama/Meta-Llama-3.1-8B
Magpie-Align_MagpieLM-8B-SFT-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Magpie-Align/MagpieLM-8B-SFT-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/MagpieLM-8B-SFT-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__MagpieLM-8B-SFT-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Magpie-Align/MagpieLM-8B-SFT-v0.1
b91f605a511707cb3b7f0893a8ed80c77b32d5a8
16.915349
llama3.1
3
8
true
true
true
false
true
0.800421
0.472062
47.206191
0.455285
23.612313
0.023414
2.34139
0.267617
2.348993
0.364885
3.877344
0.298953
22.105866
false
2024-09-15
2024-09-19
1
meta-llama/Meta-Llama-3.1-8B
ManoloPueblo_ContentCuisine_1-7B-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ManoloPueblo/ContentCuisine_1-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ManoloPueblo/ContentCuisine_1-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ManoloPueblo__ContentCuisine_1-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ManoloPueblo/ContentCuisine_1-7B-slerp
e811e880075a2945623040ee43e9a6972675ff2e
21.04021
1
7
false
true
true
false
false
0.499506
0.390704
39.070444
0.518844
32.789744
0.072508
7.250755
0.302852
7.04698
0.467198
17.266406
0.305352
22.816933
false
2024-11-12
2024-11-12
1
ManoloPueblo/ContentCuisine_1-7B-slerp (Merge)
ManoloPueblo_LLM_MERGE_CC2_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ManoloPueblo/LLM_MERGE_CC2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ManoloPueblo/LLM_MERGE_CC2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ManoloPueblo__LLM_MERGE_CC2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ManoloPueblo/LLM_MERGE_CC2
a39dcd4e8175c0e2ab9bda2c7a4f377b97549644
20.73478
apache-2.0
1
7
true
false
true
false
false
0.573543
0.385309
38.530876
0.520937
33.241074
0.063444
6.344411
0.30453
7.270694
0.459292
16.444792
0.303191
22.576832
false
2024-11-02
2024-11-12
0
ManoloPueblo/LLM_MERGE_CC2
ManoloPueblo_LLM_MERGE_CC3_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ManoloPueblo/LLM_MERGE_CC3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ManoloPueblo/LLM_MERGE_CC3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ManoloPueblo__LLM_MERGE_CC3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ManoloPueblo/LLM_MERGE_CC3
79d2bd3866e363b9e700f59cfc573b2bc9de2442
21.716405
apache-2.0
1
7
true
false
true
false
false
0.53693
0.395875
39.587517
0.524629
33.230018
0.081571
8.1571
0.309564
7.941834
0.467167
17.429167
0.315575
23.952793
false
2024-11-10
2024-11-12
0
ManoloPueblo/LLM_MERGE_CC3
MarinaraSpaghetti_NemoReRemix-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/MarinaraSpaghetti/NemoReRemix-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MarinaraSpaghetti/NemoReRemix-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MarinaraSpaghetti__NemoReRemix-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MarinaraSpaghetti/NemoReRemix-12B
9ebc7c2d4577b663fb050d86ed91fb676eb2e1f2
21.682114
25
12
false
true
true
false
false
1.577008
0.334251
33.42509
0.553651
36.124702
0.069486
6.94864
0.317953
9.060403
0.450146
15.668229
0.359791
28.865618
false
2024-08-14
2024-09-17
1
MarinaraSpaghetti/NemoReRemix-12B (Merge)
MarinaraSpaghetti_Nemomix-v4.0-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/MarinaraSpaghetti/Nemomix-v4.0-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MarinaraSpaghetti/Nemomix-v4.0-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MarinaraSpaghetti__Nemomix-v4.0-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MarinaraSpaghetti/Nemomix-v4.0-12B
69fbd8449ce3e916fc257e982a78189308123074
24.37986
21
12
false
true
true
false
true
1.354548
0.557466
55.746641
0.527499
32.879943
0.102719
10.271903
0.291946
5.592841
0.424448
12.75599
0.361287
29.031841
false
2024-07-30
2024-08-02
1
MarinaraSpaghetti/Nemomix-v4.0-12B (Merge)
Marsouuu_MiniMathExpert-2_61B-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/MiniMathExpert-2_61B-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/MiniMathExpert-2_61B-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__MiniMathExpert-2_61B-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/MiniMathExpert-2_61B-ECE-PRYMMAL-Martial
df21939a22e7233ebb7d62dfaf1c854facc5c772
12.532384
apache-2.0
1
2
true
false
true
false
false
1.45279
0.254842
25.48416
0.395273
15.297499
0.076284
7.628399
0.275168
3.355705
0.408323
9.273698
0.227394
14.154846
false
2024-10-06
2024-10-06
1
Marsouuu/MiniMathExpert-2_61B-ECE-PRYMMAL-Martial (Merge)
Marsouuu_MiniQwenMathExpert-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/MiniQwenMathExpert-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/MiniQwenMathExpert-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__MiniQwenMathExpert-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/MiniQwenMathExpert-ECE-PRYMMAL-Martial
0787682e65f7763ef978c4cf2e32803be8b49298
14.880579
0
1
false
true
true
false
false
0.689158
0.279496
27.949618
0.423013
19.019949
0.101964
10.196375
0.281879
4.250559
0.38674
6.509115
0.292221
21.357861
false
2024-10-07
2024-10-07
1
Marsouuu/MiniQwenMathExpert-ECE-PRYMMAL-Martial (Merge)
Marsouuu_MistralBase-4x7B-MoE-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/MistralBase-4x7B-MoE-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/MistralBase-4x7B-MoE-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__MistralBase-4x7B-MoE-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/MistralBase-4x7B-MoE-ECE-PRYMMAL-Martial
9cb9e74d2a65abd6458dffac103ad99c3b8f5154
6.572938
apache-2.0
1
24
true
false
false
false
false
1.906509
0.169736
16.97363
0.346437
8.870227
0.003021
0.302115
0.259228
1.230425
0.399083
7.852083
0.137882
4.209146
false
2024-10-03
2024-10-03
1
Marsouuu/MistralBase-4x7B-MoE-ECE-PRYMMAL-Martial (Merge)
Marsouuu_general3B-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/general3B-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/general3B-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__general3B-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/general3B-ECE-PRYMMAL-Martial
42992194a835a6fcad1edf1f94527ac08a7a60fb
22.07256
apache-2.0
0
3
true
false
true
false
false
0.724443
0.272227
27.222658
0.539435
35.700873
0.100453
10.045317
0.319631
9.284116
0.470052
18.223177
0.387633
31.95922
false
2024-10-23
2024-10-23
1
Marsouuu/general3B-ECE-PRYMMAL-Martial (Merge)
Marsouuu_general3Bv2-ECE-PRYMMAL-Martial_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/general3Bv2-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/general3Bv2-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__general3Bv2-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/general3Bv2-ECE-PRYMMAL-Martial
c6c5b3b0ecf9d04fc3a35bc4135df7cc08be3eb9
31.036691
apache-2.0
0
7
true
false
true
false
false
0.717132
0.569282
56.928173
0.563657
37.667763
0.314199
31.41994
0.310403
8.053691
0.439604
13.283854
0.449801
38.866726
false
2024-11-06
2024-11-06
1
Marsouuu/general3Bv2-ECE-PRYMMAL-Martial (Merge)
Marsouuu_lareneg1_78B-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/lareneg1_78B-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/lareneg1_78B-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__lareneg1_78B-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/lareneg1_78B-ECE-PRYMMAL-Martial
907a62bb805596e2105c9dca28c0e9ed1e9fd402
14.880579
apache-2.0
0
1
true
false
true
false
false
0.632892
0.279496
27.949618
0.423013
19.019949
0.101964
10.196375
0.281879
4.250559
0.38674
6.509115
0.292221
21.357861
false
2024-10-23
2024-10-23
1
Marsouuu/lareneg1_78B-ECE-PRYMMAL-Martial (Merge)
Marsouuu_lareneg3B-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/lareneg3B-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/lareneg3B-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__lareneg3B-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/lareneg3B-ECE-PRYMMAL-Martial
2c8be0ac28ae27dd441298e83f19e17409d89f4e
23.816174
apache-2.0
0
3
true
false
true
false
false
0.490995
0.330329
33.032908
0.545333
36.350722
0.14426
14.425982
0.324664
9.955257
0.472469
18.391927
0.376662
30.740248
false
2024-11-06
2024-11-06
1
Marsouuu/lareneg3B-ECE-PRYMMAL-Martial (Merge)
Marsouuu_lareneg3Bv2-ECE-PRYMMAL-Martial_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/lareneg3Bv2-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/lareneg3Bv2-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__lareneg3Bv2-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/lareneg3Bv2-ECE-PRYMMAL-Martial
ff92a6f314c392085af6c85f60a7da745e064653
31.269262
apache-2.0
1
7
true
false
true
false
false
0.665744
0.575327
57.53268
0.562336
37.47164
0.314955
31.495468
0.319631
9.284116
0.436938
12.817188
0.45113
39.01448
false
2024-11-06
2024-11-06
1
Marsouuu/lareneg3Bv2-ECE-PRYMMAL-Martial (Merge)
MaziyarPanahi_Calme-4x7B-MoE-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Calme-4x7B-MoE-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Calme-4x7B-MoE-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Calme-4x7B-MoE-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Calme-4x7B-MoE-v0.1
e2fab90eef37977002947684043f139a1660f519
20.023903
apache-2.0
2
24
true
true
false
false
false
1.360983
0.431521
43.152059
0.510282
31.261878
0.08006
8.006042
0.281879
4.250559
0.419885
10.61901
0.305685
22.853871
false
2024-03-17
2024-08-05
0
MaziyarPanahi/Calme-4x7B-MoE-v0.1
MaziyarPanahi_Calme-4x7B-MoE-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Calme-4x7B-MoE-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Calme-4x7B-MoE-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Calme-4x7B-MoE-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Calme-4x7B-MoE-v0.2
ffef41baf94b3f88b30cf0aeb3fd72d9e4187161
20.163773
apache-2.0
2
24
true
true
false
false
false
1.415711
0.429447
42.94472
0.511077
31.39682
0.073263
7.326284
0.279362
3.914989
0.43176
12.536719
0.305768
22.863106
false
2024-03-17
2024-08-05
0
MaziyarPanahi/Calme-4x7B-MoE-v0.2
MaziyarPanahi_Llama-3-70B-Instruct-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-70B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-70B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-70B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Llama-3-70B-Instruct-v0.1
6db1cb4256525fc5429734ddc0eb941d08d0be30
26.056975
llama3
1
70
true
true
true
false
true
11.263986
0.471438
47.143801
0.536626
32.712917
0.163897
16.389728
0.284396
4.58613
0.443302
15.31276
0.461769
40.196513
false
2024-05-14
2024-06-26
2
meta-llama/Meta-Llama-3-70B
MaziyarPanahi_Llama-3-8B-Instruct-v0.10_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.10" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-8B-Instruct-v0.10</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-8B-Instruct-v0.10-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Llama-3-8B-Instruct-v0.10
4411eb9f6f5e4c462a6bdbc64c26dcc123100b66
26.759639
other
6
8
true
true
true
false
true
1.142131
0.766743
76.674335
0.492431
27.924674
0.055136
5.513595
0.308725
7.829978
0.421437
10.813021
0.38622
31.802231
false
2024-06-04
2024-06-26
4
meta-llama/Meta-Llama-3-8B-Instruct
MaziyarPanahi_Llama-3-8B-Instruct-v0.8_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-8B-Instruct-v0.8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-8B-Instruct-v0.8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Llama-3-8B-Instruct-v0.8
94d222b8447b600b9836da4036df9490b59fe966
26.901472
other
8
8
true
true
true
false
true
2.536338
0.752755
75.275491
0.496278
28.270419
0.07855
7.854985
0.305369
7.38255
0.420198
10.92474
0.385306
31.70065
false
2024-05-01
2024-07-11
2
meta-llama/Meta-Llama-3-8B-Instruct
MaziyarPanahi_Llama-3-8B-Instruct-v0.9_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-8B-Instruct-v0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-8B-Instruct-v0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Llama-3-8B-Instruct-v0.9
ddf91fdc0a3ab5e5d76864f1c4cf44e5adacd565
26.824409
other
6
8
true
true
true
false
true
0.766357
0.763046
76.304649
0.493613
27.903013
0.075529
7.55287
0.307886
7.718121
0.414802
9.85026
0.384558
31.617538
false
2024-05-30
2024-08-06
3
meta-llama/Meta-Llama-3-8B-Instruct
MaziyarPanahi_Qwen1.5-MoE-A2.7B-Wikihow_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2MoeForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Qwen1.5-MoE-A2.7B-Wikihow" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Qwen1.5-MoE-A2.7B-Wikihow</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Qwen1.5-MoE-A2.7B-Wikihow-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Qwen1.5-MoE-A2.7B-Wikihow
191cf0630b7b50fe1fc9be198e1f203935df1428
11.507207
apache-2.0
2
14
true
true
false
false
true
8.306083
0.295433
29.543279
0.392007
15.473439
0.033233
3.323263
0.275168
3.355705
0.350219
2.010677
0.238032
15.336879
false
2024-03-30
2024-09-12
1
Qwen/Qwen1.5-MoE-A2.7B
MaziyarPanahi_Qwen2-7B-Instruct-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Qwen2-7B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Qwen2-7B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Qwen2-7B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Qwen2-7B-Instruct-v0.1
5123ecd76cefd4ef3b6009542b13e060d03e5232
22.981509
apache-2.0
1
7
true
true
true
false
false
1.453599
0.335225
33.522498
0.512306
31.923607
0.221299
22.129909
0.285235
4.697987
0.443479
13.868229
0.385721
31.746823
false
2024-06-27
2024-07-07
1
Qwen/Qwen2-7B
MaziyarPanahi_Qwen2-7B-Instruct-v0.8_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Qwen2-7B-Instruct-v0.8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Qwen2-7B-Instruct-v0.8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Qwen2-7B-Instruct-v0.8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Qwen2-7B-Instruct-v0.8
a6f9d0e11efcba18c905554ab43b877ead187a77
19.520373
apache-2.0
6
7
true
true
true
false
false
1.334106
0.277473
27.747266
0.463711
25.532525
0.174471
17.44713
0.293624
5.816555
0.429313
12.064063
0.356632
28.514702
false
2024-06-27
2024-07-07
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.1-llama3.1-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-llama3.1-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-llama3.1-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-llama3.1-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-llama3.1-70b
f39ad1c90b0f30379e80756d29c6533cf84c362a
34.339859
4
70
false
true
true
false
true
15.45484
0.84343
84.342988
0.644755
48.553646
0.01435
1.435045
0.32802
10.402685
0.438031
13.720573
0.528258
47.58422
false
2024-07-23
2024-07-24
2
meta-llama/Meta-Llama-3.1-70B
MaziyarPanahi_calme-2.1-phi3-4b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-phi3-4b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-phi3-4b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-phi3-4b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-phi3-4b
6764c79badacba5fa3584d2d2593d762caa1d17d
24.588084
mit
1
3
true
true
true
false
true
0.752469
0.552521
55.252065
0.559532
38.12428
0.047583
4.758308
0.329698
10.626398
0.401531
8.258073
0.374584
30.509382
false
2024-05-09
2024-06-26
1
microsoft/Phi-3-mini-4k-instruct
MaziyarPanahi_calme-2.1-phi3.5-4b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-phi3.5-4b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-phi3.5-4b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-phi3.5-4b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-phi3.5-4b
583b7f382a8ed35f6f7c09f2950f0f2346945a83
27.207327
mit
3
3
true
true
true
false
true
1.004551
0.56591
56.590956
0.54837
36.110097
0.156344
15.634441
0.34396
12.527964
0.399458
9.765625
0.393534
32.614879
false
2024-08-23
2024-08-23
1
microsoft/Phi-3.5-mini-instruct
MaziyarPanahi_calme-2.1-qwen2-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-qwen2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-qwen2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-qwen2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-qwen2-72b
0369c39770f45f2464587918f2dbdb8449ea3a0d
43.945772
other
28
72
true
true
true
false
true
13.134871
0.816277
81.627748
0.696556
57.325882
0.380665
38.066465
0.380872
17.449664
0.473219
20.152344
0.541473
49.052527
false
2024-06-08
2024-06-26
2
Qwen/Qwen2-72B
MaziyarPanahi_calme-2.1-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-qwen2-7b
5aac57e2290f7c49af88a9cb9883ce25b58882a1
23.504116
apache-2.0
1
7
true
true
true
false
true
1.434255
0.381612
38.16119
0.504593
31.007097
0.228852
22.885196
0.28943
5.257271
0.443698
13.795573
0.369265
29.918366
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.1-qwen2.5-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-qwen2.5-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-qwen2.5-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-qwen2.5-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-qwen2.5-72b
eb6c92dec932070ea872f39469ca5b9daf2d34e6
38.390458
other
1
72
true
true
true
false
true
14.748894
0.866236
86.623603
0.726162
61.655703
0.023414
2.34139
0.363255
15.100671
0.429844
13.297135
0.561918
51.324246
false
2024-09-19
2024-09-26
1
Qwen/Qwen2.5-72B
MaziyarPanahi_calme-2.1-rys-78b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-rys-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-rys-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-rys-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-rys-78b
e746f5ddc0c9b31a2382d985a4ec87fa910847c7
44.555882
mit
3
77
true
true
true
false
true
14.332288
0.813555
81.35547
0.709786
59.470031
0.388973
38.897281
0.394295
19.239374
0.469313
18.997396
0.544382
49.375739
false
2024-08-06
2024-08-08
1
dnhkng/RYS-XLarge
MaziyarPanahi_calme-2.2-llama3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-llama3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-llama3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-llama3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-llama3-70b
95366b974baedee4d95c1e841bc3d15e94753804
38.291121
llama3
17
70
true
true
true
false
true
10.628273
0.820849
82.084868
0.643543
48.571706
0.248489
24.848943
0.341443
12.192394
0.444573
15.304948
0.520695
46.743868
false
2024-04-27
2024-06-26
2
meta-llama/Meta-Llama-3-70B
MaziyarPanahi_calme-2.2-llama3.1-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-llama3.1-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-llama3.1-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-llama3.1-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-llama3.1-70b
c81ac05ed2c2344e9fd366cfff197da406ef5234
36.450483
2
70
false
true
true
false
true
15.841824
0.859267
85.926675
0.679292
54.206462
0.024924
2.492447
0.324664
9.955257
0.454156
17.069531
0.541473
49.052527
false
2024-09-09
2024-09-09
2
meta-llama/Meta-Llama-3.1-70B
MaziyarPanahi_calme-2.2-phi3-4b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-phi3-4b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-phi3-4b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-phi3-4b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-phi3-4b
c0a366a4c01d7e724ceba7e2f2c19251983423fe
23.23113
mit
2
3
true
true
true
false
true
0.796791
0.506908
50.690834
0.55296
37.733734
0.024924
2.492447
0.321309
9.50783
0.397563
7.695313
0.3814
31.266622
false
2024-05-10
2024-06-26
1
microsoft/Phi-3-mini-4k-instruct
MaziyarPanahi_calme-2.2-qwen2-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-qwen2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-qwen2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-qwen2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-qwen2-72b
529e9bd80a76d943409bc92bb246aa7ca63dd9e6
43.775393
other
5
72
true
true
true
false
true
13.517415
0.800815
80.081517
0.69396
56.795942
0.43429
43.429003
0.374161
16.55481
0.450802
16.516927
0.543467
49.274158
false
2024-07-09
2024-08-06
1
Qwen/Qwen2-72B
MaziyarPanahi_calme-2.2-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-qwen2-7b
bbb1d119f75c5b2eaa8978286808bd59cae04997
23.532967
apache-2.0
1
7
true
true
true
false
true
1.54875
0.35973
35.972996
0.521491
33.109366
0.21148
21.148036
0.291107
5.480984
0.435823
13.277865
0.389877
32.208555
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.2-qwen2.5-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-qwen2.5-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-qwen2.5-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-qwen2.5-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-qwen2.5-72b
c6c7fdf70d8bf81364108975eb8ba78eecac83d4
38.022663
other
6
72
true
true
true
false
true
14.258064
0.847676
84.767639
0.72764
61.803604
0.037009
3.700906
0.35906
14.541387
0.420667
12.016667
0.561752
51.305777
false
2024-09-19
2024-09-26
1
Qwen/Qwen2.5-72B
MaziyarPanahi_calme-2.2-rys-78b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-rys-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-rys-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-rys-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-rys-78b
8d0dde25c9042705f65559446944a19259c3fc8e
44.260453
mit
3
77
true
true
true
false
true
13.523356
0.798642
79.864205
0.708101
59.268646
0.399547
39.954683
0.406879
20.917226
0.453563
16.828646
0.538564
48.729314
false
2024-08-06
2024-08-08
1
dnhkng/RYS-XLarge
MaziyarPanahi_calme-2.3-llama3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-llama3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-llama3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-llama3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-llama3-70b
bd17453eaae0e36d1e1e17da13fdd155fce91a29
37.155149
llama3
3
70
true
true
true
false
true
9.636809
0.80104
80.104013
0.639917
48.008585
0.237915
23.791541
0.338087
11.744966
0.426125
12.565625
0.520445
46.716164
false
2024-04-27
2024-08-30
2
meta-llama/Meta-Llama-3-70B
MaziyarPanahi_calme-2.3-llama3.1-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-llama3.1-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-llama3.1-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-llama3.1-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-llama3.1-70b
a39c79250721b75beefa1b1763895eafd010f6f6
40.644274
3
70
false
true
true
false
true
14.060558
0.860466
86.046579
0.687165
55.585495
0.234894
23.489426
0.34396
12.527964
0.456823
17.736198
0.53632
48.479979
false
2024-09-10
2024-09-18
2
meta-llama/Meta-Llama-3.1-70B
MaziyarPanahi_calme-2.3-phi3-4b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-phi3-4b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-phi3-4b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-phi3-4b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-phi3-4b
e1f70c3724c728aadd1c7c1bb279487494f7059e
23.105983
mit
9
3
true
true
true
false
true
0.837931
0.492645
49.264507
0.553787
37.658892
0.034743
3.47432
0.317953
9.060403
0.398833
7.754167
0.382813
31.423611
false
2024-05-10
2024-06-26
1
microsoft/Phi-3-mini-4k-instruct
MaziyarPanahi_calme-2.3-qwen2-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-qwen2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-qwen2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-qwen2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-qwen2-72b
12ff2e800f968e867a580c072905cf4671da066f
30.407679
other
2
72
true
true
true
false
true
19.448485
0.384984
38.498406
0.657631
51.228304
0.161631
16.163142
0.371644
16.219239
0.41124
11.238281
0.541888
49.0987
false
2024-08-06
2024-09-15
1
Qwen/Qwen2-72B
MaziyarPanahi_calme-2.3-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-qwen2-7b
ca39e60052a600a709e03fefceabd9620e0b66d7
23.043937
apache-2.0
1
7
true
true
true
false
true
1.8302
0.382486
38.248625
0.506405
30.956082
0.204683
20.468278
0.29698
6.263982
0.44224
13.313281
0.36112
29.013372
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.3-rys-78b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-rys-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-rys-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-rys-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-rys-78b
a8a4e55c2f7054d25c2f0ab3a3b3d806eb915180
44.418905
mit
4
77
true
true
true
false
true
13.29861
0.806585
80.658542
0.710776
59.574547
0.389728
38.97281
0.404362
20.581655
0.454927
16.999219
0.54754
49.726655
false
2024-08-06
2024-09-03
1
dnhkng/RYS-XLarge
MaziyarPanahi_calme-2.4-llama3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.4-llama3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.4-llama3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.4-llama3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.4-llama3-70b
cb03e4d810b82d86e7cb01ab146bade09a5d06d1
32.498813
llama3
14
70
true
true
true
false
true
17.743758
0.502737
50.273718
0.641819
48.397766
0.245468
24.546828
0.339765
11.96868
0.428792
13.098958
0.520362
46.70693
false
2024-04-28
2024-06-26
2
meta-llama/Meta-Llama-3-70B
MaziyarPanahi_calme-2.4-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.4-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.4-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.4-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.4-qwen2-7b
d683c3ef1feb13e92227f5fd92fe5bc4b55ea4a2
22.801088
apache-2.0
1
7
true
true
true
false
true
1.61849
0.329955
32.995452
0.510142
31.818266
0.200151
20.015106
0.283557
4.474273
0.445281
14.426823
0.397689
33.076611
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.4-rys-78b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.4-rys-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.4-rys-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.4-rys-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.4-rys-78b
0a35e51ffa9efa644c11816a2d56434804177acb
50.714695
mit
42
77
true
true
true
false
true
12.976328
0.80109
80.109
0.727951
62.156549
0.404079
40.407855
0.402685
20.357942
0.577062
34.566146
0.700216
66.690677
false
2024-08-07
2024-09-03
2
dnhkng/RYS-XLarge
MaziyarPanahi_calme-2.5-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.5-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.5-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.5-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.5-qwen2-7b
20fb1afc22c0722cb2c57185fff59befeba0fbec
22.672127
apache-2.0
1
7
true
true
true
false
true
1.399169
0.314492
31.449221
0.488656
28.280995
0.226586
22.65861
0.310403
8.053691
0.456469
15.791927
0.368185
29.798316
false
2024-06-27
2024-09-29
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.6-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.6-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.6-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.6-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.6-qwen2-7b
ebfaae016a50f8922098a2a262ec3ca704504cae
21.333036
apache-2.0
1
7
true
true
true
false
true
1.640271
0.344268
34.426765
0.493024
29.308419
0.127644
12.76435
0.284396
4.58613
0.458615
16.560156
0.373172
30.352394
false
2024-06-27
2024-09-29
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.7-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.7-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.7-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.7-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.7-qwen2-7b
edc11a1baccedc04a5a4576ee4910fd8922ad47f
22.342679
apache-2.0
2
7
true
true
true
false
true
1.36428
0.35923
35.923018
0.488317
28.912245
0.137462
13.746224
0.291107
5.480984
0.482427
19.936719
0.370512
30.056885
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
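Note on reading these records: each entry's headline average appears to be the plain arithmetic mean of its six normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO), with the raw fractions listed alongside before normalization. The sketch below is only an illustrative check of that relationship, using the values from the MaziyarPanahi/calme-2.7-qwen2-7b record immediately above; the variable names are ours, not field names from the export.

```python
# Minimal sketch: check that the headline average for one record
# (MaziyarPanahi/calme-2.7-qwen2-7b, directly above) equals the mean
# of its six normalized benchmark scores. Labels are illustrative.
scores = {
    "IFEval": 35.923018,
    "BBH": 28.912245,
    "MATH Lvl 5": 13.746224,
    "GPQA": 5.480984,
    "MUSR": 19.936719,
    "MMLU-PRO": 30.056885,
}

average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 22.342679, matching the record's average field
```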