Schema (column name, dtype, and observed range or number of classes):

| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | stringlengths | 12–111 |
| Precision | stringclasses | 3 values |
| Type | stringclasses | 7 values |
| T | stringclasses | 7 values |
| Weight type | stringclasses | 2 values |
| Architecture | stringclasses | 64 values |
| Model | stringlengths | 355–689 |
| fullname | stringlengths | 4–102 |
| Model sha | stringlengths | 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | stringclasses | 27 values |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | stringclasses | 525 values |
| Submission Date | stringclasses | 263 values |
| Generation | int64 | 0–10 |
| Base Model | stringlengths | 4–102 |
Records, one row per evaluation. Benchmark cells show the normalized score with the raw score in parentheses; T (the Type emoji) and fullname (the repo id linked in Model) are folded into the Type and Model cells; cells that were empty in the source are left blank, and "…" marks the final record, which is cut off in this chunk.

| eval_name | Precision | Type | Weight type | Architecture | Model | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | On hub | MoE | Flagged | Chat Template | CO₂ (kg) | IFEval | BBH | MATH Lvl 5 | GPQA | MUSR | MMLU-PRO | Merged | Official | Uploaded | Submitted | Gen | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| SicariusSicariiStuff_Phi-Line_14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SicariusSicariiStuff/Phi-Line_14B](https://huggingface.co/SicariusSicariiStuff/Phi-Line_14B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__Phi-Line_14B-details) | 4eaf6b4e21774b8c6da9f998f0e2e71b3ab16296 | 37.562081 | mit | 11 | 14.66 | true | false | false | true | 0.936814 | 64.956538 (0.649565) | 43.794069 (0.615443) | 38.595166 (0.385952) | 13.758389 (0.353188) | 14.781771 (0.447854) | 49.486554 (0.545379) | false | false | 2025-02-17 | 2025-02-18 | 1 | SicariusSicariiStuff/Phi-Line_14B (Merge) |
| SicariusSicariiStuff_Phi-lthy4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SicariusSicariiStuff/Phi-lthy4](https://huggingface.co/SicariusSicariiStuff/Phi-lthy4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__Phi-lthy4-details) | 888f1003ec7de0d2880d3a83b1e23c125ac47fb1 | 30.269041 | mit | 28 | 11.933 | true | false | false | true | 0.739112 | 76.794239 (0.767942) | 40.152882 (0.587936) | 13.670695 (0.136707) | 4.9217 (0.286913) | 9.036458 (0.408292) | 37.038268 (0.433344) | false | false | 2025-02-12 | 2025-02-12 | 1 | SicariusSicariiStuff/Phi-lthy4 (Merge) |
| SicariusSicariiStuff_Qwen2.5-14B_Uncencored_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [SicariusSicariiStuff/Qwen2.5-14B_Uncencored](https://huggingface.co/SicariusSicariiStuff/Qwen2.5-14B_Uncencored) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__Qwen2.5-14B_Uncencored-details) | 1daf648ac2f837c66bf6bb00459e034987d9486f | 31.724939 | | 0 | 14 | false | false | false | false | 5.478391 | 31.579099 (0.315791) | 46.720235 (0.630894) | 31.797583 (0.317976) | 17.561521 (0.381711) | 15.291667 (0.451667) | 47.399527 (0.526596) | false | false | 2024-09-20 | | 0 | Removed |
| SicariusSicariiStuff_Qwen2.5-14B_Uncensored_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [SicariusSicariiStuff/Qwen2.5-14B_Uncensored](https://huggingface.co/SicariusSicariiStuff/Qwen2.5-14B_Uncensored) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__Qwen2.5-14B_Uncensored-details) | 0710a2341d269dcd56f9136fed442373d4dadc5d | 31.750334 | | 0 | 14 | false | false | false | false | 4.841773 | 31.731472 (0.317315) | 46.720235 (0.630894) | 31.797583 (0.317976) | 17.561521 (0.381711) | 15.291667 (0.451667) | 47.399527 (0.526596) | false | false | 2024-09-21 | | 0 | Removed |
| SicariusSicariiStuff_Qwen2.5-14B_Uncensored_Instruct_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | Qwen2ForCausalLM | [SicariusSicariiStuff/Qwen2.5-14B_Uncensored_Instruct](https://huggingface.co/SicariusSicariiStuff/Qwen2.5-14B_Uncensored_Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__Qwen2.5-14B_Uncensored_Instruct-details) | | 28.958792 | | 0 | 14.77 | false | false | false | true | 7.810326 | 37.893899 (0.378939) | 42.113097 (0.593679) | 32.854985 (0.32855) | 10.626398 (0.329698) | 4.407031 (0.369656) | 45.857343 (0.512716) | false | false | 2024-09-21 | | 0 | Removed |
| SicariusSicariiStuff_Redemption_Wind_24B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | [SicariusSicariiStuff/Redemption_Wind_24B](https://huggingface.co/SicariusSicariiStuff/Redemption_Wind_24B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__Redemption_Wind_24B-details) | e7b0e4989b34e5a7b1a3068c95ae83b951ac658e | 28.370595 | apache-2.0 | 21 | 23.572 | true | false | false | false | 2.965807 | 25.014517 (0.250145) | 48.417358 (0.642816) | 18.58006 (0.185801) | 17.785235 (0.383389) | 11.179948 (0.42624) | 49.246454 (0.543218) | false | false | 2025-02-06 | 2025-02-07 | 0 | SicariusSicariiStuff/Redemption_Wind_24B |
| SicariusSicariiStuff_Winged_Imp_8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | [SicariusSicariiStuff/Winged_Imp_8B](https://huggingface.co/SicariusSicariiStuff/Winged_Imp_8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__Winged_Imp_8B-details) | 64411873c8b98fdbe62058a240fdcf1a550a00d0 | 26.911878 | | 0 | 8.03 | false | false | false | true | 1.299679 | 74.301298 (0.743013) | 30.592875 (0.512038) | 12.009063 (0.120091) | 4.362416 (0.282718) | 10.8875 (0.414833) | 29.318115 (0.363863) | false | false | 2025-01-24 | | 0 | Removed |
| SicariusSicariiStuff_Wingless_Imp_8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SicariusSicariiStuff/Wingless_Imp_8B](https://huggingface.co/SicariusSicariiStuff/Wingless_Imp_8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__Wingless_Imp_8B-details) | 5da96e0a37d80faaca421606a4e1c6b7e5cafd78 | 26.911878 | llama3.1 | 9 | 8.03 | true | false | false | true | 1.295926 | 74.301298 (0.743013) | 30.592875 (0.512038) | 12.009063 (0.120091) | 4.362416 (0.282718) | 10.8875 (0.414833) | 29.318115 (0.363863) | true | false | 2025-01-24 | 2025-01-24 | 1 | SicariusSicariiStuff/Wingless_Imp_8B (Merge) |
| SicariusSicariiStuff_Zion_Alpha_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | [SicariusSicariiStuff/Zion_Alpha](https://huggingface.co/SicariusSicariiStuff/Zion_Alpha) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__Zion_Alpha-details) | e52e1b6e98dce3a54d82f87f83920c0a3f189457 | 19.186491 | apache-2.0 | 3 | 7.242 | true | false | false | false | 1.180955 | 33.240247 (0.332402) | 29.160501 (0.493211) | 5.21148 (0.052115) | 5.369128 (0.290268) | 18.452604 (0.472688) | 23.684988 (0.313165) | false | false | 2024-05-19 | 2024-10-18 | 0 | SicariusSicariiStuff/Zion_Alpha |
| SicariusSicariiStuff_dn_ep02_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SicariusSicariiStuff/dn_ep02](https://huggingface.co/SicariusSicariiStuff/dn_ep02) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SicariusSicariiStuff__dn_ep02-details) | ab9d5937cff45d0da251d6094cbf5a3cef4d42d8 | 25.28444 | | 0 | 8.03 | false | false | false | false | 1.38746 | 50.643404 (0.506434) | 32.643774 (0.526601) | 14.199396 (0.141994) | 8.724832 (0.315436) | 12.18776 (0.431635) | 33.307476 (0.399767) | false | false | 2024-11-19 | | 0 | Removed |
| SkyOrbis_SKY-Ko-Llama3.1-8B-lora_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SkyOrbis/SKY-Ko-Llama3.1-8B-lora](https://huggingface.co/SkyOrbis/SKY-Ko-Llama3.1-8B-lora) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Llama3.1-8B-lora-details) | 69db8a7ac983f08e280c2b4ed55d159abeea8719 | 24.022312 | llama3.1 | 0 | 8.03 | true | false | false | false | 1.346285 | 50.583452 (0.505835) | 29.191619 (0.508839) | 15.483384 (0.154834) | 9.50783 (0.321309) | 8.507292 (0.399792) | 30.860298 (0.377743) | false | false | 2024-12-31 | 2024-12-31 | 1 | SkyOrbis/SKY-Ko-Llama3.1-8B-lora (Merge) |
| SkyOrbis_SKY-Ko-Llama3.1-8B-lora-epoch1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SkyOrbis/SKY-Ko-Llama3.1-8B-lora-epoch1](https://huggingface.co/SkyOrbis/SKY-Ko-Llama3.1-8B-lora-epoch1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Llama3.1-8B-lora-epoch1-details) | d275bea3b261da56fb8332afae6e670797caf6cb | 24.022312 | llama3.1 | 0 | 8.03 | true | false | false | false | 1.364203 | 50.583452 (0.505835) | 29.191619 (0.508839) | 15.483384 (0.154834) | 9.50783 (0.321309) | 8.507292 (0.399792) | 30.860298 (0.377743) | false | false | 2025-01-02 | 2025-01-02 | 1 | SkyOrbis/SKY-Ko-Llama3.1-8B-lora-epoch1 (Merge) |
| SkyOrbis_SKY-Ko-Llama3.2-1B-lora-epoch3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SkyOrbis/SKY-Ko-Llama3.2-1B-lora-epoch3](https://huggingface.co/SkyOrbis/SKY-Ko-Llama3.2-1B-lora-epoch3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Llama3.2-1B-lora-epoch3-details) | 9d92423a35bafedfbfe5782bd69df1f2e3e8620e | 7.679539 | llama3 | 0 | 1.236 | true | false | false | true | 0.741439 | 32.470844 (0.324708) | 5.493124 (0.316659) | 2.719033 (0.02719) | 0.223714 (0.251678) | 2.069531 (0.338156) | 3.10099 (0.127909) | false | false | 2024-12-23 | 2024-12-23 | 1 | SkyOrbis/SKY-Ko-Llama3.2-1B-lora-epoch3 (Merge) |
| SkyOrbis_SKY-Ko-Llama3.2-1B-lora-epoch5_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SkyOrbis/SKY-Ko-Llama3.2-1B-lora-epoch5](https://huggingface.co/SkyOrbis/SKY-Ko-Llama3.2-1B-lora-epoch5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Llama3.2-1B-lora-epoch5-details) | 7f6435abc4e61ee287b0d31b7d3e5654a2d8ec30 | 12.134493 | llama3 | 0 | 1.236 | true | false | false | false | 0.726106 | 43.599206 (0.435992) | 8.132119 (0.340602) | 5.21148 (0.052115) | 1.230425 (0.259228) | 4.126563 (0.347146) | 10.507166 (0.194564) | false | false | 2024-12-23 | 2024-12-27 | 1 | SkyOrbis/SKY-Ko-Llama3.2-1B-lora-epoch5 (Merge) |
| SkyOrbis_SKY-Ko-Llama3.2-1B-lora-v2-epoch3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SkyOrbis/SKY-Ko-Llama3.2-1B-lora-v2-epoch3](https://huggingface.co/SkyOrbis/SKY-Ko-Llama3.2-1B-lora-v2-epoch3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Llama3.2-1B-lora-v2-epoch3-details) | 3ceb1f21e9423b3069e75d51ab7e3ac3c5896c42 | 12.134493 | llama3.2 | 0 | 1.236 | true | false | false | false | 0.695744 | 43.599206 (0.435992) | 8.132119 (0.340602) | 5.21148 (0.052115) | 1.230425 (0.259228) | 4.126563 (0.347146) | 10.507166 (0.194564) | false | false | 2024-12-27 | 2024-12-27 | 1 | SkyOrbis/SKY-Ko-Llama3.2-1B-lora-v2-epoch3 (Merge) |
| SkyOrbis_SKY-Ko-Llama3.2-1B-lora-v2-epoch5_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SkyOrbis/SKY-Ko-Llama3.2-1B-lora-v2-epoch5](https://huggingface.co/SkyOrbis/SKY-Ko-Llama3.2-1B-lora-v2-epoch5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Llama3.2-1B-lora-v2-epoch5-details) | 285b5f8c99d2bc8233288d95ab645f74e6dd95fd | 11.804479 | llama3.2 | 0 | 1.236 | true | false | false | false | 0.708343 | 42.467652 (0.424677) | 8.268549 (0.339684) | 5.060423 (0.050604) | 0.559284 (0.254195) | 3.963802 (0.345844) | 10.507166 (0.194564) | false | false | 2024-12-27 | 2024-12-28 | 1 | SkyOrbis/SKY-Ko-Llama3.2-1B-lora-v2-epoch5 (Merge) |
| SkyOrbis_SKY-Ko-Llama3.2-3B-lora-epoch1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SkyOrbis/SKY-Ko-Llama3.2-3B-lora-epoch1](https://huggingface.co/SkyOrbis/SKY-Ko-Llama3.2-3B-lora-epoch1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Llama3.2-3B-lora-epoch1-details) | bee7f5b3a4bc739c24cee6a0f936470df2d58a56 | 20.420217 | llama3.2 | 0 | 3.213 | true | false | false | false | 1.841248 | 53.311214 (0.533112) | 20.806137 (0.439963) | 14.577039 (0.14577) | 5.592841 (0.291946) | 5.961979 (0.352229) | 22.272089 (0.300449) | false | false | 2024-12-27 | 2024-12-27 | 1 | SkyOrbis/SKY-Ko-Llama3.2-3B-lora-epoch1 (Merge) |
| SkyOrbis_SKY-Ko-Llama3.2-3B-lora-epoch2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SkyOrbis/SKY-Ko-Llama3.2-3B-lora-epoch2](https://huggingface.co/SkyOrbis/SKY-Ko-Llama3.2-3B-lora-epoch2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Llama3.2-3B-lora-epoch2-details) | ff0f4aa3aee4535aaec8c4989014e1126d3dd36a | 20.420217 | llama3.2 | 0 | 3.213 | true | false | false | false | 1.144561 | 53.311214 (0.533112) | 20.806137 (0.439963) | 14.577039 (0.14577) | 5.592841 (0.291946) | 5.961979 (0.352229) | 22.272089 (0.300449) | false | false | 2024-12-29 | 2024-12-30 | 1 | SkyOrbis/SKY-Ko-Llama3.2-3B-lora-epoch2 (Merge) |
| SkyOrbis_SKY-Ko-Llama3.2-3B-lora-epoch3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [SkyOrbis/SKY-Ko-Llama3.2-3B-lora-epoch3](https://huggingface.co/SkyOrbis/SKY-Ko-Llama3.2-3B-lora-epoch3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Llama3.2-3B-lora-epoch3-details) | 879c73ee9539aca6cabff3a3fc5a8b37108dbd15 | 20.420217 | llama3.2 | 0 | 3.213 | true | false | false | false | 1.133253 | 53.311214 (0.533112) | 20.806137 (0.439963) | 14.577039 (0.14577) | 5.592841 (0.291946) | 5.961979 (0.352229) | 22.272089 (0.300449) | false | false | 2024-12-30 | 2024-12-31 | 1 | SkyOrbis/SKY-Ko-Llama3.2-3B-lora-epoch3 (Merge) |
| SkyOrbis_SKY-Ko-Qwen2.5-3B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [SkyOrbis/SKY-Ko-Qwen2.5-3B-Instruct](https://huggingface.co/SkyOrbis/SKY-Ko-Qwen2.5-3B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Qwen2.5-3B-Instruct-details) | 3241e4efcc62259e56caa03f8b42c301edc9320a | 15.791201 | | 0 | 3.086 | false | false | false | false | 1.459793 | 35.341006 (0.35341) | 19.150679 (0.426482) | 6.94864 (0.069486) | 3.914989 (0.279362) | 9.26224 (0.402365) | 20.129654 (0.281167) | false | false | 2025-01-09 | 2025-01-09 | 1 | SkyOrbis/SKY-Ko-Qwen2.5-3B-Instruct (Merge) |
| SkyOrbis_SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000](https://huggingface.co/SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000-details) | 4b2f6c40cc0b83c77d40805f23f300d90055641a | 24.131333 | | 0 | 7.616 | false | false | false | false | 1.286765 | 38.188673 (0.381887) | 31.327612 (0.507796) | 18.655589 (0.186556) | 10.290828 (0.327181) | 13.950521 (0.443604) | 32.374778 (0.391373) | false | false | 2025-01-31 | 2025-02-01 | 1 | SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-15000 (Merge) |
| SkyOrbis_SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-5000_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-5000](https://huggingface.co/SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-5000) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SkyOrbis__SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-5000-details) | af6741845310182a40e5f8e2882af5f23e3a9ffd | 24.667118 | | 0 | 7.616 | false | false | false | false | 1.235658 | 38.123734 (0.381237) | 34.951435 (0.538986) | 20.996979 (0.20997) | 7.04698 (0.302852) | 10.907292 (0.423792) | 35.976285 (0.423787) | false | false | 2025-01-31 | 2025-02-04 | 1 | SkyOrbis/SKY-Ko-Qwen2.5-7B-Instruct-SFT-step-5000 (Merge) |
| Skywork_Skywork-Reward-Gemma-2-27B-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Gemma2ForSequenceClassification | [Skywork/Skywork-Reward-Gemma-2-27B-v0.2](https://huggingface.co/Skywork/Skywork-Reward-Gemma-2-27B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Skywork__Skywork-Reward-Gemma-2-27B-v0.2-details) | a92f2ec997c806de469ff287ef3b71982e886fc2 | 34.661448 | | 30 | 27.227 | false | false | false | true | 8.85794 | 78.073179 (0.780732) | 48.159904 (0.63596) | 22.734139 (0.227341) | 12.527964 (0.34396) | 11.993229 (0.423146) | 34.480275 (0.410322) | false | false | 2024-10-14 | 2024-12-27 | 2 | google/gemma-2-27b |
| Skywork_Skywork-o1-Open-Llama-3.1-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [Skywork/Skywork-o1-Open-Llama-3.1-8B](https://huggingface.co/Skywork/Skywork-o1-Open-Llama-3.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Skywork__Skywork-o1-Open-Llama-3.1-8B-details) | a41903315f39ebf1c08fdba0ef52758f7afe3682 | 20.752995 | other | 110 | 8.03 | true | false | false | true | 1.396378 | 35.183646 (0.351836) | 23.017599 (0.451591) | 52.114804 (0.521148) | 1.230425 (0.259228) | 1.522396 (0.315646) | 11.449099 (0.203042) | false | false | 2024-11-26 | 2025-01-01 | 2 | meta-llama/Meta-Llama-3.1-8B |
| Solshine_Brimful-merged-replete_float16 | float16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | [Solshine/Brimful-merged-replete](https://huggingface.co/Solshine/Brimful-merged-replete) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Solshine__Brimful-merged-replete-details) | 01ce8c3df6edb87d31f0e9a9651cbcbc4d4823e8 | 3.879827 | | 2 | 12.277 | false | false | false | true | 4.333447 | 17.60562 (0.176056) | 1.992139 (0.288344) | 0.302115 (0.003021) | 1.006711 (0.25755) | 1.432292 (0.342125) | 0.940086 (0.108461) | false | false | 2024-10-01 | 2024-10-01 | 1 | Solshine/Brimful-merged-replete (Merge) |
| Solshine_Llama-3-1-big-thoughtful-passthrough-merge-2_float16 | float16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | [Solshine/Llama-3-1-big-thoughtful-passthrough-merge-2](https://huggingface.co/Solshine/Llama-3-1-big-thoughtful-passthrough-merge-2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Solshine__Llama-3-1-big-thoughtful-passthrough-merge-2-details) | d48047d6577e22fdda73a1be8e18971912db66d2 | 6.928703 | | 2 | 18.5 | false | false | false | true | 6.762705 | 25.466651 (0.254667) | 5.008442 (0.320938) | 1.057402 (0.010574) | 1.230425 (0.259228) | 6.751823 (0.388948) | 2.057476 (0.118517) | false | false | 2024-09-19 | 2024-09-24 | 1 | Solshine/Llama-3-1-big-thoughtful-passthrough-merge-2 (Merge) |
| Sorawiz_Gemma-9B-Base_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Gemma2ForCausalLM | [Sorawiz/Gemma-9B-Base](https://huggingface.co/Sorawiz/Gemma-9B-Base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Sorawiz__Gemma-9B-Base-details) | f89db94ea783ddb1e365e7863cc015456dfc9f1d | 20.842484 | | 2 | 10.159 | false | false | false | true | 2.064133 | 16.673759 (0.166738) | 41.28135 (0.593041) | 9.818731 (0.098187) | 11.96868 (0.339765) | 9.363802 (0.40451) | 35.948582 (0.423537) | false | false | 2025-02-11 | 2025-02-12 | 1 | Sorawiz/Gemma-9B-Base (Merge) |
| Sorawiz_Gemma-Creative-9B-Base_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Gemma2ForCausalLM | [Sorawiz/Gemma-Creative-9B-Base](https://huggingface.co/Sorawiz/Gemma-Creative-9B-Base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Sorawiz__Gemma-Creative-9B-Base-details) | aeb1b97a3ddad1fc8f7ee16692c09e7da528fcb1 | 18.299606 | | 3 | 10.159 | false | false | false | true | 2.051282 | 15.150024 (0.1515) | 34.622422 (0.545861) | 7.779456 (0.077795) | 10.626398 (0.329698) | 8.201042 (0.401875) | 33.418292 (0.400765) | false | false | 2025-02-12 | 2025-02-12 | 1 | Sorawiz/Gemma-Creative-9B-Base (Merge) |
| Sourjayon_DeepSeek-R1-8b-Sify_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [Sourjayon/DeepSeek-R1-8b-Sify](https://huggingface.co/Sourjayon/DeepSeek-R1-8b-Sify) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Sourjayon__DeepSeek-R1-8b-Sify-details) | 5e9bb0d78129b9b5a8b91d0dacc55de23b8c21fe | 13.31321 | apache-2.0 | 0 | 8.03 | true | false | false | true | 0.865325 | 36.794816 (0.367948) | 6.926822 (0.337936) | 24.471299 (0.244713) | 0.33557 (0.252517) | 0.455729 (0.330313) | 10.895021 (0.198055) | false | false | 2025-02-05 | 2025-02-10 | 2 | deepseek-ai/DeepSeek-R1-Distill-Llama-8B |
| Sourjayon_DeepSeek-R1-ForumNXT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [Sourjayon/DeepSeek-R1-ForumNXT](https://huggingface.co/Sourjayon/DeepSeek-R1-ForumNXT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Sourjayon__DeepSeek-R1-ForumNXT-details) | 8c5fe80c0c72215522cd277878bfb97319ff845d | 11.959696 | apache-2.0 | 0 | 1.777 | true | false | false | false | 1.206775 | 26.028715 (0.260287) | 6.957542 (0.33102) | 25.755287 (0.257553) | 3.243848 (0.274329) | 2.571615 (0.33924) | 7.201167 (0.164811) | false | false | 2025-01-31 | 2025-02-03 | 2 | deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B |
| SpaceYL_ECE_Poirot_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | [SpaceYL/ECE_Poirot](https://huggingface.co/SpaceYL/ECE_Poirot) [📑](https://huggingface.co/datasets/open-llm-leaderboard/SpaceYL__ECE_Poirot-details) | 601fc736a6b7f0cff96219cbd9a903070db37adb | 15.74256 | apache-2.0 | 5 | 1.544 | true | false | false | false | 0.614353 | 31.069562 (0.310696) | 18.616426 (0.426223) | 9.138973 (0.09139) | 6.375839 (0.297819) | 8.330729 (0.402646) | 20.923833 (0.288314) | true | false | 2025-02-20 | 2025-02-20 | 1 | SpaceYL/ECE_Poirot (Merge) |
| Spestly_Athena-1-3B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [Spestly/Athena-1-3B](https://huggingface.co/Spestly/Athena-1-3B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Spestly__Athena-1-3B-details) | b2f9ab2db333f73c0adb8fa83837dbfb6cbd6204 | 25.482047 | other | 1 | 3.086 | true | false | false | true | 1.489361 | 55.691676 (0.556917) | 26.30887 (0.470155) | 23.791541 (0.237915) | 5.816555 (0.293624) | 13.295313 (0.436229) | 27.988327 (0.351895) | false | false | 2024-12-18 | 2025-01-28 | 2 | Qwen/Qwen2.5-3B |
| Spestly_Atlas-Pro-1.5B-Preview_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [Spestly/Atlas-Pro-1.5B-Preview](https://huggingface.co/Spestly/Atlas-Pro-1.5B-Preview) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Spestly__Atlas-Pro-1.5B-Preview-details) | 4fce245a33bec99c00548878787413c2dafec0b7 | 13.953857 | mit | 1 | 1.777 | true | false | false | false | 1.178942 | 24.295093 (0.242951) | 9.077408 (0.349894) | 31.94864 (0.319486) | 6.263982 (0.29698) | 1.861719 (0.335427) | 10.2763 (0.192487) | false | false | 2025-01-27 | 2025-01-27 | 2 | Spestly/Atlas-R1-1.5B-Preview (Merge) |
| Spestly_Atlas-Pro-7B-Preview_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | Qwen2ForCausalLM | [Spestly/Atlas-Pro-7B-Preview](https://huggingface.co/Spestly/Atlas-Pro-7B-Preview) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Spestly__Atlas-Pro-7B-Preview-details) | 3c693093b74675bebc507a0b92bb45e2bd0ee177 | 24.637553 | mit | 3 | 7.616 | true | false | false | false | 1.385866 | 31.541643 (0.315416) | 25.274195 (0.466792) | 50.830816 (0.508308) | 11.63311 (0.337248) | 6.652083 (0.391083) | 21.893469 (0.297041) | false | false | 2025-01-27 | 2025-01-27 | 2 | Spestly/Atlas-R1-7B-Preview (Merge) |
| Stark2008_GutenLaserPi_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | [Stark2008/GutenLaserPi](https://huggingface.co/Stark2008/GutenLaserPi) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Stark2008__GutenLaserPi-details) | d5ab84c6f8f0c88c16380242c7e11e8cefc934b7 | 21.400725 | | 0 | 7.242 | false | false | false | false | 1.13993 | 42.265301 (0.422653) | 32.97771 (0.521234) | 7.854985 (0.07855) | 4.9217 (0.286913) | 16.985937 (0.462021) | 23.398715 (0.310588) | false | false | 2024-07-11 | 2024-07-11 | 1 | Stark2008/GutenLaserPi (Merge) |
| Stark2008_LayleleFlamPi_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | [Stark2008/LayleleFlamPi](https://huggingface.co/Stark2008/LayleleFlamPi) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Stark2008__LayleleFlamPi-details) | b2897d17a65dea17383f52711475c8b41567c5d0 | 20.871096 | | 0 | 7.242 | false | false | false | false | 1.259316 | 42.842325 (0.428423) | 31.20741 (0.511565) | 6.646526 (0.066465) | 4.697987 (0.285235) | 16.572135 (0.460844) | 23.260195 (0.309342) | false | false | 2024-07-12 | 2024-07-12 | 1 | Stark2008/LayleleFlamPi (Merge) |
| Stark2008_VisFlamCat_float16 | float16 | 🤝 base merges and moerges | Original | MistralForCausalLM | [Stark2008/VisFlamCat](https://huggingface.co/Stark2008/VisFlamCat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Stark2008__VisFlamCat-details) | 290efa41ac83b8408cab084d093bcd9ae9abb0c9 | 21.340907 | | 0 | 7.242 | false | false | false | false | 1.220432 | 43.659158 (0.436592) | 32.881397 (0.521696) | 7.628399 (0.076284) | 5.369128 (0.290268) | 14.683854 (0.446271) | 23.823508 (0.314412) | false | false | 2024-07-12 | 2024-07-12 | 1 | Stark2008/VisFlamCat (Merge) |
| Steelskull_L3.3-MS-Nevoria-70b_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | [Steelskull/L3.3-MS-Nevoria-70b](https://huggingface.co/Steelskull/L3.3-MS-Nevoria-70b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Steelskull__L3.3-MS-Nevoria-70b-details) | … | … | … | … | … | … | … | … | … | … | … | … | … | … | … | … | … | … | … | … | … | … |
Steelskull/L3.3-MS-Nevoria-70b
6271121beeac444db45ef12ce7c52215604463c3
44.041819
other
73
70.554
true
false
false
false
39.22483
0.696327
69.632686
0.699754
56.602649
0.39577
39.577039
0.470638
29.418345
0.468229
18.628646
0.553524
50.391548
true
false
2025-01-14
2025-01-17
1
Steelskull/L3.3-MS-Nevoria-70b (Merge)
Steelskull_L3.3-Nevoria-R1-70b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Steelskull/L3.3-Nevoria-R1-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Steelskull/L3.3-Nevoria-R1-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Steelskull__L3.3-Nevoria-R1-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Steelskull/L3.3-Nevoria-R1-70b
cdcb10280e4c652127eb3d3af61125fc7f731fdd
43.613083
other
70
70.554
true
false
false
true
39.626962
0.602379
60.237946
0.697167
56.167288
0.462991
46.299094
0.46896
29.194631
0.477531
20.191406
0.546293
49.588135
true
false
2025-01-23
2025-02-05
1
Steelskull/L3.3-Nevoria-R1-70b (Merge)
StelleX_Qwen2.5_Math_7B_Cot_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/StelleX/Qwen2.5_Math_7B_Cot" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">StelleX/Qwen2.5_Math_7B_Cot</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/StelleX__Qwen2.5_Math_7B_Cot-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
StelleX/Qwen2.5_Math_7B_Cot
1549288a296c6e44cfcf4b9513769000bc768e36
17.801856
0
7.616
false
false
false
false
2.055984
0.214275
21.427479
0.431292
19.796911
0.326284
32.628399
0.294463
5.928412
0.392417
6.91875
0.281001
20.111185
false
false
2024-10-29
0
Removed
StelleX_Vorisatex-7B-preview_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/StelleX/Vorisatex-7B-preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">StelleX/Vorisatex-7B-preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/StelleX__Vorisatex-7B-preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
StelleX/Vorisatex-7B-preview
57612bb8af75e5e8d75b4df3dde993fdc48efbea
5.954613
0
7.613
false
false
false
false
2.498143
0.151501
15.150135
0.31117
4.133712
0.028701
2.870091
0.251678
0.223714
0.41924
11.504948
0.116606
1.84508
false
false
2024-10-29
0
Removed
SultanR_SmolTulu-1.7b-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/SultanR/SmolTulu-1.7b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SultanR/SmolTulu-1.7b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SultanR__SmolTulu-1.7b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SultanR/SmolTulu-1.7b-Instruct
11ed78c7c7a2e7f3c73c8f6f36c010f6dcba3245
16.33101
apache-2.0
13
1.711
true
false
false
true
0.614783
0.654087
65.408671
0.371309
12.25983
0.079305
7.930514
0.269295
2.572707
0.354031
1.920573
0.171044
7.893765
false
false
2024-12-01
2024-12-01
1
SultanR/SmolTulu-1.7b-Instruct (Merge)
SultanR_SmolTulu-1.7b-Reinforced_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/SultanR/SmolTulu-1.7b-Reinforced" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SultanR/SmolTulu-1.7b-Reinforced</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SultanR__SmolTulu-1.7b-Reinforced-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SultanR/SmolTulu-1.7b-Reinforced
530b6c0c63a3513fd012e218ad53d64b75d1b259
16.574834
apache-2.0
5
1.711
true
false
false
true
0.579232
0.679066
67.906599
0.355187
10.015215
0.071752
7.175227
0.276007
3.467562
0.340604
2.408854
0.17628
8.475547
false
false
2024-12-16
2024-12-16
1
SultanR/SmolTulu-1.7b-Reinforced (Merge)
SultanR_SmolTulu-1.7b-it-v0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/SultanR/SmolTulu-1.7b-it-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SultanR/SmolTulu-1.7b-it-v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SultanR__SmolTulu-1.7b-it-v0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SultanR/SmolTulu-1.7b-it-v0
75369e5c868ba261ea13f7bf85987ac1fe7ceb72
16.33101
apache-2.0
13
1.711
true
false
false
true
0.612903
0.654087
65.408671
0.371309
12.25983
0.079305
7.930514
0.269295
2.572707
0.354031
1.920573
0.171044
7.893765
false
false
2024-12-01
2024-12-01
1
SultanR/SmolTulu-1.7b-it-v0 (Merge)
Supichi_BBA-123_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/BBA-123" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/BBA-123</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__BBA-123-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/BBA-123
7551128748cbe65e49192e9551217e70bb00574d
4.797006
0
17.161
false
false
false
false
1.4063
0.207955
20.795489
0.292011
2.218337
0
0
0.260067
1.342282
0.349906
2.571615
0.116689
1.854314
false
false
2025-02-26
2025-02-26
1
Supichi/BBA-123 (Merge)
Supichi_BBA99_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/BBA99" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/BBA99</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__BBA99-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/BBA99
5b6e65ee5eb1c8bd0923108dc1929269f0d5b4bc
3.550456
0
17.161
false
false
false
false
1.409634
0.14066
14.066012
0.276896
1.305051
0
0
0.263423
1.789709
0.321844
2.897135
0.111203
1.244829
false
false
2025-02-26
2025-02-26
1
Supichi/BBA99 (Merge)
Supichi_BBAIK29_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/BBAIK29" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/BBAIK29</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__BBAIK29-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/BBAIK29
6a8f2407d1f64205d5104126fb20d57377510cc8
30.240917
0
7.616
false
false
false
false
0.641298
0.458848
45.884808
0.558964
36.963549
0.367825
36.782477
0.312081
8.277405
0.450083
14.99375
0.446892
38.543514
false
false
2025-02-27
2025-02-27
1
Supichi/BBAIK29 (Merge)
Supichi_BBAI_135_Gemma_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/BBAI_135_Gemma" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/BBAI_135_Gemma</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__BBAI_135_Gemma-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/BBAI_135_Gemma
487cc6e1636bc7eda7c9ba19cd066890144397cf
5.349663
0
19.3
false
false
false
false
3.216886
0.065621
6.562144
0.356841
10.857976
0
0
0.267617
2.348993
0.380479
4.859896
0.167221
7.468972
false
false
2025-02-26
2025-02-26
1
Supichi/BBAI_135_Gemma (Merge)
Supichi_BBAI_250_Xia0_gZ_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/BBAI_250_Xia0_gZ" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/BBAI_250_Xia0_gZ</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__BBAI_250_Xia0_gZ-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/BBAI_250_Xia0_gZ
ae7d06f7e08b50df13d09541b4bbe08425d857ca
30.643171
0
7.616
false
false
false
false
0.609836
0.46854
46.854014
0.556768
36.654123
0.364048
36.404834
0.321309
9.50783
0.457927
15.940885
0.446476
38.49734
false
false
2025-02-27
2025-02-27
1
Supichi/BBAI_250_Xia0_gZ (Merge)
Supichi_BBAI_275_Tsunami_gZ_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/BBAI_275_Tsunami_gZ" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/BBAI_275_Tsunami_gZ</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__BBAI_275_Tsunami_gZ-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/BBAI_275_Tsunami_gZ
1373772684b0ec67425d782107d4b13ea1bcc2c1
30.835432
0
7.616
false
false
false
false
0.619508
0.536959
53.69586
0.553126
36.254177
0.32855
32.854985
0.321309
9.50783
0.444781
13.897656
0.449219
38.802083
false
false
2025-02-27
2025-02-27
1
Supichi/BBAI_275_Tsunami_gZ (Merge)
Supichi_BBAI_525_Tsu_gZ_Xia0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/BBAI_525_Tsu_gZ_Xia0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/BBAI_525_Tsu_gZ_Xia0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__BBAI_525_Tsu_gZ_Xia0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/BBAI_525_Tsu_gZ_Xia0
5338ec1d5ab109af8985ade45ddcd447fca07ce5
30.935388
0
7.616
false
false
false
false
0.650021
0.533861
53.386127
0.556193
36.525252
0.3429
34.29003
0.312081
8.277405
0.447448
14.497656
0.447723
38.63586
false
false
2025-02-27
2025-02-27
1
Supichi/BBAI_525_Tsu_gZ_Xia0 (Merge)
Supichi_BBAI_78B_Calme_3_1_Ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/BBAI_78B_Calme_3_1_Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/BBAI_78B_Calme_3_1_Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__BBAI_78B_Calme_3_1_Ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/BBAI_78B_Calme_3_1_Ties
d07c51d1296700d65a39301e568b03058e4ee2ca
3.942051
0
27.06
false
false
false
false
2.364663
0.182801
18.280052
0.282813
1.530418
0
0
0.229027
0
0.309969
2.246094
0.114362
1.595745
false
false
2025-02-28
2025-02-28
1
Supichi/BBAI_78B_Calme_3_1_Ties (Merge)
Supichi_BBAI_QWEEN_V000000_LUMEN_14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/BBAI_QWEEN_V000000_LUMEN_14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/BBAI_QWEEN_V000000_LUMEN_14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__BBAI_QWEEN_V000000_LUMEN_14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/BBAI_QWEEN_V000000_LUMEN_14B
f8ada5e5d4986818bfd70e99ef7ee7f1a9e8bff5
4.277816
0
10.366
false
false
false
false
0.918078
0.181452
18.145188
0.229726
3.006897
0
0
0.231544
0
0.344542
2.734375
0.116024
1.780437
false
false
2025-02-26
2025-02-26
1
Supichi/BBAI_QWEEN_V000000_LUMEN_14B (Merge)
Supichi_HF_TOKEN_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/HF_TOKEN" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/HF_TOKEN</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__HF_TOKEN-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/HF_TOKEN
a2d007a56354d9bd21cabda7615b65cb3955e7da
3.487653
0
17.161
false
false
false
false
1.416889
0.137987
13.798721
0.276392
1.147698
0.000755
0.075529
0.263423
1.789709
0.327177
2.897135
0.110954
1.217125
false
false
2025-02-26
2025-02-26
1
Supichi/HF_TOKEN (Merge)
Supichi_NJS26_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Supichi/NJS26" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Supichi/NJS26</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Supichi__NJS26-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Supichi/NJS26
cbc12277c5f471fcf15fe4078dd2715d5baa972f
11.996376
1
7.242
false
false
false
false
0.416959
0.044813
4.481332
0.478015
26.847432
0.032477
3.247734
0.317953
9.060403
0.385406
5.709115
0.30369
22.63224
false
false
2025-02-26
2025-02-26
1
Supichi/NJS26 (Merge)
Svak_MN-12B-Inferor-v0.0_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Svak/MN-12B-Inferor-v0.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Svak/MN-12B-Inferor-v0.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Svak__MN-12B-Inferor-v0.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Svak/MN-12B-Inferor-v0.0
ab9efd0cc19b862ea1ab37a60dacac78aa022ad1
25.410934
10
12.248
false
false
false
true
2.49426
0.570756
57.07556
0.519501
30.846427
0.101964
10.196375
0.308725
7.829978
0.463885
18.085677
0.355884
28.43159
false
false
2024-11-07
2024-11-08
1
Svak/MN-12B-Inferor-v0.0 (Merge)
Svak_MN-12B-Inferor-v0.1_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Svak/MN-12B-Inferor-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Svak/MN-12B-Inferor-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Svak__MN-12B-Inferor-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Svak/MN-12B-Inferor-v0.1
2d8cfac16dac3151d5e8e5ecd62866ca83c5149a
26.937535
5
12.248
false
false
false
true
1.990765
0.634653
63.465272
0.514676
30.850765
0.126133
12.613293
0.325503
10.067114
0.435083
15.052083
0.36619
29.576684
false
false
2024-11-08
2024-11-08
1
Svak/MN-12B-Inferor-v0.1 (Merge)
Syed-Hasan-8503_Phi-3-mini-4K-instruct-cpo-simpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/Syed-Hasan-8503/Phi-3-mini-4K-instruct-cpo-simpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Syed-Hasan-8503/Phi-3-mini-4K-instruct-cpo-simpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Syed-Hasan-8503__Phi-3-mini-4K-instruct-cpo-simpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Syed-Hasan-8503/Phi-3-mini-4K-instruct-cpo-simpo
2896ef357be81fd433c17801d76ce148e60a7032
27.216374
apache-2.0
2
3.821
true
false
false
true
2.328477
0.571405
57.140498
0.568153
39.148158
0.1571
15.70997
0.330537
10.738255
0.396354
8.777604
0.386054
31.783762
false
false
2024-06-24
2024-06-26
0
Syed-Hasan-8503/Phi-3-mini-4K-instruct-cpo-simpo
T145_KRONOS-8B-V1-P1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V1-P1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V1-P1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V1-P1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V1-P1
f39b904870a1b8b922e650214819ab10a3028d0f
28.90766
1
8.03
false
false
false
true
1.494116
0.784978
78.49783
0.508545
29.97328
0.197885
19.78852
0.295302
6.040268
0.388104
8.479688
0.375997
30.666371
false
false
2024-12-05
2025-02-05
1
T145/KRONOS-8B-V1-P1 (Merge)
T145_KRONOS-8B-V1-P2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V1-P2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V1-P2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V1-P2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V1-P2
7d3caeb1c7d1a8cf55e124a6df4ade74f8d89a0e
24.505319
0
8.03
false
false
false
true
1.328403
0.672421
67.24214
0.477176
25.864336
0.160121
16.012085
0.291946
5.592841
0.35676
5.061719
0.345329
27.258791
false
false
2024-12-05
2025-02-06
1
T145/KRONOS-8B-V1-P2 (Merge)
T145_KRONOS-8B-V1-P3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V1-P3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V1-P3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V1-P3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V1-P3
5ab222b73cd2291a1ef2499aa60d3ca786d119d5
25.821838
1
8.03
false
false
false
true
1.492216
0.713737
71.373733
0.512788
30.270032
0.192598
19.259819
0.260067
1.342282
0.361563
5.961979
0.340509
26.723183
false
false
2024-12-06
2025-02-05
1
T145/KRONOS-8B-V1-P3 (Merge)
T145_KRONOS-8B-V2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V2
8a004e1e51aa24574ba961613fe9698df30bd9a0
25.049794
llama3.1
1
8.03
true
false
false
true
1.369991
0.518024
51.80244
0.513269
30.674907
0.226586
22.65861
0.298658
6.487696
0.382865
8.258073
0.373753
30.417036
true
false
2024-12-08
2024-12-13
1
T145/KRONOS-8B-V2 (Merge)
T145_KRONOS-8B-V3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V3
75b0ff936de5caa98a6b9680bafeeb92d4b9abaa
25.736803
1
8.03
false
false
false
true
1.401565
0.547475
54.747514
0.511866
30.291099
0.259819
25.981873
0.288591
5.145414
0.392229
7.828646
0.373836
30.426271
false
false
2024-12-18
2024-12-18
1
T145/KRONOS-8B-V3 (Merge)
T145_KRONOS-8B-V4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V4
f9faad008b866745fd60755e558f7a06d3a59da4
28.750288
1
8.03
false
false
false
true
1.390519
0.78895
78.894999
0.509247
30.140619
0.194864
19.486405
0.28943
5.257271
0.382958
7.769792
0.378574
30.952645
false
false
2024-12-19
2024-12-19
1
T145/KRONOS-8B-V4 (Merge)
T145_KRONOS-8B-V5_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V5
67a69b38a382a7cebe2d8d7b52aeafab6ff89a29
26.264835
1
8.03
false
false
false
true
1.380818
0.540506
54.050586
0.508865
30.173682
0.268882
26.888218
0.290268
5.369128
0.405469
10.45026
0.375914
30.657137
false
false
2024-12-19
2024-12-20
1
T145/KRONOS-8B-V5 (Merge)
T145_KRONOS-8B-V6_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V6
7afd2483e81c58ad3865a9cac6f2e66afe1d1f78
27.89804
1
8.03
false
false
false
true
1.439657
0.702247
70.224671
0.503361
29.659286
0.259819
25.981873
0.279362
3.914989
0.412104
9.813021
0.35015
27.7944
false
false
2024-12-20
2024-12-20
1
T145/KRONOS-8B-V6 (Merge)
T145_KRONOS-8B-V7_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V7
422458ab11c4a8bb502fd8681551f9b54d7e6162
15.899831
0
4.015
false
false
false
true
1.454147
0.35291
35.291028
0.452622
23.890173
0.111027
11.102719
0.266779
2.237136
0.367115
4.022656
0.269697
18.855275
false
false
2024-12-25
0
Removed
T145_KRONOS-8B-V8_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V8
1085c73a0b9bea22cc0dd85cf2745c62387949d9
28.793304
1
8.03
false
false
false
true
1.455437
0.777035
77.703493
0.509441
30.053094
0.204683
20.468278
0.28943
5.257271
0.386896
8.361979
0.378241
30.915706
false
false
2025-02-01
2025-02-01
1
T145/KRONOS-8B-V8 (Merge)
T145_KRONOS-8B-V9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/KRONOS-8B-V9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/KRONOS-8B-V9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__KRONOS-8B-V9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/KRONOS-8B-V9
46861313565195fbc0edca6396c3c214b308baa1
28.922153
0
8.03
false
false
false
true
1.437986
0.785578
78.557782
0.509921
30.068011
0.19864
19.864048
0.296141
6.152125
0.386802
8.316927
0.375166
30.574025
false
false
2025-02-02
2025-02-02
1
T145/KRONOS-8B-V9 (Merge)
T145_Llama-3.1-8B-Instruct-Zeus_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/Llama-3.1-8B-Instruct-Zeus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/Llama-3.1-8B-Instruct-Zeus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__Llama-3.1-8B-Instruct-Zeus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/Llama-3.1-8B-Instruct-Zeus
672f9f1d4256f999b4513061c5406f60bfda2949
29.649994
apache-2.0
0
8.03
true
false
false
true
1.389369
0.794121
79.412071
0.517398
31.388991
0.195619
19.561934
0.301174
6.823266
0.397625
8.569792
0.389295
32.143913
true
false
2024-11-28
2024-11-28
1
T145/Llama-3.1-8B-Instruct-Zeus (Merge)
T145_Llama-3.1-8B-Zeus_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/Llama-3.1-8B-Zeus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/Llama-3.1-8B-Zeus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__Llama-3.1-8B-Zeus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/Llama-3.1-8B-Zeus
712ff76aa966b0a9c5c65a074b2eb2b2cb56de86
9.07644
0
8.03
false
false
false
true
1.463435
0.351761
35.17611
0.367118
10.560808
0.01435
1.435045
0.265101
2.013423
0.331583
1.58125
0.133228
3.692007
false
false
2024-11-28
0
Removed
T145_Meta-Llama-3.1-8B-Instruct-TIES_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/Meta-Llama-3.1-8B-Instruct-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/Meta-Llama-3.1-8B-Instruct-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__Meta-Llama-3.1-8B-Instruct-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/Meta-Llama-3.1-8B-Instruct-TIES
62a8c4f6e02a2e18f79688877fa925dcac8096aa
24.976591
1
8.03
false
false
false
true
1.369424
0.542354
54.235429
0.507011
29.774263
0.20997
20.996979
0.294463
5.928412
0.384292
8.036458
0.377992
30.888002
false
false
2024-12-21
2024-12-22
1
T145/Meta-Llama-3.1-8B-Instruct-TIES (Merge)
T145_ZEUS-8B-V10_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V10" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V10</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V10-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V10
94b750b9de63df6391bb42d304a3dabea259b178
30.369691
2
8.03
false
false
false
true
1.366933
0.770665
77.066517
0.526976
32.695048
0.21148
21.148036
0.324664
9.955257
0.389781
9.089323
0.390376
32.263963
false
false
2024-12-24
2024-12-24
1
T145/ZEUS-8B-V10 (Merge)
T145_ZEUS-8B-V11_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V11" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V11</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V11-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V11
407c0bd5c2de36aee4a7c3ec769f82705616fcf2
29.941073
1
8.03
false
false
false
true
2.171996
0.809958
80.995758
0.516198
31.207913
0.196375
19.637462
0.314597
8.612975
0.380667
7.15
0.388381
32.042332
false
false
2024-12-27
2024-12-27
1
T145/ZEUS-8B-V11 (Merge)
T145_ZEUS-8B-V12_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V12" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V12</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V12-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V12
32ee6e7da83b1fac23e2d931279c3c4adb6d9718
30.33368
0
8.03
false
false
false
true
2.741878
0.781556
78.155627
0.525391
32.449002
0.21148
21.148036
0.32047
9.395973
0.385844
8.497135
0.391207
32.356309
false
false
2024-12-29
2024-12-29
1
T145/ZEUS-8B-V12 (Merge)
T145_ZEUS-8B-V13_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V13" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V13</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V13-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V13
48a73e233cf8736313b616ca0e87b26841318f4e
30.621362
llama3.1
2
8.03
true
false
false
true
1.35177
0.790424
79.042385
0.527713
32.727458
0.213746
21.374622
0.323826
9.8434
0.384479
8.393229
0.391124
32.347074
true
false
2024-12-29
2024-12-30
1
T145/ZEUS-8B-V13 (Merge)
T145_ZEUS-8B-V13-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V13-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V13-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V13-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V13-abliterated
1edf9d72638d57bbe697717344391355cb610781
29.488668
llama3.1
1
8.03
true
false
false
true
1.367199
0.787751
78.775095
0.51976
31.784785
0.179003
17.900302
0.311242
8.165548
0.387146
8.393229
0.387217
31.913047
true
false
2024-12-31
2025-01-01
1
T145/ZEUS-8B-V13-abliterated (Merge)
T145_ZEUS-8B-V14_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V14" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V14</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V14-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V14
d5a831c4923e9effe2f64426f6066c66eec4569c
30.191102
1
8.03
false
false
false
true
1.348179
0.77094
77.093999
0.527459
32.693446
0.212991
21.299094
0.32047
9.395973
0.384448
8.289323
0.391373
32.374778
false
false
2024-12-30
2024-12-30
1
T145/ZEUS-8B-V14 (Merge)
T145_ZEUS-8B-V15_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V15" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V15</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V15-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V15
3d83f7ec7ddc41d81c4bb7859420f377427d5367
29.370031
0
4.015
false
false
false
true
1.389291
0.701273
70.127262
0.553755
36.181603
0.230363
23.036254
0.276007
3.467562
0.402
9.416667
0.405918
33.990839
false
false
2024-12-31
0
Removed
T145_ZEUS-8B-V16_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V16
497a1e669fd64ebb576149bfc55aa826383daaff
30.579931
1
8.03
false
false
false
true
1.331311
0.792547
79.254711
0.526582
32.532183
0.220544
22.054381
0.307047
7.606264
0.395083
9.51875
0.39262
32.513298
false
false
2024-12-31
2024-12-31
1
T145/ZEUS-8B-V16 (Merge)
T145_ZEUS-8B-V17_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V17" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V17</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V17-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V17
c8420f0ef12a0c42f5e1cb62adbb2bf403d4c77f
31.006564
2
8.03
false
false
false
true
1.344623
0.794071
79.407084
0.525087
32.338483
0.22432
22.432024
0.322148
9.619687
0.401625
9.636458
0.393451
32.605644
false
false
2025-01-01
2025-01-01
1
T145/ZEUS-8B-V17 (Merge)
T145_ZEUS-8B-V17-abliterated_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V17-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V17-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V17-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V17-abliterated
b1e9142e0efb74d5ecc9ab82aff858ff6715678a
26.847962
2
7.594
false
false
false
true
1.340049
0.757601
75.760094
0.520041
31.522204
0.043807
4.380665
0.303691
7.158837
0.426927
13.132552
0.362201
29.133422
false
false
2025-01-01
2025-01-01
1
T145/ZEUS-8B-V17-abliterated (Merge)
T145_ZEUS-8B-V17-abliterated-V2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V17-abliterated-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V17-abliterated-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V17-abliterated-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V17-abliterated-V2
9cc40b877185b7796b3cfc49558ce28a1cb0d207
22.607357
1
8.03
false
false
false
true
1.669435
0.653212
65.321237
0.492801
27.568612
0.111782
11.178248
0.27349
3.131991
0.340729
1.757812
0.340176
26.686244
false
false
2025-01-04
2025-01-04
1
T145/ZEUS-8B-V17-abliterated-V2 (Merge)
T145_ZEUS-8B-V17-abliterated-V4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V17-abliterated-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V17-abliterated-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V17-abliterated-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V17-abliterated-V4
04b4069dcd85e42eb2649fe39f00325e7febb415
26.587167
llama3.1
4
8.03
true
false
false
true
1.316405
0.72283
72.282987
0.516922
30.971615
0.093656
9.365559
0.283557
4.474273
0.418708
11.605208
0.37741
30.82336
true
false
2025-01-04
2025-01-05
1
T145/ZEUS-8B-V17-abliterated-V4 (Merge)
T145_ZEUS-8B-V18_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V18" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V18</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V18-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V18
bf3c9a2836a00cdccdc85b1587b46d1146877850
30.932928
1
8.03
false
false
false
true
1.327292
0.783405
78.34047
0.52698
32.52959
0.218278
21.827795
0.321309
9.50783
0.404292
10.703125
0.394199
32.688756
false
false
2025-01-01
2025-01-02
1
T145/ZEUS-8B-V18 (Merge)
T145_ZEUS-8B-V19_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V19" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V19</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V19-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V19
8000d59047526f61b9588180df0a862928c2ccea
31.073717
1
8.03
false
false
false
true
1.386448
0.788251
78.825073
0.527623
32.643628
0.220544
22.054381
0.322148
9.619687
0.404292
10.703125
0.393368
32.59641
false
false
2025-01-03
2025-01-03
1
T145/ZEUS-8B-V19 (Merge)
T145_ZEUS-8B-V2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V2
8f874a61fe651717afaf484e3a556a0c11b7f292
30.143481
2
8.03
false
false
false
true
1.391435
0.802938
80.293843
0.519441
31.605593
0.216012
21.601208
0.302013
6.935123
0.391021
8.244271
0.389628
32.180851
false
false
2024-12-01
2024-12-01
1
T145/ZEUS-8B-V2 (Merge)
T145_ZEUS-8B-V2-ORPO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V2-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V2-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V2-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V2-ORPO
fee1b04ccafb9f6bbb4db88effd837ad72e00571
27.882958
0
4.015
false
false
false
true
1.414975
0.718683
71.868309
0.507525
29.59149
0.182779
18.277946
0.310403
8.053691
0.3935
9.754167
0.367769
29.752142
false
false
2024-12-31
0
Removed
T145_ZEUS-8B-V2-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V2-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V2-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V2-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V2-abliterated
d07c040573a4a468d774e5f47811be3e4c05e622
29.796706
llama3.1
2
8.03
true
false
false
true
2.05545
0.78955
78.954951
0.512887
30.982564
0.21148
21.148036
0.312919
8.389262
0.391083
7.91875
0.38248
31.386673
true
false
2024-12-30
2024-12-31
1
T145/ZEUS-8B-V2-abliterated (Merge)
T145_ZEUS-8B-V20_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V20" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V20</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V20-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V20
0daec2344934c6f945fe8df88de345f66c89fe84
31.039974
1
8.03
false
false
false
true
1.336775
0.795595
79.559458
0.524401
32.221587
0.219033
21.903323
0.322987
9.731544
0.404323
10.273698
0.392952
32.550236
false
false
2025-01-07
2025-01-07
1
T145/ZEUS-8B-V20 (Merge)
T145_ZEUS-8B-V21_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V21" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V21</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V21-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V21
8b3646b8e380835dc6955ae210743360b3f9c298
12.085755
0
8.03
false
false
false
true
1.440348
0.378515
37.851456
0.339758
7.358048
0.159366
15.936556
0.264262
1.901566
0.326156
1.536198
0.171376
7.930703
false
false
2025-01-08
0
Removed
T145_ZEUS-8B-V22_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V22" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V22</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V22-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V22
e0c00dfff7eb8b0fe0c3b63980d9558f55dd569c
31.143604
llama3.1
2
8.03
true
false
false
true
1.412507
0.799516
79.951639
0.524492
32.213956
0.22281
22.280967
0.32802
10.402685
0.398958
9.369792
0.393783
32.642583
true
false
2025-01-09
2025-01-09
1
T145/ZEUS-8B-V22 (Merge)
T145_ZEUS-8B-V23_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V23" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V23</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V23-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V23
62b55d14842dfcbe33a0847e2b8fc18ffabf05bf
28.439674
1
8.03
false
false
false
true
1.332491
0.762122
76.212228
0.5195
31.4673
0.182024
18.202417
0.309564
7.941834
0.392198
7.191406
0.366606
29.622858
false
false
2025-01-10
2025-01-10
1
T145/ZEUS-8B-V23 (Merge)
T145_ZEUS-8B-V24_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V24
0b8f6c8232f6018f1a9849618773b16dd4405650
22.065645
1
8.03
false
false
false
true
1.524097
0.599981
59.998138
0.477796
26.153954
0.14577
14.577039
0.261745
1.565996
0.372917
4.714583
0.328457
25.384161
false
false
2025-01-22
2025-01-23
1
T145/ZEUS-8B-V24 (Merge)
T145_ZEUS-8B-V25_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V25" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V25</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V25-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V25
20e25207ddcba81f481e6178b5ede453da0b93db
16.814748
0
8.03
false
false
false
true
1.617968
0.332028
33.202791
0.454691
21.846213
0.203927
20.392749
0.264262
1.901566
0.348823
2.602865
0.288481
20.942302
false
false
2025-01-23
2025-01-23
1
T145/ZEUS-8B-V25 (Merge)
T145_ZEUS-8B-V26_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V26" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V26</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V26-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V26
621279b3792ebb97f5ea94136481d1a84c7babc1
26.628444
1
8.03
false
false
false
true
1.434082
0.670798
67.079793
0.523155
32.251005
0.124622
12.462236
0.295302
6.040268
0.401625
9.636458
0.390708
32.300901
false
false
2025-01-25
2025-01-25
1
T145/ZEUS-8B-V26 (Merge)
T145_ZEUS-8B-V27_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V27" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V27</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V27-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V27
819e1d5470b2d5e86e49ee0be692fee5016386ca
26.817909
1
8.03
false
false
false
true
1.455595
0.654362
65.436154
0.523031
32.219304
0.134441
13.444109
0.307886
7.718121
0.397687
9.844271
0.390209
32.245493
false
false
2025-01-26
2025-01-26
1
T145/ZEUS-8B-V27 (Merge)
T145_ZEUS-8B-V28_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V28" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V28</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V28-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V28
c70e0e93166320fe9e70c4b568239d6ec4c69d03
26.179429
0
8.03
false
false
false
true
1.454431
0.635252
63.525224
0.525426
32.621741
0.126888
12.688822
0.303691
7.158837
0.389625
8.836458
0.390209
32.245493
false
false
2025-01-27
2025-01-28
1
T145/ZEUS-8B-V28 (Merge)
T145_ZEUS-8B-V29_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V29" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V29</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V29-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V29
be5ab42b6f8339d012595850b91402da5a45ba48
29.1164
1
8.03
false
false
false
true
1.408967
0.741764
74.176407
0.525333
32.349727
0.160121
16.012085
0.326342
10.178971
0.40026
9.532552
0.392038
32.448655
false
false
2025-02-02
2025-02-02
1
T145/ZEUS-8B-V29 (Merge)
T145_ZEUS-8B-V2L1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/T145/ZEUS-8B-V2L1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">T145/ZEUS-8B-V2L1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/T145__ZEUS-8B-V2L1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
T145/ZEUS-8B-V2L1
c2d7f009c769f7ebdef00412ad85f2d3bdea9869
19.959332
0
8.03
false
false
false
false
1.489274
0.319189
31.918864
0.501349
28.694208
0.123867
12.386707
0.312919
8.389262
0.388198
9.058073
0.36378
29.30888
false
false
2024-12-02
0
Removed