Column schema of the dump (dtype, with string-length ranges, numeric ranges, or distinct-value counts as reported by the dataset viewer):

| Column | Dtype | Range / distinct values |
|---|---|---|
| eval_name | string | lengths 12-111 |
| Precision | string | 3 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 2 classes |
| Architecture | string | 48 classes |
| Model | string | lengths 355-650 |
| fullname | string | lengths 4-102 |
| Model sha | string | lengths 0-40 |
| Average ⬆️ | float64 | 1.41-51.2 |
| Hub License | string | 25 classes |
| Hub ❤️ | int64 | 0-5.84k |
| #Params (B) | int64 | -1-140 |
| Available on the hub | bool | 2 classes |
| Not_Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04-107 |
| IFEval Raw | float64 | 0-0.87 |
| IFEval | float64 | 0-86.7 |
| BBH Raw | float64 | 0.28-0.75 |
| BBH | float64 | 0.81-63.5 |
| MATH Lvl 5 Raw | float64 | 0-0.51 |
| MATH Lvl 5 | float64 | 0-50.7 |
| GPQA Raw | float64 | 0.22-0.44 |
| GPQA | float64 | 0-24.9 |
| MUSR Raw | float64 | 0.29-0.59 |
| MUSR | float64 | 0-36.4 |
| MMLU-PRO Raw | float64 | 0.1-0.7 |
| MMLU-PRO | float64 | 0-66.8 |
| Maintainer's Highlight | bool | 2 classes |
| Upload To Hub Date | string | lengths 0-10 |
| Submission Date | string | 151 classes |
| Generation | int64 | 0-6 |
| Base Model | string | lengths 4-102 |
The flattened records reconstruct to one table row per evaluation. Notes on the columns:

- `eval_name` (omitted) is derivable: `fullname` with `/` replaced by `_`, plus a `_<Precision>` suffix.
- The `Model` column (omitted) held only HTML anchor markup linking each entry to `https://huggingface.co/<fullname>` and to its 📑 details dataset at `https://huggingface.co/datasets/open-llm-leaderboard/<org>__<name>-details`.
- `Weight type` is `Original`, `Flagged` is `false`, and `Maintainer's Highlight` is `false` on every row in this excerpt, so those columns are omitted.
- `T` is the emoji shorthand for `Type`: 💬 chat models (RLHF, DPO, IFT, ...); 🤝 base merges and moerges; 🔶 fine-tuned on domain-specific datasets; 🟩 continuously pretrained (the other two classes in the full dataset do not appear here).
- "On hub" abbreviates "Available on the hub". A blank Hub License or date cell was empty in the source. Benchmark cells give the raw value followed by the scaled score.

| fullname | Precision | T | Architecture | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | On hub | Not_Merged | MoE | Chat Template | CO₂ cost (kg) | IFEval (raw / score) | BBH (raw / score) | MATH Lvl 5 (raw / score) | GPQA (raw / score) | MUSR (raw / score) | MMLU-PRO (raw / score) | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 | float16 | 💬 | LlamaForCausalLM | 8ef9ef7e2bf522e707a7b090af55f2ec1eafd4b9 | 23.261322 | | 0 | 8 | false | true | true | true | 0.869474 | 0.685161 / 68.516093 | 0.507516 / 29.74055 | 0.054381 / 5.438066 | 0.258389 / 1.118568 | 0.383177 / 5.630469 | 0.362118 / 29.124187 | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 |
| xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001 | bfloat16 | 💬 | LlamaForCausalLM | 86673872245ad902f8d466bdc20edae9c115b965 | 20.032169 | | 0 | 8 | false | true | true | true | 0.675094 | 0.548224 / 54.822427 | 0.488717 / 26.839803 | 0.044562 / 4.456193 | 0.260906 / 1.454139 | 0.363271 / 2.942187 | 0.367104 / 29.678265 | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001 |
| xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table | float16 | 💬 | LlamaForCausalLM | abb3afe2b0398b24ed823b0124c8a72d354487bd | 23.498955 | | 0 | 8 | false | true | true | true | 1.379342 | 0.690931 / 69.093117 | 0.497846 / 28.119887 | 0.0929 / 9.29003 | 0.259228 / 1.230425 | 0.367333 / 3.083333 | 0.371592 / 30.176936 | 2024-09-22 | 2024-09-23 | 0 | xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table |
| xxx777xxxASD/L3.1-ClaudeMaid-4x8B | bfloat16 | 🤝 | MixtralForCausalLM | 2a98d9cb91c7aa775acbf5bfe7bb91beb2faf682 | 26.190883 | llama3.1 | 7 | 24 | true | true | false | true | 2.376185 | 0.669649 / 66.964875 | 0.507085 / 29.437348 | 0.128399 / 12.839879 | 0.291107 / 5.480984 | 0.428937 / 13.750521 | 0.358045 / 28.67169 | 2024-07-27 | 2024-07-28 | 0 | xxx777xxxASD/L3.1-ClaudeMaid-4x8B |
| yam-peleg/Hebrew-Gemma-11B-Instruct | float16 | 💬 | GemmaForCausalLM | a40259d1efbcac4829ed44d3b589716f615ed362 | 13.919763 | other | 22 | 10 | true | true | true | true | 1.937267 | 0.302077 / 30.207738 | 0.403578 / 16.862741 | 0.057402 / 5.740181 | 0.276007 / 3.467562 | 0.408854 / 9.973438 | 0.255402 / 17.266918 | 2024-03-06 | 2024-07-31 | 0 | yam-peleg/Hebrew-Gemma-11B-Instruct |
| yam-peleg/Hebrew-Mistral-7B | bfloat16 | 🔶 | MistralForCausalLM | 3d32134b5959492fd7efbbf16395352594bc89f7 | 13.302117 | apache-2.0 | 62 | 7 | true | true | true | false | 1.399281 | 0.232834 / 23.283443 | 0.433404 / 20.17694 | 0.049849 / 4.984894 | 0.279362 / 3.914989 | 0.397656 / 7.673698 | 0.278009 / 19.778738 | 2024-04-26 | 2024-07-11 | 0 | yam-peleg/Hebrew-Mistral-7B |
| yam-peleg/Hebrew-Mistral-7B-200K | float16 | 🔶 | MistralForCausalLM | 7b51c7b31e3d9e29ea964c579a45233cfad255fe | 10.644291 | apache-2.0 | 15 | 7 | true | true | true | false | 0.735312 | 0.185573 / 18.557317 | 0.414927 / 17.493603 | 0.023414 / 2.34139 | 0.276007 / 3.467562 | 0.376479 / 4.526563 | 0.257314 / 17.479314 | 2024-05-05 | 2024-07-11 | 0 | yam-peleg/Hebrew-Mistral-7B-200K |
| yam-peleg/Hebrew-Mistral-7B-200K | bfloat16 | 🟩 | MistralForCausalLM | 7b51c7b31e3d9e29ea964c579a45233cfad255fe | 8.235612 | apache-2.0 | 15 | 7 | true | true | true | true | 1.684494 | 0.17698 / 17.698041 | 0.34105 / 7.671324 | 0.021903 / 2.190332 | 0.253356 / 0.447427 | 0.374 / 4.416667 | 0.252909 / 16.989879 | 2024-05-05 | 2024-08-06 | 0 | yam-peleg/Hebrew-Mistral-7B-200K |
| ycros/BagelMIsteryTour-v2-8x7B | float16 | 🔶 | MixtralForCausalLM | 98a8b319707be3dab1659594da69a37ed8f8c148 | 24.258614 | cc-by-nc-4.0 | 16 | 46 | true | false | true | true | 3.649132 | 0.599432 / 59.943173 | 0.515924 / 31.699287 | 0.07855 / 7.854985 | 0.30453 / 7.270694 | 0.420292 / 11.303125 | 0.347324 / 27.480423 | 2024-01-19 | 2024-06-28 | 1 | ycros/BagelMIsteryTour-v2-8x7B (Merge) |
| ycros/BagelMIsteryTour-v2-8x7B | bfloat16 | 💬 | MixtralForCausalLM | 98a8b319707be3dab1659594da69a37ed8f8c148 | 24.724802 | cc-by-nc-4.0 | 16 | 46 | true | false | true | true | 3.619337 | 0.62621 / 62.620957 | 0.514194 / 31.366123 | 0.087613 / 8.761329 | 0.307886 / 7.718121 | 0.41375 / 10.31875 | 0.348072 / 27.563534 | 2024-01-19 | 2024-08-04 | 1 | ycros/BagelMIsteryTour-v2-8x7B (Merge) |
| yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table | bfloat16 | 💬 | LlamaForCausalLM | 97b2d0e790a6fcdf39c34a2043f0818368c7dcb3 | 22.974571 | | 0 | 8 | false | true | true | true | 0.618253 | 0.670898 / 67.089766 | 0.498661 / 28.170107 | 0.073263 / 7.326284 | 0.259228 / 1.230425 | 0.372698 / 3.853906 | 0.371592 / 30.176936 | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table |
| yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table | bfloat16 | 💬 | LlamaForCausalLM | e8786291c206d5cd1b01d29466e3b397278f4e2b | 24.877776 | | 0 | 8 | false | true | true | true | 0.640663 | 0.733271 / 73.327105 | 0.508036 / 29.308128 | 0.097432 / 9.743202 | 0.260067 / 1.342282 | 0.380604 / 5.008854 | 0.374834 / 30.537086 | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table |
| yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table | bfloat16 | 💬 | LlamaForCausalLM | 0d9cb29aa87b0c17ed011ffbc83803f3f6dd18e7 | 23.168114 | | 0 | 8 | false | true | true | true | 0.679554 | 0.678466 / 67.846647 | 0.494121 / 27.469588 | 0.095166 / 9.516616 | 0.259228 / 1.230425 | 0.364667 / 2.75 | 0.371759 / 30.195405 | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table |
| yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table | bfloat16 | 💬 | LlamaForCausalLM | 7a326a956e6169b287a04ef93cdc0342a0f3311a | 24.001677 | | 0 | 8 | false | true | true | true | 0.648184 | 0.713188 / 71.318768 | 0.502536 / 28.604424 | 0.093656 / 9.365559 | 0.259228 / 1.230425 | 0.371333 / 3.683333 | 0.368268 / 29.80755 | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table |
| yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001 | bfloat16 | 💬 | LlamaForCausalLM | e5c8baadbf6ce17b344596ad42bd3546f66e253e | 22.364867 | | 0 | 8 | false | true | true | true | 0.582235 | 0.649565 / 64.956538 | 0.497946 / 28.099199 | 0.048338 / 4.833837 | 0.259228 / 1.230425 | 0.377969 / 4.846094 | 0.372008 / 30.223109 | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001 |
| yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002 | bfloat16 | 💬 | LlamaForCausalLM | 064e237b850151938caf171a4c8c7e34c93e580e | 24.319539 | | 0 | 8 | false | true | true | true | 0.606022 | 0.719607 / 71.960731 | 0.504515 / 28.785911 | 0.07855 / 7.854985 | 0.260067 / 1.342282 | 0.383146 / 5.593229 | 0.373421 / 30.380098 | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002 |
| yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001 | bfloat16 | 💬 | LlamaForCausalLM | b685b90063258e05f8b4930fdbce2e565f13f620 | 22.384837 | | 0 | 8 | false | true | true | true | 0.649092 | 0.65044 / 65.043972 | 0.495788 / 27.825253 | 0.073263 / 7.326284 | 0.259228 / 1.230425 | 0.366031 / 2.853906 | 0.370263 / 30.029181 | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001 |
| yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002 | bfloat16 | 💬 | LlamaForCausalLM | 5ab3f2cfc96bdda3b5a629ab4a81adf7394ba90a | 23.522522 | | 0 | 8 | false | true | true | true | 0.60769 | 0.701597 / 70.159732 | 0.499155 / 28.120615 | 0.073263 / 7.326284 | 0.259228 / 1.230425 | 0.377906 / 4.638281 | 0.366938 / 29.659796 | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002 |
| yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002 | bfloat16 | 💬 | LlamaForCausalLM | 7a046b74179225d6055dd8aa601b5234f817b1e5 | 22.624782 | | 0 | 8 | false | true | true | true | 0.672016 | 0.648966 / 64.896586 | 0.491452 / 27.281064 | 0.068731 / 6.873112 | 0.261745 / 1.565996 | 0.389875 / 7.134375 | 0.351978 / 27.997562 | | 2024-09-30 | 0 | Removed |
| ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8 | bfloat16 | 🤝 | Qwen2ForCausalLM | 2c00dbc74e55d42fbc8b08f474fb9568f820edb9 | 9.604139 | apache-2.0 | 0 | 1 | true | true | true | false | 0.548428 | 0.150527 / 15.052727 | 0.397557 / 15.175392 | 0 / 0 | 0.28943 / 5.257271 | 0.387458 / 6.765625 | 0.238364 / 15.373818 | 2024-11-13 | 2024-11-13 | 0 | ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8 |
| ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18 | bfloat16 | 🔶 | Gemma2ForCausalLM | aed2a9061ffa21beaec0d617a9605e160136aab4 | 14.633781 | gemma | 0 | 2 | true | true | true | true | 6.200402 | 0.463095 / 46.309459 | 0.40529 / 16.301992 | 0.003776 / 0.377644 | 0.288591 / 5.145414 | 0.375427 / 4.728385 | 0.234458 / 14.93979 | 2024-10-30 | 2024-11-16 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge | bfloat16 | 🔶 | Gemma2ForCausalLM | b72be0a7879f0d82cb2024cfc1d02c370ce3efe8 | 15.737663 | gemma | 0 | 2 | true | true | true | true | 1.98799 | 0.521821 / 52.182099 | 0.414689 / 17.348337 | 0.008308 / 0.830816 | 0.283557 / 4.474273 | 0.351396 / 3.357813 | 0.246094 / 16.232639 | 2024-10-30 | 2024-11-16 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-17 | bfloat16 | 🔶 | Gemma2ForCausalLM | e6f82b93dae0b8207aa3252ab4157182e2610787 | 15.002982 | gemma | 1 | 2 | true | true | true | true | 1.104509 | 0.508157 / 50.815724 | 0.407627 / 16.234749 | 0 / 0 | 0.271812 / 2.908277 | 0.370062 / 3.891146 | 0.245512 / 16.167996 | 2024-10-16 | 2024-10-18 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24 | bfloat16 | 🔶 | Gemma2ForCausalLM | 38f56fcb99bd64278a1d90dd23aea527036329a0 | 14.019765 | gemma | 0 | 2 | true | true | true | true | 0.704859 | 0.505484 / 50.548434 | 0.381236 / 13.114728 | 0 / 0 | 0.28104 / 4.138702 | 0.350156 / 2.069531 | 0.228225 / 14.247193 | 2024-11-06 | 2024-11-06 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO | bfloat16 | 🔶 | Gemma2ForCausalLM | 531b2e2043285cb40cd0433f5ad43441f8ac6b6c | 14.516851 | gemma | 1 | 2 | true | true | true | true | 9.681597 | 0.474785 / 47.478468 | 0.389798 / 14.389413 | 0.042296 / 4.229607 | 0.274329 / 3.243848 | 0.37676 / 4.528385 | 0.219082 / 13.231383 | 2024-10-18 | 2024-10-27 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca | bfloat16 | 🔶 | Gemma2ForCausalLM | 5503b5e892be463fa4b1d265b8ba9ba4304af012 | 12.001731 | gemma | 2 | 2 | true | true | true | true | 1.184666 | 0.306473 / 30.647349 | 0.40716 / 16.922412 | 0.000755 / 0.075529 | 0.269295 / 2.572707 | 0.396917 / 7.914583 | 0.2249 / 13.877807 | 2024-10-27 | 2024-10-27 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-18 | bfloat16 | 🔶 | Gemma2ForCausalLM | c50b85f9b60b444f85fe230b8d77fcbc7b18ef91 | 15.503245 | gemma | 1 | 2 | true | true | true | true | 1.052664 | 0.517525 / 51.752461 | 0.413219 / 17.143415 | 0 / 0 | 0.27349 / 3.131991 | 0.374156 / 4.269531 | 0.250499 / 16.722074 | 2024-10-15 | 2024-10-18 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO | bfloat16 | 🔶 | Gemma2ForCausalLM | b9f41f53827b8a5a600546b41f63023bf84617a3 | 14.943472 | gemma | 0 | 2 | true | true | true | true | 1.610377 | 0.474235 / 47.423503 | 0.403894 / 16.538079 | 0.035498 / 3.549849 | 0.261745 / 1.565996 | 0.395333 / 7.416667 | 0.218501 / 13.166741 | 2024-10-22 | 2024-10-22 | 3 | google/gemma-2-2b |
| ymcki/gemma-2-2b-jpn-it-abliterated-24 | bfloat16 | 🔶 | Gemma2ForCausalLM | 06c129ba5261ee88e32035c88f90ca11d835175d | 15.604076 | gemma | 0 | 2 | true | true | true | true | 0.810442 | 0.497866 / 49.786566 | 0.41096 / 16.77259 | 0 / 0 | 0.277685 / 3.691275 | 0.39149 / 7.002865 | 0.24734 / 16.371158 | 2024-10-24 | 2024-10-25 | 3 | google/gemma-2-2b |
| yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO | bfloat16 | 💬 | LlamaForCausalLM | 0da9f780f7dd94ed1e10c8d3e082472ff2922177 | 18.075579 | apache-2.0 | 0 | 8 | true | true | true | true | 0.97203 | 0.46909 / 46.908979 | 0.439987 / 21.238563 | 0.055891 / 5.589124 | 0.302013 / 6.935123 | 0.401219 / 9.61901 | 0.263464 / 18.162677 | 2024-09-24 | 2024-09-30 | 0 | yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO |
| yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties | bfloat16 | 🤝 | LlamaForCausalLM | 998d15b32900bc230727c8a7984e005f611723e9 | 19.134801 | apache-2.0 | 0 | 8 | true | false | true | false | 0.914144 | 0.401271 / 40.127085 | 0.461579 / 23.492188 | 0.082326 / 8.232628 | 0.275168 / 3.355705 | 0.421094 / 11.003385 | 0.35738 / 28.597813 | 2024-09-22 | 2024-09-23 | 1 | yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties (Merge) |
| yuvraj17/Llama3-8B-abliterated-Spectrum-slerp | bfloat16 | 🤝 | LlamaForCausalLM | 28789950975ecf5aac846c3f2c0a5d6841651ee6 | 17.687552 | apache-2.0 | 0 | 8 | true | false | true | false | 0.82666 | 0.288488 / 28.848788 | 0.497791 / 28.54693 | 0.058157 / 5.81571 | 0.301174 / 6.823266 | 0.399823 / 11.011198 | 0.325715 / 25.079418 | 2024-09-22 | 2024-09-23 | 1 | yuvraj17/Llama3-8B-abliterated-Spectrum-slerp (Merge) |
| zake7749/gemma-2-2b-it-chinese-kyara-dpo | bfloat16 | 🔶 | Gemma2ForCausalLM | bbc011dae0416c1664a0287f3a7a0f9563deac91 | 19.334585 | gemma | 6 | 2 | true | true | true | false | 1.279309 | 0.538208 / 53.820751 | 0.425746 / 19.061804 | 0.066465 / 6.646526 | 0.266779 / 2.237136 | 0.457563 / 16.761979 | 0.257314 / 17.479314 | 2024-08-18 | 2024-10-17 | 1 | zake7749/gemma-2-2b-it-chinese-kyara-dpo (Merge) |
| zelk12/Gemma-2-TM-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 42366d605e6bdad354a5632547e37d34d300ff7a | 30.151929 | | 0 | 10 | false | true | true | true | 1.967893 | 0.804462 / 80.446216 | 0.598659 / 42.049491 | 0 / 0 | 0.346477 / 12.863535 | 0.41524 / 11.238281 | 0.408826 / 34.314051 | 2024-11-06 | 2024-11-06 | 1 | zelk12/Gemma-2-TM-9B (Merge) |
| zelk12/MT-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | b78f8883614cbbdf182ebb4acf8a8c124bc782ae | 33.041356 | | 0 | 10 | false | true | true | true | 3.362746 | 0.788625 / 78.862529 | 0.61 / 44.011247 | 0.133686 / 13.36858 | 0.346477 / 12.863535 | 0.421688 / 11.577604 | 0.438082 / 37.564642 | 2024-10-23 | 2024-10-23 | 1 | zelk12/MT-Gen1-gemma-2-9B (Merge) |
| zelk12/MT-Gen2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | c723f8b9b7334fddd1eb8b6e5230b76fb18139a5 | 33.644495 | | 1 | 10 | false | true | true | true | 1.989448 | 0.790749 / 79.074855 | 0.610049 / 44.107782 | 0.148792 / 14.879154 | 0.346477 / 12.863535 | 0.432292 / 13.303125 | 0.438747 / 37.63852 | 2024-11-10 | 2024-11-10 | 1 | zelk12/MT-Gen2-gemma-2-9B (Merge) |
| zelk12/MT-Merge-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | f4c3b001bc8692bcbbd7005b6f8db048e651aa46 | 33.393208 | | 3 | 10 | false | true | true | true | 3.219056 | 0.803538 / 80.353795 | 0.611838 / 44.320842 | 0.13142 / 13.141994 | 0.348154 / 13.087248 | 0.425625 / 12.103125 | 0.43617 / 37.352246 | 2024-10-22 | 2024-10-22 | 1 | zelk12/MT-Merge-gemma-2-9B (Merge) |
| zelk12/MT-Merge1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 71bb4577c877715f3f6646a224b184544639c856 | 33.130536 | | 1 | 10 | false | true | true | true | 4.036662 | 0.788625 / 78.862529 | 0.61 / 44.058246 | 0.126888 / 12.688822 | 0.35151 / 13.534676 | 0.424385 / 12.148177 | 0.437417 / 37.490765 | 2024-11-07 | 2024-11-07 | 1 | zelk12/MT-Merge1-gemma-2-9B (Merge) |
| zelk12/MT-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 24e1f894517b86dd866c1a5999ced4a5924dcd90 | 30.239612 | | 2 | 10 | false | true | true | true | 3.023399 | 0.796843 / 79.684349 | 0.60636 / 43.324243 | 0.003021 / 0.302115 | 0.345638 / 12.751678 | 0.407115 / 9.55599 | 0.422374 / 35.819297 | 2024-10-11 | 2024-10-11 | 1 | zelk12/MT-gemma-2-9B (Merge) |
| zelk12/MT1-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 939ac6c12059a18fc1117cdb3861f46816eff2fb | 33.232259 | | 0 | 10 | false | true | true | true | 3.362485 | 0.797443 / 79.744301 | 0.611779 / 44.273282 | 0.122356 / 12.23565 | 0.34396 / 12.527964 | 0.430958 / 13.103125 | 0.437583 / 37.509235 | 2024-10-23 | 2024-10-24 | 1 | zelk12/MT1-Gen1-gemma-2-9B (Merge) |
| zelk12/MT1-Gen2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | aeaca7dc7d50a425a5d3c38d7c4a7daf1c772ad4 | 33.142398 | | 2 | 10 | false | true | true | true | 1.995995 | 0.798367 / 79.836722 | 0.609599 / 43.919191 | 0.113293 / 11.329305 | 0.352349 / 13.646532 | 0.428354 / 12.844271 | 0.435505 / 37.278369 | 2024-11-11 | 2024-11-11 | 1 | zelk12/MT1-Gen2-gemma-2-9B (Merge) |
| zelk12/MT1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 3a5e77518ca9c3c8ea2edac4c03bc220ee91f3ed | 33.633829 | | 1 | 10 | false | true | true | true | 3.345719 | 0.79467 / 79.467036 | 0.610875 / 44.161526 | 0.149547 / 14.954683 | 0.345638 / 12.751678 | 0.432229 / 13.161979 | 0.435755 / 37.306073 | 2024-10-12 | 2024-10-14 | 1 | zelk12/MT1-gemma-2-9B (Merge) |
| zelk12/MT2-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 167abf8eb4ea01fecd42dc32ad68160c51a8685a | 32.460223 | | 0 | 10 | false | true | true | true | 3.38321 | 0.785578 / 78.557782 | 0.61008 / 44.141103 | 0.101208 / 10.120846 | 0.343121 / 12.416107 | 0.424323 / 12.007031 | 0.437666 / 37.518469 | 2024-10-24 | 2024-10-27 | 1 | zelk12/MT2-Gen1-gemma-2-9B (Merge) |
| zelk12/MT2-Gen2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 24c487499b5833424ffb9932eed838bb254f61b4 | 33.471172 | | 3 | 10 | false | true | true | true | 2.037441 | 0.7889 / 78.890012 | 0.609292 / 44.044503 | 0.148036 / 14.803625 | 0.346477 / 12.863535 | 0.427021 / 12.577604 | 0.43883 / 37.647754 | 2024-11-12 | 2024-11-12 | 1 | zelk12/MT2-Gen2-gemma-2-9B (Merge) |
| zelk12/MT2-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | d20d7169ce0f53d586504c50b4b7dc470bf8a781 | 33.2825 | | 1 | 10 | false | true | true | true | 3.19411 | 0.788575 / 78.857542 | 0.611511 / 44.167481 | 0.147281 / 14.728097 | 0.347315 / 12.975391 | 0.421656 / 11.540365 | 0.436835 / 37.426123 | 2024-10-14 | 2024-10-15 | 1 | zelk12/MT2-gemma-2-9B (Merge) |
| zelk12/MT3-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | cd78df9e67e2e710d8d305f5a03a92c01b1b425d | 31.054845 | | 1 | 10 | false | true | true | true | 3.113666 | 0.783779 / 78.377926 | 0.610676 / 44.119495 | 0.032477 / 3.247734 | 0.346477 / 12.863535 | 0.415115 / 10.75599 | 0.43268 / 36.964391 | 2024-10-24 | 2024-10-28 | 1 | zelk12/MT3-Gen1-gemma-2-9B (Merge) |
| zelk12/MT3-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | d501b6ea59896fac3dc0a623501a5493b3573cde | 32.352524 | | 1 | 10 | false | true | true | true | 3.136653 | 0.778609 / 77.860854 | 0.613078 / 44.248465 | 0.104985 / 10.498489 | 0.344799 / 12.639821 | 0.424292 / 11.903125 | 0.43268 / 36.964391 | 2024-10-15 | 2024-10-16 | 1 | zelk12/MT3-gemma-2-9B (Merge) |
| zelk12/MT4-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 6ed2c66246c7f354decfd3579acb534dc4b0b48c | 33.544994 | | 0 | 10 | false | true | true | true | 2.103561 | 0.7895 / 78.949964 | 0.609383 / 44.009524 | 0.150302 / 15.030211 | 0.34396 / 12.527964 | 0.432229 / 13.095313 | 0.438913 / 37.656989 | 2024-10-25 | 2024-10-29 | 1 | zelk12/MT4-Gen1-gemma-2-9B (Merge) |
| zelk12/MT4-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 2167ea02baf9145a697a7d828a17c75b86e5e282 | 33.447349 | | 0 | 10 | false | true | true | true | 3.155259 | 0.776161 / 77.616059 | 0.607314 / 43.553827 | 0.173716 / 17.371601 | 0.338087 / 11.744966 | 0.430927 / 12.999219 | 0.436586 / 37.398419 | 2024-10-16 | 2024-10-20 | 1 | zelk12/MT4-gemma-2-9B (Merge) |
| zelk12/MT5-Gen1-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | 0291b776e80f38381788cd8f1fb2c3435ad891b5 | 31.897632 | | 0 | 10 | false | true | true | true | 2.017253 | 0.78313 / 78.312987 | 0.611048 / 44.183335 | 0.068731 / 6.873112 | 0.347315 / 12.975391 | 0.420385 / 11.614844 | 0.436835 / 37.426123 | 2024-10-25 | 2024-10-31 | 1 | zelk12/MT5-Gen1-gemma-2-9B (Merge) |
| zelk12/MT5-gemma-2-9B | bfloat16 | 🤝 | Gemma2ForCausalLM | b627ae7d796b1ae85b59c55e0e043b8d3ae73d83 | 32.595305 | | 0 | 10 | false | true | true | true | 3.26983 | 0.804787 / 80.478685 | 0.611223 / 44.271257 | 0.095166 / 9.516616 | 0.343121 / 12.416107 | 0.420385 / 11.48151 | 0.436669 / 37.407654 | 2024-10-19 | 2024-10-21 | 1 | zelk12/MT5-gemma-2-9B (Merge) |
| zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 | bfloat16 | 🤝 | Gemma2ForCausalLM | b4208ddf6c741884c16c77b9433d9ead8f216354 | 30.344893 | | 2 | 10 | false | true | true | true | 3.443191 | 0.764895 / 76.489492 | 0.607451 / 43.706516 | 0.013595 / 1.359517 | 0.349832 / 13.310962 | 0.413625 / 10.303125 | 0.432098 / 36.899749 | 2024-10-03 | 2024-10-03 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 (Merge) |
| zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 | bfloat16 | 🤝 | Gemma2ForCausalLM | e652c9e07265526851dad994f4640aa265b9ab56 | 33.300246 | | 1 | 10 | false | true | true | true | 3.194991 | 0.770665 / 77.066517 | 0.607543 / 43.85035 | 0.155589 / 15.558912 | 0.343121 / 12.416107 | 0.43226 / 13.132552 | 0.439993 / 37.777039 | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge) |
| zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 | bfloat16 | 🤝 | Gemma2ForCausalLM | eb0e589291630ba20328db650f74af949d217a97 | 28.421762 | | 0 | 10 | false | true | true | true | 3.751453 | 0.720806 / 72.080635 | 0.59952 / 42.487153 | 0 / 0 | 0.349832 / 13.310962 | 0.395115 / 7.75599 | 0.414063 / 34.895833 | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 (Merge) |
| zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 | bfloat16 | 🤝 | Gemma2ForCausalLM | 76f56b25bf6d8704282f8c77bfda28ca384883bc | 30.113979 | | 1 | 10 | false | true | true | true | 3.413675 | 0.759999 / 75.999902 | 0.606626 / 43.633588 | 0.012085 / 1.208459 | 0.348154 / 13.087248 | 0.410958 / 9.836458 | 0.432264 / 36.918218 | 2024-10-07 | 2024-10-11 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 (Merge) |
| zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 | bfloat16 | 🤝 | Gemma2ForCausalLM | 1e3e623e9f0b386bfd967c629dd39c87daef5bed | 31.626376 | | 1 | 10 | false | true | true | true | 6.461752 | 0.761523 / 76.152276 | 0.609878 / 43.941258 | 0.073263 / 7.326284 | 0.341443 / 12.192394 | 0.431021 / 13.310937 | 0.431516 / 36.835106 | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge) |
| zelk12/recoilme-gemma-2-Ifable-9B-v0.1 | bfloat16 | 🤝 | Gemma2ForCausalLM | 8af6620b39c9a36239879b6b2bd88f66e9e9d930 | 32.254423 | | 0 | 10 | false | true | true | true | 6.542869 | 0.794396 / 79.439554 | 0.60644 / 43.39057 | 0.09139 / 9.138973 | 0.35151 / 13.534676 | 0.420229 / 11.095313 | 0.432347 / 36.927453 | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge) |
| zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 | bfloat16 | 🤝 | Gemma2ForCausalLM | ced039b03be6f65ac0f713efcee76c6534e65639 | 32.448061 | | 0 | 10 | false | true | true | true | 3.13222 | 0.744537 / 74.453672 | 0.597759 / 42.132683 | 0.180514 / 18.05136 | 0.34396 / 12.527964 | 0.429469 / 12.183594 | 0.418052 / 35.339096 | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge) |
| zetasepic/Qwen2.5-72B-Instruct-abliterated | bfloat16 | 🔶 | Qwen2ForCausalLM | af94b3c05c9857dbac73afb1cbce00e4833ec9ef | 45.293139 | other | 9 | 72 | true | true | true | false | 18.809182 | 0.715261 / 71.526106 | 0.715226 / 59.912976 | 0.46148 / 46.148036 | 0.406879 / 20.917226 | 0.471917 / 19.122917 | 0.587184 / 54.131575 | 2024-10-01 | 2024-11-08 | 2 | Qwen/Qwen2.5-72B |
| zhengr/MixTAO-7Bx2-MoE-v8.1 | bfloat16 | 🔶 | MixtralForCausalLM | 828e963abf2db0f5af9ed0d4034e538fc1cf5f40 | 17.168311 | apache-2.0 | 54 | 12 | true | true | false | true | 0.92739 | 0.418781 / 41.878106 | 0.420194 / 19.176907 | 0.066465 / 6.646526 | 0.298658 / 6.487696 | 0.397625 / 8.303125 | 0.284658 / 20.517509 | 2024-02-26 | 2024-06-27 | 0 | zhengr/MixTAO-7Bx2-MoE-v8.1 |
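To work with this table programmatically rather than through the viewer, a minimal sketch using the `datasets` library. The dataset id `open-llm-leaderboard/contents` and the `train` split are assumptions inferred from the `open-llm-leaderboard/...-details` links in the records above; the column names follow the schema table:

```python
# Minimal sketch: load the leaderboard contents table and rank small models.
# Assumption: the table lives at "open-llm-leaderboard/contents" under a
# "train" split; adjust both if the leaderboard publishes it differently.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
df = ds.to_pandas()  # pandas ships as a dependency of `datasets`

# Identity columns plus the scaled per-benchmark scores from the schema.
cols = [
    "fullname", "Precision", "#Params (B)", "Average ⬆️",
    "IFEval", "BBH", "MATH Lvl 5", "GPQA", "MUSR", "MMLU-PRO",
]

# Top ten entries at or below 10B parameters, ranked by the average score.
top_small = (
    df[df["#Params (B)"] <= 10]
    .sort_values("Average ⬆️", ascending=False)
    .head(10)[cols]
)
print(top_small.to_string(index=False))
```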