Columns (dtype, and observed range or number of distinct values):

| Column | Dtype | Range / values |
| --- | --- | --- |
| eval_name | string | lengths 12 to 111 |
| Precision | string | 3 values |
| Type | string | 6 values |
| T | string | 6 values |
| Weight type | string | 2 values |
| Architecture | string | 48 values |
| Model | string | lengths 355 to 650 |
| fullname | string | lengths 4 to 102 |
| Model sha | string | lengths 0 to 40 |
| Average ⬆️ | float64 | 1.41 to 51.2 |
| Hub License | string | 25 values |
| Hub ❤️ | int64 | 0 to 5.84k |
| #Params (B) | int64 | -1 to 140 |
| Available on the hub | bool | 2 classes |
| Not_Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04 to 107 |
| IFEval Raw | float64 | 0 to 0.87 |
| IFEval | float64 | 0 to 86.7 |
| BBH Raw | float64 | 0.28 to 0.75 |
| BBH | float64 | 0.81 to 63.5 |
| MATH Lvl 5 Raw | float64 | 0 to 0.51 |
| MATH Lvl 5 | float64 | 0 to 50.7 |
| GPQA Raw | float64 | 0.22 to 0.44 |
| GPQA | float64 | 0 to 24.9 |
| MUSR Raw | float64 | 0.29 to 0.59 |
| MUSR | float64 | 0 to 36.4 |
| MMLU-PRO Raw | float64 | 0.1 to 0.7 |
| MMLU-PRO | float64 | 0 to 66.8 |
| Maintainer's Highlight | bool | 2 classes |
| Upload To Hub Date | string | lengths 0 to 10 |
| Submission Date | string | 151 values |
| Generation | int64 | 0 to 6 |
| Base Model | string | lengths 4 to 102 |
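Each record below lists these 36 fields in schema order, one field per line (a few records omit empty fields). A minimal sketch of how one might load and query this table with the `datasets` library; the repo id `open-llm-leaderboard/contents` and the `train` split are assumptions here, so adjust them to wherever your copy of the data lives:

```python
# Minimal sketch (not the leaderboard's own tooling): load the table and filter it.
# Assumptions: the rows are published as "open-llm-leaderboard/contents" with a
# "train" split, and the column names match the schema above (emoji included).
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
df = ds.to_pandas()

# Dtypes should line up with the schema: strings, float64 scores, int64 counts, bools.
print(df.dtypes)

# Example query: maintainer-highlighted chat models ("💬"), ranked by the
# normalized average score.
chat = df[(df["T"] == "💬") & df["Maintainer's Highlight"]]
top = chat.sort_values("Average ⬆️", ascending=False).head(10)
print(top[["fullname", "#Params (B)", "Average ⬆️", "Hub License"]].to_string(index=False))
```

In each record, `Average ⬆️` is the mean of the six normalized benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO), while the matching `Raw` columns hold the pre-normalization scores on a 0 to 1 scale; for example, zephyr-7b-alpha's average of 18.57 is the mean of 51.91, 23.96, 1.74, 6.38, 7.50 and 19.94.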
HuggingFaceH4_zephyr-7b-alpha_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-7b-alpha
2ce2d025864af849b3e5029e2ec9d568eeda892d
18.571864
mit
1,101
7
true
true
true
false
true
0.795675
0.519148
51.914808
0.458786
23.955291
0.017372
1.73716
0.297819
6.375839
0.394958
7.503125
0.279505
19.944962
true
2023-10-09
2024-06-12
1
mistralai/Mistral-7B-v0.1
HuggingFaceH4_zephyr-7b-beta_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-7b-beta
b70e0c9a2d9e14bd1e812d3c398e5f313e93b473
17.767061
mit
1,609
7
true
true
true
false
true
0.555023
0.495043
49.504315
0.431582
21.487542
0.02719
2.719033
0.290268
5.369128
0.392542
7.734375
0.278092
19.787973
true
2023-10-26
2024-06-12
1
mistralai/Mistral-7B-v0.1
HuggingFaceH4_zephyr-7b-gemma-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-gemma-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-gemma-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-gemma-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-7b-gemma-v0.1
03b3427d0ed07d2e0f86c0a7e53d82d4beef9540
15.929338
other
121
8
true
true
true
false
true
1.481775
0.336374
33.637415
0.462374
23.751163
0.075529
7.55287
0.294463
5.928412
0.373969
4.179427
0.284741
20.526743
true
2024-03-01
2024-06-12
2
google/gemma-7b
HuggingFaceH4_zephyr-orpo-141b-A35b-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-orpo-141b-A35b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1
a3be084543d278e61b64cd600f28157afc79ffd3
34.063023
apache-2.0
261
140
true
true
true
false
true
42.067786
0.651089
65.108911
0.629044
47.503796
0.200906
20.090634
0.378356
17.114094
0.446521
14.715104
0.45861
39.845597
true
2024-04-10
2024-06-12
1
mistral-community/Mixtral-8x22B-v0.1
HuggingFaceTB_SmolLM-1.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-1.7B
673a07602ca1191e5bc2ddda428e2f608a0a14c0
5.425399
apache-2.0
161
1
true
true
true
false
false
0.324307
0.236157
23.615673
0.318052
4.411128
0.007553
0.755287
0.241611
0
0.342094
2.128385
0.114777
1.641918
true
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-1.7B
HuggingFaceTB_SmolLM-1.7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-1.7B-Instruct
0ad161e59935a9a691dfde2818df8b98786f30a7
5.138222
apache-2.0
103
1
true
true
true
false
true
0.317023
0.234783
23.47826
0.288511
2.080374
0
0
0.260067
1.342282
0.348667
2.083333
0.116606
1.84508
true
2024-07-15
2024-07-18
1
HuggingFaceTB/SmolLM-1.7B
HuggingFaceTB_SmolLM-135M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-135M
eec6e461571fba3e197a57c298f60b75422eae02
6.838197
apache-2.0
172
0
true
true
true
false
false
0.343378
0.212476
21.247623
0.304605
3.2854
0.006798
0.679758
0.258389
1.118568
0.436604
13.342188
0.112201
1.355644
true
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-135M
HuggingFaceTB_SmolLM-135M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-135M-Instruct
8ca7af58e27777cae460ad8ca3ab9db15f5c160d
3.564171
apache-2.0
98
0
true
true
true
false
true
0.467805
0.121401
12.140122
0.301508
2.692958
0
0
0.259228
1.230425
0.363458
3.365625
0.117603
1.955895
true
2024-07-15
2024-10-12
1
HuggingFaceTB/SmolLM-135M
HuggingFaceTB_SmolLM-360M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-360M
318cc630b73730bfd712e5873063156ffb8936b5
6.147596
apache-2.0
61
0
true
true
true
false
false
0.36526
0.213351
21.335058
0.306452
3.284915
0.004532
0.453172
0.267617
2.348993
0.401781
8.089323
0.112367
1.374113
true
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-360M
HuggingFaceTB_SmolLM-360M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-360M-Instruct
8e951de8c220295ea4f85d078c4e320df7137535
4.706784
apache-2.0
76
0
true
true
true
false
true
0.366501
0.195165
19.516549
0.288511
2.080374
0
0
0.264262
1.901566
0.347177
2.897135
0.116606
1.84508
true
2024-07-15
2024-08-20
1
HuggingFaceTB/SmolLM-360M
HuggingFaceTB_SmolLM2-1.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-1.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-1.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-1.7B
4fa12cab4f5f53670b05125fb9d2873af587d231
9.495504
apache-2.0
71
1
true
true
true
false
false
0.325026
0.244
24.400036
0.345259
9.301788
0.021148
2.114804
0.279362
3.914989
0.348542
4.601042
0.213763
12.640366
true
2024-10-30
2024-11-06
0
HuggingFaceTB/SmolLM2-1.7B
HuggingFaceTB_SmolLM2-1.7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-1.7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-1.7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-1.7B-Instruct
d1bb90bcfbe0f211109880f4da18da66f229c4f6
14.745339
apache-2.0
345
1
true
true
true
false
true
0.324961
0.536784
53.678351
0.359862
10.917989
0.041541
4.154079
0.279362
3.914989
0.342125
4.098958
0.205369
11.707668
true
2024-10-31
2024-11-06
1
HuggingFaceTB/SmolLM2-1.7B-Instruct (Merge)
HuggingFaceTB_SmolLM2-135M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-135M
28e66ca6931668447a3bac213f23d990ad3b0e2b
5.557677
apache-2.0
27
0
true
true
true
false
false
0.333905
0.1833
18.330031
0.304423
3.708078
0.002266
0.226586
0.248322
0
0.411177
10.030469
0.109458
1.050901
true
2024-10-31
2024-11-06
0
HuggingFaceTB/SmolLM2-135M
HuggingFaceTB_SmolLM2-135M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-135M-Instruct
5a33ba103645800d7b3790c4448546c1b73efc71
6.467365
apache-2.0
57
0
true
true
true
false
true
0.338376
0.288314
28.83139
0.312432
4.720808
0.003021
0.302115
0.235738
0
0.366219
3.677344
0.111453
1.272533
true
2024-10-31
2024-11-06
0
HuggingFaceTB/SmolLM2-135M-Instruct
HuggingFaceTB_SmolLM2-135M-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-135M-Instruct
5a33ba103645800d7b3790c4448546c1b73efc71
2.992599
apache-2.0
57
0
true
true
true
false
false
0.348754
0.059252
5.925167
0.313475
4.796276
0.001511
0.151057
0.23406
0
0.387146
6.059896
0.109209
1.023197
true
2024-10-31
2024-11-14
0
HuggingFaceTB/SmolLM2-135M-Instruct
HuggingFaceTB_SmolLM2-360M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-360M
3ce05f63c246c44616da500b47b01f082f4d3bcc
6.100225
apache-2.0
23
0
true
true
true
false
false
0.386658
0.211452
21.145228
0.323348
5.543603
0.003021
0.302115
0.245805
0
0.395427
7.728385
0.116938
1.882018
true
2024-10-31
2024-11-06
0
HuggingFaceTB/SmolLM2-360M
HuggingFaceTB_SmolLM2-360M-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-360M-Instruct
4873f67095301d304753fae05bc09ec766634e50
3.10002
apache-2.0
47
0
true
true
true
false
false
0.392382
0.083032
8.303191
0.30527
3.299047
0.008308
0.830816
0.265101
2.013423
0.342281
2.751823
0.112616
1.401817
true
2024-10-31
2024-11-14
0
HuggingFaceTB/SmolLM2-360M-Instruct
HuggingFaceTB_SmolLM2-360M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-360M-Instruct
4873f67095301d304753fae05bc09ec766634e50
8.001097
apache-2.0
47
0
true
true
true
false
true
0.375819
0.38416
38.415959
0.314351
4.173864
0.006798
0.679758
0.255034
0.671141
0.346125
2.765625
0.111702
1.300236
true
2024-10-31
2024-11-06
0
HuggingFaceTB/SmolLM2-360M-Instruct
HumanLLMs_Humanish-LLama3-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-LLama3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-LLama3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-LLama3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HumanLLMs/Humanish-LLama3-8B-Instruct
42f73ada2b7fb16f18a75404d72b7911bf1e65ce
22.564911
llama3
2
8
true
true
true
false
true
0.748278
0.64979
64.979033
0.496771
28.012477
0.095921
9.592145
0.255872
0.782998
0.358156
2.002865
0.37018
30.019947
false
2024-10-04
2024-10-05
1
meta-llama/Meta-Llama-3-8B-Instruct
HumanLLMs_Humanish-Mistral-Nemo-Instruct-2407_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-Mistral-Nemo-Instruct-2407-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407
45b80bdce8d447ef494af06751904afcc607eb37
23.0069
apache-2.0
3
12
true
true
true
false
true
1.620283
0.545127
54.512693
0.526178
32.709613
0.083837
8.383686
0.287752
5.033557
0.39676
9.395052
0.352061
28.006797
false
2024-10-06
2024-10-06
2
mistralai/Mistral-Nemo-Base-2407
HumanLLMs_Humanish-Qwen2.5-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-Qwen2.5-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-Qwen2.5-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HumanLLMs/Humanish-Qwen2.5-7B-Instruct
7d2c71d926832d6e257ad2776011494dbac2d151
26.665374
apache-2.0
2
7
true
true
true
false
true
1.193393
0.728425
72.842502
0.536368
34.478998
0
0
0.298658
6.487696
0.398063
8.424479
0.439827
37.75857
false
2024-10-05
2024-10-05
2
Qwen/Qwen2.5-7B
IDEA-CCNL_Ziya-LLaMA-13B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IDEA-CCNL/Ziya-LLaMA-13B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IDEA-CCNL__Ziya-LLaMA-13B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
IDEA-CCNL/Ziya-LLaMA-13B-v1
64d931f346e1a49ea3bbca07a83137075bab1c66
3.906425
gpl-3.0
273
13
true
true
true
false
false
1.108257
0.169686
16.968643
0.287703
1.463617
0
0
0.249161
0
0.375052
3.88151
0.110123
1.124778
true
2023-05-16
2024-06-12
0
IDEA-CCNL/Ziya-LLaMA-13B-v1
Infinirc_Infinirc-Llama3-8B-2G-Release-v1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Infinirc__Infinirc-Llama3-8B-2G-Release-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0
9c542d9ec3f86e145ae445c200c6ebe9066e8cd6
13.087133
llama3
1
8
true
true
true
false
false
1.818723
0.202434
20.243399
0.435074
20.831165
0.012085
1.208459
0.299497
6.599553
0.460938
16.750521
0.216007
12.889702
false
2024-06-26
2024-09-29
0
Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0
Intel_neural-chat-7b-v3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3
fc679274dfcd28a8b6087634f71af7ed2a0659c4
17.943646
apache-2.0
67
7
true
true
true
false
false
0.48929
0.277797
27.779736
0.504832
30.205692
0.021903
2.190332
0.291946
5.592841
0.50549
23.019531
0.269864
18.873744
true
2023-10-25
2024-06-12
1
mistralai/Mistral-7B-v0.1
Intel_neural-chat-7b-v3-1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3-1
c0d379a49c1c0579529d5e6f2e936ddb759552a8
21.004986
apache-2.0
545
7
true
true
true
false
false
0.563692
0.46869
46.868974
0.505157
29.739752
0.031722
3.172205
0.290268
5.369128
0.497896
22.236979
0.267786
18.642878
true
2023-11-14
2024-06-12
1
mistralai/Mistral-7B-v0.1
Intel_neural-chat-7b-v3-2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3-2
0d8f77647810d21d935ea90c66d6339b85e65a75
21.433647
apache-2.0
56
7
true
true
true
false
false
0.560441
0.49884
49.883975
0.503223
30.237458
0.045317
4.531722
0.290268
5.369128
0.489521
20.056771
0.266705
18.522828
true
2023-11-21
2024-06-12
0
Intel/neural-chat-7b-v3-2
Intel_neural-chat-7b-v3-3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3-3
bdd31cf498d13782cc7497cba5896996ce429f91
19.99112
apache-2.0
75
7
true
true
true
false
false
0.559524
0.476259
47.625855
0.487662
27.753851
0.006798
0.679758
0.28943
5.257271
0.485958
20.578125
0.262467
18.051862
true
2023-12-09
2024-06-12
2
mistralai/Mistral-7B-v0.1
IntervitensInc_internlm2_5-20b-llamafied_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/IntervitensInc/internlm2_5-20b-llamafied" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IntervitensInc/internlm2_5-20b-llamafied</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IntervitensInc__internlm2_5-20b-llamafied-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
IntervitensInc/internlm2_5-20b-llamafied
0b6fc3cc0b9bf3529816061eb508483c20b77fe9
29.204293
apache-2.0
2
19
true
true
true
false
false
1.381128
0.340995
34.099523
0.747847
63.47058
0.170695
17.069486
0.338087
11.744966
0.447542
14.942708
0.405086
33.898493
false
2024-08-06
2024-11-11
0
IntervitensInc/internlm2_5-20b-llamafied
Isaak-Carter_JOSIEv4o-8b-stage1-v4_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/JOSIEv4o-8b-stage1-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/JOSIEv4o-8b-stage1-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__JOSIEv4o-8b-stage1-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/JOSIEv4o-8b-stage1-v4
a8380a7be51b547761824e524b3d95ac73203122
15.567377
apache-2.0
1
8
true
true
true
false
false
0.890582
0.255266
25.526603
0.472497
25.787276
0.046828
4.682779
0.291946
5.592841
0.365438
6.079687
0.331616
25.735077
false
2024-08-03
2024-08-03
0
Isaak-Carter/JOSIEv4o-8b-stage1-v4
Isaak-Carter_JOSIEv4o-8b-stage1-v4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/JOSIEv4o-8b-stage1-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/JOSIEv4o-8b-stage1-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__JOSIEv4o-8b-stage1-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/JOSIEv4o-8b-stage1-v4
a8380a7be51b547761824e524b3d95ac73203122
15.419272
apache-2.0
1
8
true
true
true
false
false
0.879882
0.247697
24.769722
0.475807
25.919578
0.045317
4.531722
0.291107
5.480984
0.364104
6.346354
0.329205
25.467272
false
2024-08-03
2024-08-03
0
Isaak-Carter/JOSIEv4o-8b-stage1-v4
Isaak-Carter_Josiefied-Qwen2.5-7B-Instruct-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__Josiefied-Qwen2.5-7B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated
879168f9ce9fac315a19dd4f4c7df5253bb660f2
26.857295
0
7
false
true
true
false
true
1.076791
0.731747
73.174732
0.539638
34.904316
0
0
0.302852
7.04698
0.408667
9.616667
0.42761
36.401079
false
2024-09-21
0
Removed
Isaak-Carter_Josiefied-Qwen2.5-7B-Instruct-abliterated-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__Josiefied-Qwen2.5-7B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2
5d07f58562422feb9f25c9c048e40356d2cf7e4b
27.81796
apache-2.0
4
7
true
true
true
false
true
1.130915
0.784104
78.410396
0.531092
33.29454
0
0
0.298658
6.487696
0.435396
13.957813
0.412816
34.757314
false
2024-09-20
2024-09-21
1
Qwen/Qwen2.5-7B
J-LAB_Thynk_orpo_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/J-LAB/Thynk_orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">J-LAB/Thynk_orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/J-LAB__Thynk_orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
J-LAB/Thynk_orpo
c6606d402f26d005b9f1a71a1cde9139d1cffb2a
16.974407
0
3
false
true
true
false
false
1.214764
0.210178
21.017788
0.446311
22.062784
0.130665
13.066465
0.292785
5.704698
0.451479
15.201563
0.323138
24.793144
false
2024-10-14
0
Removed
Jacoby746_Casual-Magnum-34B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Casual-Magnum-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Casual-Magnum-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Casual-Magnum-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Casual-Magnum-34B
b628c6959441db75460cfd49536322b1ea46130e
23.571335
apache-2.0
1
34
true
false
true
false
false
3.426697
0.193017
19.301675
0.603205
43.051568
0.07855
7.854985
0.372483
16.331096
0.40776
8.403385
0.518368
46.485298
false
2024-10-01
2024-10-23
1
Jacoby746/Casual-Magnum-34B (Merge)
Jacoby746_Inf-Silent-Kunoichi-v0.1-2x7B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Inf-Silent-Kunoichi-v0.1-2x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B
9ab68beb6fe16cab2ab708b9af4417c89751d297
20.009948
apache-2.0
0
12
true
true
true
false
false
1.860053
0.387982
38.798167
0.518546
32.387004
0.060423
6.042296
0.28943
5.257271
0.428042
12.338542
0.327128
25.236407
false
2024-09-19
2024-09-20
1
Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B (Merge)
Jacoby746_Inf-Silent-Kunoichi-v0.2-2x7B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Inf-Silent-Kunoichi-v0.2-2x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B
711263c24f812676eb382a31a5f0fed9bd8c16e4
19.917523
apache-2.0
0
12
true
true
true
false
false
0.866265
0.363602
36.360191
0.520942
32.259184
0.056647
5.664653
0.300336
6.711409
0.431979
13.264062
0.327211
25.245641
false
2024-09-19
2024-09-21
1
Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B (Merge)
Jacoby746_Proto-Athena-4x7B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Athena-4x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Athena-4x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Athena-4x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Proto-Athena-4x7B
450fcba7a630fb61a662f71936d37979226fced8
19.649696
apache-2.0
0
24
true
true
true
false
false
1.676614
0.370296
37.029637
0.510655
30.870823
0.057402
5.740181
0.294463
5.928412
0.434771
13.813021
0.320645
24.516105
false
2024-09-21
2024-09-21
1
Jacoby746/Proto-Athena-4x7B (Merge)
Jacoby746_Proto-Athena-v0.2-4x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Athena-v0.2-4x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Athena-v0.2-4x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Athena-v0.2-4x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Proto-Athena-v0.2-4x7B
01feeded217ea83a8794e7968c8850859b5f0b14
19.143898
apache-2.0
0
24
true
true
true
false
false
1.651372
0.375242
37.524214
0.506773
30.340844
0.05136
5.135952
0.298658
6.487696
0.421281
10.960156
0.319731
24.414524
false
2024-09-21
2024-09-21
1
Jacoby746/Proto-Athena-v0.2-4x7B (Merge)
Jacoby746_Proto-Harpy-Blazing-Light-v0.1-2x7B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Harpy-Blazing-Light-v0.1-2x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B
bbb5d7c7a0c9e999e057ffa71eaa93d59d95b36b
22.292392
0
12
false
true
true
false
false
0.881841
0.490472
49.047195
0.518685
32.63253
0.063444
6.344411
0.295302
6.040268
0.444969
14.121094
0.33012
25.568853
false
2024-09-22
2024-09-30
1
Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B (Merge)
Jacoby746_Proto-Harpy-Spark-v0.1-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Harpy-Spark-v0.1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Harpy-Spark-v0.1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Harpy-Spark-v0.1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Proto-Harpy-Spark-v0.1-7B
984cca02cd930b2e1b7b2a7d53471d32d9821cdd
19.862588
apache-2.0
0
7
true
false
true
false
false
0.595805
0.433269
43.326928
0.473577
26.91311
0.062689
6.268882
0.305369
7.38255
0.431667
12.291667
0.306932
22.992391
false
2024-09-22
2024-09-30
1
Jacoby746/Proto-Harpy-Spark-v0.1-7B (Merge)
Jimmy19991222_Llama-3-Instruct-8B-SimPO-v0.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__Llama-3-Instruct-8B-SimPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2
53a517ceaef324efc3626be44140b4f18a010591
24.279948
0
8
false
true
true
false
true
0.513152
0.654037
65.403684
0.498371
29.123823
0.043051
4.305136
0.314597
8.612975
0.40125
8.389583
0.3686
29.844489
false
2024-09-06
0
Removed
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun
00c02a823b4ff1a6cfcded6085ba9630df633998
23.817704
llama3
0
8
true
true
true
false
true
0.481791
0.671722
67.172214
0.48798
27.755229
0.040785
4.07855
0.294463
5.928412
0.404073
8.709115
0.363364
29.262707
false
2024-09-17
2024-09-18
1
meta-llama/Meta-Llama-3-8B-Instruct
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log
99d9e31df5b7e88b1da78b1bd335cac3215dfd6e
23.75627
llama3
0
8
true
true
true
false
true
0.478535
0.655561
65.556058
0.493458
28.613597
0.033988
3.398792
0.30453
7.270694
0.40001
8.167969
0.365775
29.530511
false
2024-09-22
2024-09-22
1
meta-llama/Meta-Llama-3-8B-Instruct
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log
49a029ea2605d768e89b638ad78a59fd62d192ab
22.797979
llama3
0
8
true
true
true
false
true
0.522485
0.631506
63.150552
0.491641
27.666184
0.050604
5.060423
0.286074
4.809843
0.3935
7.0875
0.36112
29.013372
false
2024-09-22
2024-09-22
1
meta-llama/Meta-Llama-3-8B-Instruct
Jimmy19991222_llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4
de8bb28ad7a9d1158f318a4461dc47ad03e6e560
22.827312
0
8
false
true
true
false
true
0.480371
0.628458
62.845805
0.498609
29.329732
0.017372
1.73716
0.292785
5.704698
0.401375
9.071875
0.354471
28.274601
false
2024-09-06
0
Removed
Jimmy19991222_llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun
e9692d8dbe30273839763757aa9ef07a5fcf0c59
24.159026
llama3
0
8
true
true
true
false
true
1.009359
0.66775
66.775046
0.494046
28.390676
0.047583
4.758308
0.306208
7.494407
0.398708
8.005208
0.365775
29.530511
false
2024-09-14
2024-09-15
1
meta-llama/Meta-Llama-3-8B-Instruct
Jimmy19991222_llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log
9ff0ce408abb8dbcf7efb9b6533338f2c344a355
23.858383
llama3
0
8
true
true
true
false
true
0.501994
0.660506
66.050635
0.491601
28.075036
0.044562
4.456193
0.303691
7.158837
0.400042
7.805208
0.366439
29.604388
false
2024-09-22
2024-09-22
1
meta-llama/Meta-Llama-3-8B-Instruct
Jimmy19991222_llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jimmy19991222/llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jimmy19991222/llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jimmy19991222__llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jimmy19991222/llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log
ec67f95c4d1813a34bbde52d0ad14824fd7111a0
23.742269
llama3
0
8
true
true
true
false
true
0.486586
0.649191
64.919081
0.495249
28.562567
0.045317
4.531722
0.302013
6.935123
0.396135
7.383594
0.371094
30.121528
false
2024-09-22
2024-09-22
1
meta-llama/Meta-Llama-3-8B-Instruct
Joseph717171_Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Joseph717171__Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32
823930851c57b11fd2e25cd65b5c53f909209d0e
23.252877
llama3.1
1
8
true
false
true
false
true
0.707545
0.618541
61.854103
0.517745
30.724097
0.05136
5.135952
0.282718
4.362416
0.436938
13.617187
0.314412
23.823508
false
2024-10-23
2024-10-25
0
Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32
Joseph717171_Llama-3.1-SuperNova-8B-Lite_TIES_with_Base_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Joseph717171__Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base
f1e2cad4dca10f948fd2ee9588f80df0b40d7232
30.081383
llama3.1
8
8
true
false
true
false
true
0.874731
0.809633
80.963289
0.514742
31.465813
0.173716
17.371601
0.309564
7.941834
0.41099
10.740365
0.388049
32.005393
false
2024-10-02
2024-10-03
0
Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base
Josephgflowers_Cinder-Phi-2-V1-F16-gguf_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/Cinder-Phi-2-V1-F16-gguf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Cinder-Phi-2-V1-F16-gguf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Cinder-Phi-2-V1-F16-gguf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/Cinder-Phi-2-V1-F16-gguf
85629ec9b18efee31d07630664e7a3815121badf
10.855703
mit
4
2
true
true
true
false
true
0.471404
0.235657
23.565695
0.439662
22.453402
0
0
0.281879
4.250559
0.343458
1.965625
0.21609
12.898936
false
2024-02-25
2024-06-26
0
Josephgflowers/Cinder-Phi-2-V1-F16-gguf
Josephgflowers_Differential-Attention-Liquid-Metal-Tinyllama_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Differential-Attention-Liquid-Metal-Tinyllama-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama
bdb6c63ff1025241e8e10b1858d67dc410f0a702
4.709671
mit
0
1
true
true
true
false
true
0.173794
0.222692
22.269246
0.292556
2.552224
0
0
0.250839
0.111857
0.335552
0.94401
0.121426
2.380689
false
2024-11-05
2024-11-07
0
Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama
Josephgflowers_TinyLlama-Cinder-Agent-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama-Cinder-Agent-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama-Cinder-Agent-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama-Cinder-Agent-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/TinyLlama-Cinder-Agent-v1
a9cd8b48bfe30f29bb1f819213da9a4c41eee67f
5.816564
mit
1
1
true
true
true
false
true
0.237832
0.266956
26.695612
0.311604
3.804167
0.003776
0.377644
0.244128
0
0.339458
2.232292
0.116107
1.789672
false
2024-05-21
2024-06-26
4
Josephgflowers/TinyLlama-3T-Cinder-v1.2
Josephgflowers_TinyLlama-v1.1-Cinders-World_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama-v1.1-Cinders-World" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama-v1.1-Cinders-World</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama-v1.1-Cinders-World-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/TinyLlama-v1.1-Cinders-World
11a2c305f787a7908dd87c4e5a7d0f1e314a1f05
5.129125
mit
0
1
true
true
true
false
true
0.257383
0.246923
24.692261
0.299797
3.107714
0.001511
0.151057
0.244128
0
0.335615
0.61849
0.119847
2.20523
false
2024-10-12
2024-10-13
0
Josephgflowers/TinyLlama-v1.1-Cinders-World
Josephgflowers_TinyLlama_v1.1_math_code-world-test-1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama_v1.1_math_code-world-test-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama_v1.1_math_code-world-test-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama_v1.1_math_code-world-test-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/TinyLlama_v1.1_math_code-world-test-1
6f7c2aaf0b8723bc6a1dc23a4a1ff0ec24dc11ec
1.839166
mit
0
1
true
true
true
false
false
0.272944
0.007844
0.784363
0.314635
4.164017
0.009819
0.981873
0.23406
0
0.349906
3.638281
0.113198
1.46646
false
2024-06-23
2024-09-09
0
Josephgflowers/TinyLlama_v1.1_math_code-world-test-1
KSU-HW-SEC_Llama3-70b-SVA-FT-1415_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-1415" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-1415</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-1415-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3-70b-SVA-FT-1415
1c09728455567898116d2d9cfb6cbbbbd4ee730c
36.119233
0
70
false
true
true
false
false
9.601029
0.617991
61.799137
0.665015
51.328741
0.219789
21.978852
0.375
16.666667
0.456542
17.801042
0.524269
47.140957
false
2024-09-08
0
Removed
KSU-HW-SEC_Llama3-70b-SVA-FT-500_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-500" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-500</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-500-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3-70b-SVA-FT-500
856a23f28aeada23d1135c86a37e05524307e8ed
35.953712
0
70
false
true
true
false
false
9.473738
0.610522
61.05223
0.669224
51.887026
0.213746
21.374622
0.380872
17.449664
0.451146
16.993229
0.522689
46.965499
false
2024-09-08
0
Removed
KSU-HW-SEC_Llama3-70b-SVA-FT-final_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-final" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-final</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-final-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3-70b-SVA-FT-final
391bbd94173b34975d1aa2c7356977a630253b75
36.093837
0
70
false
true
true
false
false
9.656199
0.616468
61.646764
0.665015
51.328741
0.219789
21.978852
0.375
16.666667
0.456542
17.801042
0.524269
47.140957
false
2024-09-08
0
Removed
KSU-HW-SEC_Llama3.1-70b-SVA-FT-1000step_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3.1-70b-SVA-FT-1000step-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step
b195fea0d8f350ff29243d4e88654b1baa5af79e
40.750259
0
70
false
true
true
false
false
12.554447
0.723804
72.380395
0.690312
55.485365
0.320997
32.099698
0.395973
19.463087
0.459177
17.830469
0.525183
47.242538
false
2024-09-08
0
Removed
Kimargin_GPT-NEO-1.3B-wiki_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoForCausalLM
<a target="_blank" href="https://huggingface.co/Kimargin/GPT-NEO-1.3B-wiki" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kimargin/GPT-NEO-1.3B-wiki</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kimargin__GPT-NEO-1.3B-wiki-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kimargin/GPT-NEO-1.3B-wiki
92fa51fa6589f6e8fdfcc83f085216b3dae11da5
5.248478
apache-2.0
1
1
true
true
true
false
false
0.832745
0.192068
19.206816
0.302634
3.423612
0.008308
0.830816
0.244966
0
0.38826
6.932552
0.109874
1.097074
false
2024-10-23
2024-10-24
1
Kimargin/GPT-NEO-1.3B-wiki (Merge)
KingNish_Qwen2.5-0.5b-Test-ft_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/Qwen2.5-0.5b-Test-ft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/Qwen2.5-0.5b-Test-ft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__Qwen2.5-0.5b-Test-ft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/Qwen2.5-0.5b-Test-ft
f905bb1d37c7853fb5c7157d8d3ad0f062b65c0f
7.475184
apache-2.0
4
0
true
true
true
false
false
0.66869
0.267081
26.708134
0.323153
6.058845
0.012085
1.208459
0.263423
1.789709
0.342125
1.432292
0.168883
7.653664
false
2024-09-26
2024-09-29
1
KingNish/Qwen2.5-0.5b-Test-ft (Merge)
KingNish_Reasoning-Llama-3b-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/Reasoning-Llama-3b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/Reasoning-Llama-3b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__Reasoning-Llama-3b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/Reasoning-Llama-3b-v0.1
d164caf591c42a4cbc3b21d46493e72fbdbd9de8
19.859912
llama3.2
9
3
true
true
true
false
true
0.675235
0.622463
62.246284
0.434336
19.862451
0.108761
10.876133
0.259228
1.230425
0.31676
2.395052
0.302942
22.549128
false
2024-10-10
2024-10-26
1
meta-llama/Llama-3.2-3B-Instruct
Kquant03_CognitiveFusion2-4x7B-BF16_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Kquant03/CognitiveFusion2-4x7B-BF16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kquant03/CognitiveFusion2-4x7B-BF16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kquant03__CognitiveFusion2-4x7B-BF16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kquant03/CognitiveFusion2-4x7B-BF16
db45b86c462bb93db7ba4f2c3fe3517582c859a1
15.591291
apache-2.0
3
24
true
false
false
false
true
1.666035
0.356657
35.6657
0.410783
17.689003
0.055136
5.513595
0.286074
4.809843
0.414552
9.952344
0.279255
19.917258
false
2024-04-06
2024-07-31
0
Kquant03/CognitiveFusion2-4x7B-BF16
Kquant03_L3-Pneuma-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Kquant03/L3-Pneuma-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kquant03/L3-Pneuma-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kquant03__L3-Pneuma-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kquant03/L3-Pneuma-8B
257aa8d00e82f91b7a780384aa76573c2ea614a8
16.642746
llama3
1
8
true
true
true
false
false
0.803811
0.237406
23.740564
0.495504
28.820202
0.052115
5.21148
0.307047
7.606264
0.417156
10.211198
0.318401
24.26677
false
2024-10-13
2024-10-16
1
meta-llama/Meta-Llama-3-8B
Kukedlc_NeuralExperiment-7b-MagicCoder-v7.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralExperiment-7b-MagicCoder-v7.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5
43ea8d27d652dc15e4d27f665c5d636a5937780b
18.031181
apache-2.0
6
7
true
true
true
false
true
0.45486
0.455251
45.525096
0.398845
16.386034
0.067976
6.797583
0.296141
6.152125
0.428198
13.058073
0.282414
20.268174
false
2024-03-07
2024-07-30
0
Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5
Kukedlc_NeuralLLaMa-3-8b-DT-v0.1_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralLLaMa-3-8b-DT-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralLLaMa-3-8b-DT-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralLLaMa-3-8b-DT-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralLLaMa-3-8b-DT-v0.1
1fe849c1e7e4793c2fdd869fcfb51e0d1910674f
21.259599
other
1
8
true
false
true
false
false
0.852964
0.437141
43.714123
0.498677
28.008308
0.080816
8.081571
0.302852
7.04698
0.407115
9.689323
0.379156
31.017287
false
2024-05-11
2024-09-17
1
Kukedlc/NeuralLLaMa-3-8b-DT-v0.1 (Merge)
Kukedlc_NeuralLLaMa-3-8b-ORPO-v0.3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralLLaMa-3-8b-ORPO-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3
aa176c0db7791a1c09039135791145b0704a5f46
17.597684
apache-2.0
1
8
true
true
true
false
true
0.91551
0.527591
52.759124
0.455714
22.391712
0.039275
3.927492
0.239094
0
0.370031
3.653906
0.305685
22.853871
false
2024-05-14
2024-07-28
1
Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3 (Merge)
Kukedlc_NeuralSynthesis-7B-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralSynthesis-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralSynthesis-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralSynthesis-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralSynthesis-7B-v0.1
547a5dc8963e127a9638256bb80eb3a36da1cc5d
19.977912
apache-2.0
3
7
true
false
true
false
false
0.596299
0.418456
41.845636
0.514475
31.834395
0.061178
6.117825
0.28104
4.138702
0.433281
13.160156
0.304937
22.770759
false
2024-04-06
2024-06-29
0
Kukedlc/NeuralSynthesis-7B-v0.1
Kukedlc_NeuralSynthesis-7B-v0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralSynthesis-7B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralSynthesis-7B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralSynthesis-7B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralSynthesis-7B-v0.3
090fab29146f8e55066bce2f5f5859ab2d6027f4
20.082685
apache-2.0
0
7
true
false
true
false
false
0.583572
0.40784
40.784009
0.513808
31.811748
0.077039
7.703927
0.280201
4.026846
0.434583
13.389583
0.30502
22.779994
false
2024-04-07
2024-07-31
0
Kukedlc/NeuralSynthesis-7B-v0.3
Kukedlc_NeuralSynthesis-7b-v0.4-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralSynthesis-7b-v0.4-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralSynthesis-7b-v0.4-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralSynthesis-7b-v0.4-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralSynthesis-7b-v0.4-slerp
bb3bd36fce162f472668dbd91960cd1525b45f30
19.543101
apache-2.0
0
7
true
false
true
false
false
0.599064
0.394726
39.472599
0.514293
31.997187
0.063444
6.344411
0.277685
3.691275
0.43325
13.05625
0.304272
22.696882
false
2024-04-12
2024-07-31
1
Kukedlc/NeuralSynthesis-7b-v0.4-slerp (Merge)
Kumar955_Hemanth-llm_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kumar955/Hemanth-llm" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kumar955/Hemanth-llm</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kumar955__Hemanth-llm-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kumar955/Hemanth-llm
871325cc04f57cd953c161a0ace49c47af8eca4c
22.143018
0
7
false
true
true
false
false
1.990378
0.50451
50.451026
0.522495
33.044262
0.070242
7.024169
0.282718
4.362416
0.448563
14.503646
0.311253
23.472592
false
2024-09-24
2024-09-24
1
Kumar955/Hemanth-llm (Merge)
L-RAGE_3_PRYMMAL-ECE-7B-SLERP-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/L-RAGE/3_PRYMMAL-ECE-7B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">L-RAGE/3_PRYMMAL-ECE-7B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/L-RAGE__3_PRYMMAL-ECE-7B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
L-RAGE/3_PRYMMAL-ECE-7B-SLERP-V1
483902db68f99affe1d7f1139755dfd115abbca5
14.476674
apache-2.0
0
1
true
false
true
false
false
0.589778
0.274226
27.422572
0.422794
19.083009
0.085347
8.534743
0.281879
4.250559
0.384135
6.183594
0.29247
21.385564
false
2024-10-29
2024-10-29
1
L-RAGE/3_PRYMMAL-ECE-7B-SLERP-V1 (Merge)
LEESM_llama-2-7b-hf-lora-oki100p_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LEESM/llama-2-7b-hf-lora-oki100p" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LEESM/llama-2-7b-hf-lora-oki100p</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LEESM__llama-2-7b-hf-lora-oki100p-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LEESM/llama-2-7b-hf-lora-oki100p
4bfd99888bf37e23d966f1e537fe199992c27a72
8.70733
mit
2
6
true
true
true
false
false
0.483818
0.251294
25.129434
0.349168
10.265743
0.012085
1.208459
0.269295
2.572707
0.368729
3.557813
0.185588
9.509826
false
2024-07-17
2024-11-08
0
LEESM/llama-2-7b-hf-lora-oki100p
LEESM_llama-2-7b-hf-lora-oki10p_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LEESM/llama-2-7b-hf-lora-oki10p" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LEESM/llama-2-7b-hf-lora-oki10p</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LEESM__llama-2-7b-hf-lora-oki10p-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LEESM/llama-2-7b-hf-lora-oki10p
d6e5af01616a038ac2b5cb83f458e490e1102244
7.090032
mit
0
6
true
true
true
false
false
0.983653
0.22609
22.609011
0.353093
9.438287
0.01284
1.283988
0.254195
0.559284
0.347521
1.106771
0.167886
7.542849
false
2024-04-03
2024-11-08
0
LEESM/llama-2-7b-hf-lora-oki10p
LEESM_llama-3-8b-bnb-4b-kowiki231101_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LEESM/llama-3-8b-bnb-4b-kowiki231101" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LEESM/llama-3-8b-bnb-4b-kowiki231101</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LEESM__llama-3-8b-bnb-4b-kowiki231101-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LEESM/llama-3-8b-bnb-4b-kowiki231101
63b8f715daab6a0c7196a20855be8e85fe7ddcb4
9.271088
apache-2.0
0
8
true
true
true
false
false
0.756888
0.168487
16.848739
0.413081
16.934868
0.001511
0.151057
0.270973
2.796421
0.355146
3.059896
0.24252
15.83555
false
2024-11-08
2024-11-08
2
meta-llama/Meta-Llama-3.1-8B
LEESM_llama-3-Korean-Bllossom-8B-trexlab-oki10p_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LEESM/llama-3-Korean-Bllossom-8B-trexlab-oki10p" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LEESM/llama-3-Korean-Bllossom-8B-trexlab-oki10p</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LEESM__llama-3-Korean-Bllossom-8B-trexlab-oki10p-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LEESM/llama-3-Korean-Bllossom-8B-trexlab-oki10p
d105e0365510f9e5f8550558343083cab8523524
12.943198
mit
0
8
true
true
true
false
false
0.758358
0.213725
21.372514
0.434301
19.797436
0.01284
1.283988
0.275168
3.355705
0.386927
7.665885
0.317653
24.183658
false
2024-07-22
2024-11-08
0
LEESM/llama-3-Korean-Bllossom-8B-trexlab-oki10p
LGAI-EXAONE_EXAONE-3.0-7.8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
ExaoneForCausalLM
<a target="_blank" href="https://huggingface.co/LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LGAI-EXAONE__EXAONE-3.0-7.8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct
7f15baedd46858153d817445aff032f4d6cf4939
21.403463
other
377
7
true
true
true
false
true
0.825128
0.719283
71.928261
0.417443
17.977335
0.044562
4.456193
0.26594
2.12528
0.366125
3.298958
0.357713
28.634752
false
2024-07-31
2024-08-18
0
LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct
LLM360_K2_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LLM360/K2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLM360/K2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LLM360__K2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LLM360/K2
49d159b6f2b64d562e745f0ff06e65b9a4c28ead
14.568225
apache-2.0
80
65
true
true
true
false
false
8.838206
0.225216
22.521576
0.497184
28.220403
0.022659
2.265861
0.276846
3.579418
0.398
8.55
0.300449
22.272089
true
2024-04-17
2024-06-26
0
LLM360/K2
LLM360_K2-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LLM360/K2-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLM360/K2-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LLM360__K2-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LLM360/K2-Chat
5454f2d28031c9127e4227c873ca2f154e02e4c7
22.939512
apache-2.0
33
65
true
true
true
false
true
17.259828
0.515176
51.51764
0.53581
33.793829
0.016616
1.661631
0.306208
7.494407
0.457
16.825
0.337101
26.344563
true
2024-05-22
2024-06-12
0
LLM360/K2-Chat
LLM4Binary_llm4decompile-1.3b-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LLM4Binary/llm4decompile-1.3b-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLM4Binary/llm4decompile-1.3b-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LLM4Binary__llm4decompile-1.3b-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LLM4Binary/llm4decompile-1.3b-v2
a347dabcb1ea9f21c9339bd764c150262e993b95
6.850908
mit
6
1
true
true
true
false
false
0.247583
0.226789
22.678936
0.327181
5.915475
0.007553
0.755287
0.235738
0
0.407177
9.430469
0.120928
2.325281
false
2024-06-18
2024-11-16
0
LLM4Binary/llm4decompile-1.3b-v2
Lambent_qwen2.5-reinstruct-alternate-lumen-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lambent/qwen2.5-reinstruct-alternate-lumen-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lambent/qwen2.5-reinstruct-alternate-lumen-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lambent__qwen2.5-reinstruct-alternate-lumen-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lambent/qwen2.5-reinstruct-alternate-lumen-14B
dac3be334098338fb6c02636349e8ed53f18c4a4
33.966892
3
14
false
true
true
false
false
2.264651
0.479381
47.938137
0.645899
48.989609
0.216012
21.601208
0.376678
16.89038
0.477
19.625
0.538813
48.757018
false
2024-09-23
2024-09-28
1
Lambent/qwen2.5-reinstruct-alternate-lumen-14B (Merge)
Langboat_Mengzi3-8B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Langboat/Mengzi3-8B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Langboat/Mengzi3-8B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Langboat__Mengzi3-8B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Langboat/Mengzi3-8B-Chat
128fffd3dac7c6067ca4d1a650e836e3ef46c013
19.822532
apache-2.0
1
8
true
true
true
false
true
0.851936
0.513977
51.397736
0.468373
25.188298
0.062689
6.268882
0.274329
3.243848
0.407792
9.040625
0.314162
23.795804
false
2024-09-14
2024-10-21
0
Langboat/Mengzi3-8B-Chat
LenguajeNaturalAI_leniachat-gemma-2b-v0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/LenguajeNaturalAI/leniachat-gemma-2b-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LenguajeNaturalAI/leniachat-gemma-2b-v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LenguajeNaturalAI__leniachat-gemma-2b-v0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LenguajeNaturalAI/leniachat-gemma-2b-v0
e5691dcc682a10dc9ef4bdbb3dc896fcf271018e
5.598772
apache-2.0
12
2
true
true
true
false
true
0.966578
0.214974
21.497405
0.307402
4.138297
0.003021
0.302115
0.26594
2.12528
0.365906
3.638281
0.117021
1.891253
false
2024-04-09
2024-09-01
1
google/gemma-2b
LenguajeNaturalAI_leniachat-qwen2-1.5B-v0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/LenguajeNaturalAI/leniachat-qwen2-1.5B-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LenguajeNaturalAI/leniachat-qwen2-1.5B-v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LenguajeNaturalAI__leniachat-qwen2-1.5B-v0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LenguajeNaturalAI/leniachat-qwen2-1.5B-v0
031a2efebb3cc1150e46f42ba0bea9fa7b855436
8.543039
apache-2.0
19
1
true
true
true
false
true
0.844562
0.222118
22.211842
0.368356
12.771666
0.010574
1.057402
0.261745
1.565996
0.37499
3.873698
0.187999
9.77763
false
2024-06-16
2024-09-30
1
Qwen/Qwen2-1.5B
LeroyDyer_LCARS_AI_001_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/LCARS_AI_001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/LCARS_AI_001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__LCARS_AI_001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/LCARS_AI_001
3452e84fbfd92c62085fdce3834eff5c9cd87d4f
14.416618
0
7
false
true
true
false
false
0.581992
0.310945
31.094496
0.425789
19.460967
0.022659
2.265861
0.263423
1.789709
0.438365
13.328906
0.267038
18.559767
false
2024-08-06
0
Removed
LeroyDyer_LCARS_AI_1x4_003_SuperAI_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/LCARS_AI_1x4_003_SuperAI" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/LCARS_AI_1x4_003_SuperAI</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__LCARS_AI_1x4_003_SuperAI-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/LCARS_AI_1x4_003_SuperAI
917c84d241bfff8b8648d9d865ae4b5bead68c6b
19.467877
apache-2.0
2
24
true
true
true
false
false
1.657259
0.411113
41.111251
0.491985
28.423431
0.054381
5.438066
0.282718
4.362416
0.450615
15.560156
0.297207
21.911939
false
2024-04-03
2024-08-07
1
LeroyDyer/LCARS_AI_1x4_003_SuperAI (Merge)
LeroyDyer_LCARS_AI_StarTrek_Computer_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/LCARS_AI_StarTrek_Computer" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/LCARS_AI_StarTrek_Computer</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__LCARS_AI_StarTrek_Computer-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/LCARS_AI_StarTrek_Computer
9d4af4ab13df574ad0d40ed71de7d43c17f59a94
14.576129
mit
3
7
true
true
true
false
false
0.661931
0.358256
35.825609
0.444619
21.781003
0.03852
3.851964
0.267617
2.348993
0.395021
7.444271
0.245844
16.204935
false
2024-05-11
2024-08-07
0
LeroyDyer/LCARS_AI_StarTrek_Computer
LeroyDyer_LCARS_TOP_SCORE_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/LCARS_TOP_SCORE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/LCARS_TOP_SCORE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__LCARS_TOP_SCORE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/LCARS_TOP_SCORE
ada3e3ac6ae162503da5158e72851053f4c7dac8
20.322005
openrail
2
7
true
true
true
false
false
0.613318
0.437066
43.706587
0.512737
31.699127
0.067221
6.722054
0.286074
4.809843
0.429281
12.426823
0.303108
22.567598
false
2024-03-30
2024-08-08
1
LeroyDyer/LCARS_TOP_SCORE (Merge)
LeroyDyer_Mixtral_AI_SwahiliTron_7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/Mixtral_AI_SwahiliTron_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/Mixtral_AI_SwahiliTron_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__Mixtral_AI_SwahiliTron_7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/Mixtral_AI_SwahiliTron_7b
fd997ccdee03788e7e79944d26d9c641dc4fcd4c
4.270545
mit
3
7
true
true
false
false
true
0.698496
0.1534
15.339965
0.305509
3.211683
0.008308
0.830816
0.265101
2.013423
0.342031
1.920573
0.120761
2.306811
false
2024-04-10
2024-07-12
0
LeroyDyer/Mixtral_AI_SwahiliTron_7b
LeroyDyer_SpydazWebAI_Human_AGI_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWebAI_Human_AGI" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWebAI_Human_AGI</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWebAI_Human_AGI-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWebAI_Human_AGI
0bc02d34a0b49c3473505d8df757de211af37131
9.907409
apache-2.0
2
7
true
false
true
false
false
0.667715
0.338822
33.88221
0.337486
7.445696
0.010574
1.057402
0.282718
4.362416
0.396635
7.379427
0.147856
5.317302
false
2024-10-16
2024-10-16
1
LeroyDyer/SpydazWebAI_Human_AGI (Merge)
LeroyDyer_SpydazWebAI_Human_AGI_001_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWebAI_Human_AGI_001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWebAI_Human_AGI_001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWebAI_Human_AGI_001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWebAI_Human_AGI_001
4ed76e404deb425d5c934cdbbb4b99b4c1017433
10.120978
apache-2.0
1
7
true
false
true
false
false
0.442407
0.311819
31.181931
0.343342
8.6612
0.01435
1.435045
0.298658
6.487696
0.399396
8.224479
0.14262
4.73552
false
2024-10-17
2024-11-18
1
LeroyDyer/SpydazWebAI_Human_AGI_001 (Merge)
LeroyDyer_SpydazWeb_AI_CyberTron_Ultra_7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWeb_AI_CyberTron_Ultra_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWeb_AI_CyberTron_Ultra_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWeb_AI_CyberTron_Ultra_7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWeb_AI_CyberTron_Ultra_7b
50c69e539578ab5384eb018a60cc1268637becae
13.478559
apache-2.0
4
7
true
true
true
false
false
0.656546
0.155573
15.557277
0.481077
27.745532
0.008308
0.830816
0.292785
5.704698
0.413625
10.303125
0.286569
20.729905
false
2024-04-14
2024-07-12
1
LeroyDyer/Mixtral_AI_CyberTron_Ultra
LeroyDyer_SpydazWeb_AI_HumanAI_001_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWeb_AI_HumanAI_001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWeb_AI_HumanAI_001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWeb_AI_HumanAI_001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWeb_AI_HumanAI_001
7d664b94eb7c50bd0314ee74b7ac564c55efa878
7.678967
0
7
false
true
true
false
false
0.644177
0.225166
22.516589
0.334404
8.065262
0.01284
1.283988
0.288591
5.145414
0.386031
6.053906
0.127078
3.008644
false
2024-10-17
0
Removed
LeroyDyer_SpydazWeb_AI_HumanAI_006_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWeb_AI_HumanAI_006" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWeb_AI_HumanAI_006</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWeb_AI_HumanAI_006-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWeb_AI_HumanAI_006
c3ef6d31d58344f6d67825769a304b9ac5e702ca
4.758101
apache-2.0
1
7
true
true
true
false
false
0.448911
0.143008
14.300833
0.33018
6.72532
0.002266
0.226586
0.280201
4.026846
0.356792
1.765625
0.113531
1.503398
false
2024-11-01
2024-11-18
2
LeroyDyer/SpydazWeb_AI_HumanAI_005 (Merge)
LeroyDyer_SpydazWeb_AI_HumanAI_007_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWeb_AI_HumanAI_007" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWeb_AI_HumanAI_007</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWeb_AI_HumanAI_007-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWeb_AI_HumanAI_007
38d8c760a50e09cc877497275701de207ed54953
10.297736
apache-2.0
0
7
true
false
true
false
false
0.448741
0.335175
33.517511
0.341567
8.462819
0.015106
1.510574
0.288591
5.145414
0.409625
9.236458
0.135223
3.913638
false
2024-11-01
2024-11-18
1
LeroyDyer/SpydazWeb_AI_HumanAI_007 (Merge)
LeroyDyer_SpydazWeb_AI_HumanAI_RP_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWeb_AI_HumanAI_RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWeb_AI_HumanAI_RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWeb_AI_HumanAI_RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWeb_AI_HumanAI_RP
0569cca30df948b9f2e5145ce5c2b5a03ec025ae
7.668943
apache-2.0
1
7
true
false
true
false
false
0.443709
0.254117
25.411685
0.332302
7.176495
0.006042
0.60423
0.275168
3.355705
0.38826
5.865885
0.132397
3.59966
false
2024-10-20
2024-11-18
1
LeroyDyer/SpydazWeb_AI_HumanAI_RP (Merge)
LeroyDyer_SpydazWeb_AI_HumanAI_TextVision_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWeb_AI_HumanAI_TextVision" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWeb_AI_HumanAI_TextVision</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWeb_AI_HumanAI_TextVision-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWeb_AI_HumanAI_TextVision
ba0dcf52fec492cc5d91b3297c08c5581d893607
9.345527
apache-2.0
1
7
true
true
true
false
false
0.455715
0.306274
30.627402
0.335366
7.525593
0.005287
0.528701
0.291946
5.592841
0.393844
7.497135
0.138713
4.301492
false
2024-10-25
2024-11-18
2
LeroyDyer/SpydazWeb_AI_HumanAI_RP (Merge)
LeroyDyer_SpydazWeb_HumanAI_M1_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWeb_HumanAI_M1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWeb_HumanAI_M1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWeb_HumanAI_M1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWeb_HumanAI_M1
c9bb5fdc262f9c68d02b798eb867495199bf3dbf
10.391053
0
7
false
true
true
false
false
0.66089
0.358206
35.820623
0.356327
10.027543
0.024924
2.492447
0.267617
2.348993
0.367115
4.289323
0.166307
7.367391
false
2024-10-16
0
Removed
LeroyDyer_SpydazWeb_HumanAI_M2_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWeb_HumanAI_M2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWeb_HumanAI_M2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWeb_HumanAI_M2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWeb_HumanAI_M2
82fd99df73eeaf8ce12add6e74fda7901c75f86c
12.44635
0
7
false
true
true
false
false
0.63106
0.375017
37.501718
0.393088
15.397194
0.026435
2.643505
0.279362
3.914989
0.375146
3.993229
0.201047
11.227467
false
2024-10-16
0
Removed
LeroyDyer_SpydazWeb_HumanAI_M3_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/SpydazWeb_HumanAI_M3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/SpydazWeb_HumanAI_M3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__SpydazWeb_HumanAI_M3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/SpydazWeb_HumanAI_M3
01dbeb9536ad2cba5a3c4fbeef77e6b3f692adc5
5.405096
0
7
false
true
true
false
false
0.687549
0.157871
15.787112
0.312726
4.765389
0.003021
0.302115
0.270973
2.796421
0.391427
7.128385
0.11486
1.651152
false
2024-10-16
0
Removed