Dataset schema (column: dtype, observed range or class count):

eval_name: string, length 12 to 111
Precision: string, 3 classes
Type: string, 6 classes
T: string, 6 classes
Weight type: string, 2 classes
Architecture: string, 47 classes
Model: string, length 355 to 650
fullname: string, length 4 to 102
Model sha: string, length 0 to 40
Average ⬆️: float64, 1.41 to 51.2
Hub License: string, 24 classes
Hub ❤️: int64, 0 to 5.81k
#Params (B): int64, -1 to 140
Available on the hub: bool, 2 classes
Not_Merged: bool, 2 classes
MoE: bool, 2 classes
Flagged: bool, 1 class
Chat Template: bool, 2 classes
CO₂ Emissions for Evaluation (kg): float64, 0.04 to 107
IFEval Raw: float64, 0 to 0.87
IFEval: float64, 0 to 86.7
BBH Raw: float64, 0.28 to 0.75
BBH: float64, 0.81 to 62.8
MATH Lvl 5 Raw: float64, 0 to 0.51
MATH Lvl 5: float64, 0 to 50.7
GPQA Raw: float64, 0.22 to 0.41
GPQA: float64, 0 to 21.6
MUSR Raw: float64, 0.29 to 0.59
MUSR: float64, 0 to 36.4
MMLU-PRO Raw: float64, 0.1 to 0.7
MMLU-PRO: float64, 0 to 66.8
Maintainer's Highlight: bool, 2 classes
Upload To Hub Date: string, length 0 to 10
Submission Date: string, 138 classes
Generation: int64, 0 to 6
Base Model: string, length 4 to 102
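Each record below is one row with these columns (null fields are simply omitted from the dump). As a minimal sketch of how the rows can be handled once parsed, the snippet below builds two sample rows from records that appear in this dump, keeping only a few of the schema's columns and simplifying the "Average ⬆️" key to "Average"; the filter-and-rank step mirrors what a leaderboard view typically does, and is an illustration, not part of the dataset itself.

```python
# Two rows copied from records in this dump, reduced to a few schema columns.
# The key "Average" stands in for the schema's "Average ⬆️" column.
rows = [
    {"fullname": "djuna/L3.1-Suze-Vume-calc", "Precision": "bfloat16",
     "#Params (B)": 8, "Average": 25.975608},
    {"fullname": "dnhkng/RYS-XLarge", "Precision": "bfloat16",
     "#Params (B)": 77, "Average": 45.131222},
]

# Keep models at or under 10B parameters, then rank by the Average score.
small = [r for r in rows if r["#Params (B)"] <= 10]
ranked = sorted(small, key=lambda r: r["Average"], reverse=True)
print(ranked[0]["fullname"])  # → djuna/L3.1-Suze-Vume-calc
```

The same filter-and-sort pattern extends to any of the schema's numeric columns (for example ranking by "MMLU-PRO" or excluding rows where "Flagged" is true).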
djuna_L3.1-Suze-Vume-calc_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/L3.1-Suze-Vume-calc" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/L3.1-Suze-Vume-calc</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__L3.1-Suze-Vume-calc-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/L3.1-Suze-Vume-calc
830c07d136ecd8171805078606f00c4ee69f21c3
25.975608
1
8
false
true
true
false
true
0.804519
0.729674
72.967393
0.516421
31.136638
0.112538
11.253776
0.281879
4.250559
0.384292
8.303125
0.351479
27.942154
false
2024-08-26
2024-09-04
1
djuna/L3.1-Suze-Vume-calc (Merge)
djuna_MN-Chinofun_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/MN-Chinofun" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/MN-Chinofun</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__MN-Chinofun-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/MN-Chinofun
71b47c86f32e107b407fada44ec6b893c5eb8bb0
24.369131
3
12
false
true
true
false
true
1.446493
0.611022
61.102209
0.49527
28.483575
0.111782
11.178248
0.296141
6.152125
0.408354
10.377604
0.360289
28.921025
false
2024-09-16
2024-09-23
1
djuna/MN-Chinofun (Merge)
djuna-test-lab_TEST-L3.2-ReWish-3B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna-test-lab/TEST-L3.2-ReWish-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna-test-lab/TEST-L3.2-ReWish-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna-test-lab__TEST-L3.2-ReWish-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna-test-lab/TEST-L3.2-ReWish-3B
0cb7d434c4647faed475f17d74e9047007cd3782
22.445512
1
3
false
true
true
false
true
0.640631
0.636776
63.677598
0.449541
22.0667
0.129154
12.915408
0.283557
4.474273
0.37775
7.91875
0.312583
23.620346
false
2024-10-23
2024-10-24
1
djuna-test-lab/TEST-L3.2-ReWish-3B (Merge)
djuna-test-lab_TEST-L3.2-ReWish-3B-ties-w-base_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna-test-lab__TEST-L3.2-ReWish-3B-ties-w-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base
ebab6c0266ae7846b2bb9a595a2651a23b031372
22.420117
0
3
false
true
true
false
true
1.281374
0.635252
63.525224
0.449541
22.0667
0.129154
12.915408
0.283557
4.474273
0.37775
7.91875
0.312583
23.620346
false
2024-10-23
2024-10-23
1
djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base (Merge)
dnhkng_RYS-Medium_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Medium
de09a79e6b2efdcc97490a37b770764e62749fd0
25.944227
mit
3
18
true
true
true
false
false
2.136378
0.440613
44.061313
0.628473
47.734201
0.077795
7.779456
0.32802
10.402685
0.406927
8.732552
0.432596
36.955157
false
2024-07-17
2024-07-17
0
dnhkng/RYS-Medium
dnhkng_RYS-Llama-3-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3-8B-Instruct
293ab00d1e2be2752f97d5568fde2b09f6a1caae
21.910187
mit
1
8
true
true
true
false
true
0.805187
0.695777
69.57772
0.480871
25.373015
0.067976
6.797583
0.25755
1.006711
0.338344
0.292969
0.355718
28.413121
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Llama-3-8B-Instruct
dnhkng_RYS-Llama-3-Huge-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3-Huge-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3-Huge-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3-Huge-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3-Huge-Instruct
cfe14a5339e88a7a89f075d9d48215d45f64acaf
34.68177
mit
1
99
true
true
true
false
true
14.736988
0.768592
76.859178
0.648087
49.073721
0.231118
23.111782
0.260906
1.454139
0.42076
11.928385
0.510971
45.663416
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Llama-3-Huge-Instruct
dnhkng_RYS-Llama-3-Large-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3-Large-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3-Large-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3-Large-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3-Large-Instruct
01e3208aaf7bf6d2b09737960c701ec6628977fe
36.094509
mit
1
73
true
true
true
false
true
9.811517
0.805062
80.506168
0.652527
49.665539
0.23716
23.716012
0.28943
5.257271
0.418031
11.453906
0.513713
45.968159
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Llama-3-Large-Instruct
dnhkng_RYS-Llama-3.1-8B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3.1-8B-Instruct
d4e2393403dcae19860da7c29519c8fe6fbf2fad
26.650662
mit
10
8
true
true
true
false
true
0.971672
0.768492
76.849205
0.516365
31.085445
0.126133
12.613293
0.267617
2.348993
0.368104
7.679688
0.363946
29.327349
false
2024-08-08
2024-08-30
0
dnhkng/RYS-Llama-3.1-8B-Instruct
dnhkng_RYS-Llama3.1-Large_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama3.1-Large" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama3.1-Large</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama3.1-Large-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama3.1-Large
52cc979de78155b33689efa48f52a8aab184bd86
41.937416
mit
1
81
true
true
true
false
true
15.406329
0.8492
84.920012
0.689911
55.414864
0.304381
30.438066
0.374161
16.55481
0.455396
17.091146
0.52485
47.2056
false
2024-08-11
2024-08-22
0
dnhkng/RYS-Llama3.1-Large
dnhkng_RYS-Phi-3-medium-4k-instruct_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Phi-3-medium-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Phi-3-medium-4k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Phi-3-medium-4k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Phi-3-medium-4k-instruct
1009e916b1ff8c9a53bc9d8ff48bea2a15ccde26
28.464284
mit
1
17
true
true
true
false
false
2.310547
0.439139
43.913926
0.622631
46.748971
0.123112
12.311178
0.354866
13.982103
0.425281
11.09349
0.484624
42.736037
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Phi-3-medium-4k-instruct
dnhkng_RYS-XLarge_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-XLarge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-XLarge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-XLarge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-XLarge
0f84dd9dde60f383e1e2821496befb4ce9a11ef6
45.131222
mit
73
77
true
true
true
false
false
13.576083
0.799566
79.956626
0.705003
58.773567
0.412387
41.238671
0.384228
17.897092
0.496969
23.721094
0.542803
49.200281
false
2024-07-24
2024-08-07
0
dnhkng/RYS-XLarge
dnhkng_RYS-XLarge-base_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-XLarge-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-XLarge-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-XLarge-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-XLarge-base
c718b3d9e24916e3b0347d3fdaa5e5a097c2f603
43.970955
mit
4
77
true
true
true
false
true
13.587524
0.791023
79.102337
0.704729
58.692146
0.371601
37.160121
0.379195
17.225951
0.490271
22.417188
0.543052
49.227985
false
2024-08-02
2024-08-30
0
dnhkng/RYS-XLarge-base
dnhkng_RYS-XLarge2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-XLarge2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-XLarge2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-XLarge2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-XLarge2
3ce16c9427e93e09ce10a28fa644469d49a51113
35.001876
0
77
false
true
true
false
true
13.375885
0.490197
49.019712
0.657395
51.549936
0.271903
27.190332
0.374161
16.55481
0.450802
17.05026
0.537816
48.646203
false
2024-10-11
0
Removed
dreamgen_WizardLM-2-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/dreamgen/WizardLM-2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dreamgen/WizardLM-2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dreamgen__WizardLM-2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dreamgen/WizardLM-2-7B
b5f2d7bff91445a47331dcce588aee009d11d255
14.82719
apache-2.0
36
7
true
true
true
false
true
0.566725
0.458298
45.829843
0.348679
9.213114
0.030211
3.021148
0.286913
4.9217
0.394094
7.528385
0.266041
18.448951
false
2024-04-16
2024-06-27
0
dreamgen/WizardLM-2-7B
dustinwloring1988_Reflexis-8b-chat-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v1
e96bd9694ae87a4f612825310eb7afaea5b0aa28
17.340651
0
8
false
true
true
false
true
0.891142
0.365775
36.577503
0.46636
24.109958
0.114804
11.480363
0.254195
0.559284
0.375396
4.824479
0.338431
26.492317
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v2
817408ebfaa7ba0ea9433e1de4bfa120d38d2a0f
18.364751
0
8
false
true
true
false
true
0.94037
0.391204
39.120423
0.47238
24.892196
0.121601
12.160121
0.270134
2.684564
0.352635
4.91276
0.337766
26.41844
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v3
dcfa1a6a9f94a099286891d732b17cbbe97a644e
20.500265
0
8
false
true
true
false
true
0.891467
0.536734
53.673364
0.465831
24.168293
0.120846
12.084592
0.24245
0
0.351177
4.763802
0.354804
28.31154
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v4
81e20c2e40f2028818d5d6d27ec9e0d503ae8cc1
18.530939
0
8
false
true
true
false
true
0.88527
0.469789
46.978905
0.468601
24.33177
0.102719
10.271903
0.23406
0
0.339302
3.046094
0.339013
26.556959
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v5
12970eec99f458a3982eb502b71b6df0bc74bb52
18.586622
0
8
false
true
true
false
true
0.913096
0.423752
42.375231
0.478169
25.195784
0.124622
12.462236
0.270973
2.796421
0.335365
4.053906
0.321725
24.636155
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v6_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v6
a0b30a21a8eea9a32a2767755dc2dbd44eeb383f
20.445597
0
8
false
true
true
false
true
0.899203
0.493894
49.389398
0.480954
26.116103
0.135952
13.595166
0.262584
1.677852
0.375333
4.35
0.347906
27.545065
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v7_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v7
e8d990012ccd855e65d51cb7cfd1762632a8f217
18.843739
0
8
false
true
true
false
true
0.902111
0.398048
39.804829
0.480983
25.987497
0.148036
14.803625
0.261745
1.565996
0.322156
1.536198
0.364279
29.364288
false
2024-09-14
0
Removed
dwikitheduck_gemma-2-2b-id_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gemma-2-2b-id" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gemma-2-2b-id</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gemma-2-2b-id-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gemma-2-2b-id
6f191d4a7618664619adda1cd96d9d1bf72f33b2
11.037655
gemma
0
2
true
true
true
false
false
1.622564
0.180552
18.055205
0.411897
17.116136
0.02568
2.567976
0.260906
1.454139
0.428167
11.954167
0.235705
15.07831
false
2024-10-24
2024-10-25
0
dwikitheduck/gemma-2-2b-id
dzakwan_dzakwan-MoE-4x7b-Beta_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/dzakwan/dzakwan-MoE-4x7b-Beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dzakwan/dzakwan-MoE-4x7b-Beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dzakwan__dzakwan-MoE-4x7b-Beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dzakwan/dzakwan-MoE-4x7b-Beta
e89f82f2afa1961335de5a6d6d05bd850d1d61d9
20.756715
apache-2.0
0
24
true
false
false
false
false
1.456028
0.44426
44.426012
0.514044
32.074208
0.077039
7.703927
0.286074
4.809843
0.42674
12.109115
0.310755
23.417184
false
2024-05-26
2024-08-05
1
dzakwan/dzakwan-MoE-4x7b-Beta (Merge)
ehristoforu_Gemma2-9B-it-psy10k-mental_health_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/Gemma2-9B-it-psy10k-mental_health" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/Gemma2-9B-it-psy10k-mental_health</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__Gemma2-9B-it-psy10k-mental_health-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/Gemma2-9B-it-psy10k-mental_health
4adc2d61d530d23026493d29e6191e06cf549fc6
26.764494
apache-2.0
1
9
true
true
true
false
true
2.27683
0.588666
58.866585
0.553938
35.566009
0.137462
13.746224
0.337248
11.63311
0.408604
9.342188
0.382896
31.432846
false
2024-07-16
2024-07-31
2
unsloth/gemma-2-9b-it-bnb-4bit
ehristoforu_Gemma2-9b-it-train6_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/Gemma2-9b-it-train6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/Gemma2-9b-it-train6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__Gemma2-9b-it-train6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/Gemma2-9b-it-train6
e72bf00b427c22c48b468818cf75300a373a0c8a
28.897532
apache-2.0
2
9
true
true
true
false
true
1.993683
0.702522
70.252153
0.589809
40.987625
0.0929
9.29003
0.328859
10.514541
0.408417
9.652083
0.394199
32.688756
false
2024-07-22
2024-07-31
6
unsloth/gemma-2-9b-it-bnb-4bit
elinas_Chronos-Gold-12B-1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/elinas/Chronos-Gold-12B-1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">elinas/Chronos-Gold-12B-1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/elinas__Chronos-Gold-12B-1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
elinas/Chronos-Gold-12B-1.0
cf76a4621b9dfc0c2e6d930756e6c7c9ce2b260b
21.488289
apache-2.0
31
12
true
false
true
false
true
1.502531
0.316566
31.65656
0.551466
35.908947
0.049094
4.909366
0.317953
9.060403
0.47399
19.415365
0.351812
27.979093
false
2024-08-21
2024-09-15
1
mistralai/Mistral-Nemo-Base-2407
euclaise_ReMask-3B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
StableLmForCausalLM
<a target="_blank" href="https://huggingface.co/euclaise/ReMask-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">euclaise/ReMask-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/euclaise__ReMask-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
euclaise/ReMask-3B
e094dae96097c2bc6f758101ee269c089b65a2cf
7.25664
cc-by-sa-4.0
15
2
true
true
true
false
true
0.44684
0.241927
24.192698
0.351678
8.742083
0.017372
1.73716
0.266779
2.237136
0.334094
2.661719
0.135721
3.969046
false
2024-03-28
2024-08-10
0
euclaise/ReMask-3B
facebook_opt-1.3b_float16
float16
🟢 pretrained
🟢
Original
OPTForCausalLM
<a target="_blank" href="https://huggingface.co/facebook/opt-1.3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">facebook/opt-1.3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/facebook__opt-1.3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
facebook/opt-1.3b
3f5c25d0bc631cb57ac65913f76e22c2dfb61d62
5.251513
other
152
1
true
true
true
false
false
0.403005
0.23833
23.832985
0.309395
3.648052
0.007553
0.755287
0.24245
0
0.342
2.083333
0.110705
1.189421
true
2022-05-11
2024-06-12
0
facebook/opt-1.3b
facebook_opt-30b_float16
float16
🟢 pretrained
🟢
Original
OPTForCausalLM
<a target="_blank" href="https://huggingface.co/facebook/opt-30b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">facebook/opt-30b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/facebook__opt-30b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
facebook/opt-30b
ceea0a90ac0f6fae7c2c34bcb40477438c152546
6.201345
other
132
30
true
true
true
false
false
2.999845
0.245299
24.529914
0.307034
3.498429
0.006042
0.60423
0.269295
2.572707
0.360417
4.185417
0.116356
1.817376
true
2022-05-11
2024-06-12
0
facebook/opt-30b
failspy_Llama-3-8B-Instruct-MopeyMule_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/failspy/Llama-3-8B-Instruct-MopeyMule" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/Llama-3-8B-Instruct-MopeyMule</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__Llama-3-8B-Instruct-MopeyMule-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/Llama-3-8B-Instruct-MopeyMule
d1cbf407efe727c6b9fc94f22d51ff4915e1856e
15.612956
other
67
8
true
true
true
false
true
0.823136
0.675044
67.504444
0.383874
13.620496
0.018127
1.812689
0.239094
0
0.351302
2.246094
0.176446
8.494016
false
2024-05-30
2024-09-21
0
failspy/Llama-3-8B-Instruct-MopeyMule
failspy_Llama-3-8B-Instruct-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/failspy/Llama-3-8B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/Llama-3-8B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__Llama-3-8B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/Llama-3-8B-Instruct-abliterated
dd67dd055661e4cbcedb0ed2431693d9cc3be6e0
19.177668
llama3
9
8
true
true
true
false
true
0.741906
0.590889
59.088884
0.435375
18.864599
0.037764
3.776435
0.276007
3.467562
0.411583
10.514583
0.274186
19.353945
false
2024-05-07
2024-07-03
0
failspy/Llama-3-8B-Instruct-abliterated
failspy_Meta-Llama-3-70B-Instruct-abliterated-v3.5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__Meta-Llama-3-70B-Instruct-abliterated-v3.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5
fc951b03d92972ab52ad9392e620eba6173526b9
30.204883
llama3
37
70
true
true
true
false
true
9.204711
0.774687
77.468672
0.57471
37.871333
0.132931
13.293051
0.29698
6.263982
0.398187
7.973438
0.445229
38.358821
false
2024-05-28
2024-08-30
0
failspy/Meta-Llama-3-70B-Instruct-abliterated-v3.5
failspy_Phi-3-medium-4k-instruct-abliterated-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/failspy/Phi-3-medium-4k-instruct-abliterated-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/Phi-3-medium-4k-instruct-abliterated-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__Phi-3-medium-4k-instruct-abliterated-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/Phi-3-medium-4k-instruct-abliterated-v3
959b09eacf6cae85a8eb21b25e998addc89a367b
31.775592
mit
21
13
true
true
true
false
true
1.520981
0.63193
63.192995
0.63048
46.732839
0.154834
15.483384
0.317114
8.948546
0.460417
18.51875
0.439993
37.777039
false
2024-05-22
2024-07-29
0
failspy/Phi-3-medium-4k-instruct-abliterated-v3
failspy_llama-3-70B-Instruct-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/failspy/llama-3-70B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">failspy/llama-3-70B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/failspy__llama-3-70B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
failspy/llama-3-70B-Instruct-abliterated
53ae9dafe8b3d163e05d75387575f8e9f43253d0
36.091429
llama3
94
70
true
true
true
false
true
9.374129
0.802339
80.233891
0.646485
48.939818
0.255287
25.528701
0.28943
5.257271
0.41276
10.528385
0.514545
46.060505
false
2024-05-07
2024-07-03
0
failspy/llama-3-70B-Instruct-abliterated
fblgit_TheBeagle-v2beta-32B-MGS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/TheBeagle-v2beta-32B-MGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/TheBeagle-v2beta-32B-MGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__TheBeagle-v2beta-32B-MGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/TheBeagle-v2beta-32B-MGS
56830f63e4a40378b7721ae966637b4678cc8784
41.622408
other
7
32
true
true
true
false
false
32.879107
0.518074
51.807427
0.703263
58.027976
0.433535
43.353474
0.38255
17.673378
0.50075
24.260417
0.591506
54.611776
false
2024-10-20
2024-10-30
1
fblgit/TheBeagle-v2beta-32B-MGS (Merge)
fblgit_TheBeagle-v2beta-32B-MGS_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/TheBeagle-v2beta-32B-MGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/TheBeagle-v2beta-32B-MGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__TheBeagle-v2beta-32B-MGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/TheBeagle-v2beta-32B-MGS
56830f63e4a40378b7721ae966637b4678cc8784
40.28667
other
7
32
true
true
true
false
false
11.366068
0.450305
45.030519
0.703542
58.06603
0.39426
39.425982
0.401007
20.134228
0.502115
24.497656
0.59109
54.565603
false
2024-10-20
2024-10-20
1
fblgit/TheBeagle-v2beta-32B-MGS (Merge)
fblgit_UNA-SimpleSmaug-34b-v1beta_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/UNA-SimpleSmaug-34b-v1beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/UNA-SimpleSmaug-34b-v1beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__UNA-SimpleSmaug-34b-v1beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/UNA-SimpleSmaug-34b-v1beta
4b62fccfc7e44c0a02c11a5279d98fafa6b922ba
23.121397
apache-2.0
20
34
true
true
true
false
true
3.164466
0.455626
45.562552
0.528665
32.775789
0.001511
0.151057
0.317114
8.948546
0.425563
11.961979
0.453956
39.328457
false
2024-02-05
2024-06-30
2
jondurbin/bagel-34b-v0.2
fblgit_UNA-TheBeagle-7b-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/UNA-TheBeagle-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/UNA-TheBeagle-7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__UNA-TheBeagle-7b-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/UNA-TheBeagle-7b-v1
866d3ee19f983728e21a624f8a27574960073f27
19.633583
cc-by-nc-nd-4.0
36
7
true
true
true
false
false
0.560639
0.368872
36.887237
0.502869
30.173397
0.076284
7.628399
0.284396
4.58613
0.456438
16.088021
0.301945
22.438313
false
2024-01-09
2024-06-30
0
fblgit/UNA-TheBeagle-7b-v1
fblgit_UNA-ThePitbull-21.4B-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/UNA-ThePitbull-21.4B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/UNA-ThePitbull-21.4B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__UNA-ThePitbull-21.4B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/UNA-ThePitbull-21.4B-v2
f12aac93ae9c852550a16816e16116c4f8e7dec0
22.799983
afl-3.0
15
21
true
true
true
false
true
2.298414
0.379039
37.903873
0.635039
46.788074
0.108006
10.800604
0.302013
6.935123
0.392167
6.420833
0.351563
27.951389
false
2024-05-28
2024-06-30
0
fblgit/UNA-ThePitbull-21.4B-v2
fblgit_cybertron-v4-qw7B-MGS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/cybertron-v4-qw7B-MGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/cybertron-v4-qw7B-MGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__cybertron-v4-qw7B-MGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/cybertron-v4-qw7B-MGS
ea2aaf4f4000190235722a9ad4f5cd9e9091a64e
31.207648
other
9
7
true
true
true
false
false
1.246739
0.626385
62.638466
0.559177
37.041623
0.27719
27.719033
0.310403
8.053691
0.437094
13.203385
0.447307
38.589687
false
2024-10-29
2024-10-29
1
fblgit/cybertron-v4-qw7B-MGS (Merge)
fblgit_juanako-7b-UNA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/juanako-7b-UNA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/juanako-7b-UNA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__juanako-7b-UNA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/juanako-7b-UNA
b8ac85b603d5ee1ac619b2e1d0b3bb86c4eecb0c
20.825304
apache-2.0
23
7
true
true
true
false
false
0.63179
0.483728
48.372762
0.507001
30.415072
0.031722
3.172205
0.296141
6.152125
0.4645
17.1625
0.277094
19.677157
false
2023-11-27
2024-06-30
0
fblgit/juanako-7b-UNA
fblgit_miniclaus-qw1.5B-UNAMGS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/miniclaus-qw1.5B-UNAMGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/miniclaus-qw1.5B-UNAMGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__miniclaus-qw1.5B-UNAMGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/miniclaus-qw1.5B-UNAMGS
de590536ba82ffb7b4001dffb5f8b60d2087c319
16.868868
other
3
1
true
true
true
false
false
0.591743
0.334801
33.480055
0.423859
18.562864
0.098187
9.818731
0.291946
5.592841
0.429344
12.234635
0.293717
21.524084
false
2024-11-01
2024-11-01
2
Qwen/Qwen2.5-1.5B
fblgit_pancho-v1-qw25-3B-UNAMGS_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/pancho-v1-qw25-3B-UNAMGS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/pancho-v1-qw25-3B-UNAMGS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__pancho-v1-qw25-3B-UNAMGS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/pancho-v1-qw25-3B-UNAMGS
01143501cbc2c90961be5397c6945c6789815a60
23.646637
other
1
3
true
true
true
false
false
0.780382
0.536134
53.613412
0.492583
28.66965
0.14426
14.425982
0.29698
6.263982
0.40274
8.175781
0.376579
30.731014
false
2024-11-04
2024-11-06
2
Qwen/Qwen2.5-3B
fblgit_una-cybertron-7b-v2-bf16_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fblgit/una-cybertron-7b-v2-bf16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fblgit/una-cybertron-7b-v2-bf16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/fblgit__una-cybertron-7b-v2-bf16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fblgit/una-cybertron-7b-v2-bf16
7ab101a153740aec39e95ec02831c56f4eab7910
17.17956
apache-2.0
116
7
true
true
true
false
true
0.634206
0.473711
47.371086
0.397339
14.966965
0.03852
3.851964
0.297819
6.375839
0.447323
14.482031
0.244265
16.029477
false
2023-12-02
2024-06-30
0
fblgit/una-cybertron-7b-v2-bf16
flammenai_Llama3.1-Flammades-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Llama3.1-Flammades-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Llama3.1-Flammades-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Llama3.1-Flammades-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Llama3.1-Flammades-70B
48909a734460e667e3a7e91bd25f124ec3b2ba74
35.898954
llama3.1
2
70
true
true
true
false
true
10.284833
0.705844
70.584383
0.665972
52.547943
0.143505
14.350453
0.354027
13.870246
0.487052
22.348177
0.475233
41.692524
false
2024-10-12
2024-10-13
1
flammenai/Llama3.1-Flammades-70B (Merge)
flammenai_Mahou-1.2a-llama3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.2a-llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.2a-llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.2a-llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.2a-llama3-8B
3318b6f5f1839644bee287a3e5390f3e9f565a9e
21.841614
llama3
6
8
true
true
true
false
false
0.932412
0.509257
50.925655
0.509366
28.972588
0.086858
8.685801
0.288591
5.145414
0.384667
6.016667
0.381732
31.303561
false
2024-05-25
2024-09-03
1
flammenai/Mahou-1.2a-llama3-8B (Merge)
flammenai_Mahou-1.2a-mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.2a-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.2a-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.2a-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.2a-mistral-7B
d45f61cca04da0c3359573102853fca1a0d3b252
19.503462
apache-2.0
6
7
true
true
true
false
false
1.805622
0.455201
45.520109
0.511811
31.16675
0.064199
6.41994
0.271812
2.908277
0.389625
6.969792
0.316323
24.035904
false
2024-05-18
2024-09-03
1
flammenai/Mahou-1.2a-mistral-7B (Merge)
flammenai_Mahou-1.5-llama3.1-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.5-llama3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.5-llama3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.5-llama3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.5-llama3.1-70B
49f45cc4c21e2ba7ed5c5e71f90ffd0bd9169e2d
36.237159
llama3.1
6
70
true
true
true
false
true
10.259992
0.714662
71.466154
0.665086
52.369577
0.143505
14.350453
0.354027
13.870246
0.495021
23.710938
0.4749
41.655585
false
2024-10-14
2024-10-14
1
flammenai/Mahou-1.5-llama3.1-70B (Merge)
flammenai_Mahou-1.5-mistral-nemo-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/Mahou-1.5-mistral-nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/Mahou-1.5-mistral-nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__Mahou-1.5-mistral-nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/Mahou-1.5-mistral-nemo-12B
852561e74f1785bf7225bb28395db1fd9431fe31
26.381801
apache-2.0
17
12
true
true
true
false
true
1.482632
0.675144
67.514417
0.552236
36.26051
0.056647
5.664653
0.276007
3.467562
0.452042
16.471875
0.360206
28.911791
false
2024-10-06
2024-10-07
1
flammenai/Mahou-1.5-mistral-nemo-12B (Merge)
flammenai_flammen15-gutenberg-DPO-v1-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/flammenai/flammen15-gutenberg-DPO-v1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">flammenai/flammen15-gutenberg-DPO-v1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/flammenai__flammen15-gutenberg-DPO-v1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
flammenai/flammen15-gutenberg-DPO-v1-7B
550cd9548cba1265cb1771c85ebe498789fdecb5
21.574934
apache-2.0
2
7
true
true
true
false
false
0.62753
0.479806
47.98058
0.520298
32.665113
0.074018
7.401813
0.284396
4.58613
0.429313
12.530729
0.318567
24.285239
false
2024-04-05
2024-07-10
1
flammenai/flammen15-gutenberg-DPO-v1-7B (Merge)
freewheelin_free-evo-qwen72b-v0.8-re_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-evo-qwen72b-v0.8-re" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-evo-qwen72b-v0.8-re</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-evo-qwen72b-v0.8-re-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-evo-qwen72b-v0.8-re
24e301d8fbef8ada12be42156b01c827ff594962
32.424578
mit
4
72
true
true
true
false
false
11.789791
0.533087
53.308665
0.612748
45.317403
0.177492
17.749245
0.356544
14.205817
0.487167
20.9625
0.487035
43.003842
false
2024-05-02
2024-09-15
0
freewheelin/free-evo-qwen72b-v0.8-re
freewheelin_free-solar-evo-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-solar-evo-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-solar-evo-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-solar-evo-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-solar-evo-v0.1
233efd607ae0abbd7b46eded2ee7889892b7bdbb
16.295571
mit
1
10
true
true
true
false
true
0.801111
0.205007
20.500716
0.450221
22.635183
0.000755
0.075529
0.291107
5.480984
0.494583
22.25625
0.341423
26.824764
false
2024-04-18
2024-08-07
0
freewheelin/free-solar-evo-v0.1
freewheelin_free-solar-evo-v0.11_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-solar-evo-v0.11" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-solar-evo-v0.11</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-solar-evo-v0.11-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-solar-evo-v0.11
17fc24a557bd3c3836abc9f6a367c803cba0cccd
16.641294
mit
0
10
true
true
true
false
true
0.813502
0.202659
20.265894
0.454516
23.182425
0
0
0.285235
4.697987
0.505219
24.285677
0.346742
27.41578
false
2024-04-24
2024-08-07
0
freewheelin/free-solar-evo-v0.11
freewheelin_free-solar-evo-v0.13_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-solar-evo-v0.13" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-solar-evo-v0.13</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/freewheelin__free-solar-evo-v0.13-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-solar-evo-v0.13
2a7eb72f84c54898630f9db470eee0f936a64396
17.204491
mit
1
10
true
true
true
false
true
0.815956
0.23206
23.205982
0.455484
23.354204
0
0
0.288591
5.145414
0.505156
24.077865
0.346991
27.443484
false
2024-04-28
2024-08-07
0
freewheelin/free-solar-evo-v0.13
gabrielmbmb_SmolLM-1.7B-Instruct-IFEval_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gabrielmbmb/SmolLM-1.7B-Instruct-IFEval" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gabrielmbmb/SmolLM-1.7B-Instruct-IFEval</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gabrielmbmb__SmolLM-1.7B-Instruct-IFEval-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gabrielmbmb/SmolLM-1.7B-Instruct-IFEval
ac5d711adc05ccfe1b1b912d5561d98f6afeeb40
5.222836
0
1
false
true
true
false
true
0.134745
0.230586
23.058596
0.313843
4.501675
0
0
0.253356
0.447427
0.33276
1.595052
0.115608
1.734264
false
2024-10-01
2024-10-11
2
HuggingFaceTB/SmolLM-1.7B
gaverfraxz_Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gaverfraxz__Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA
6b0271a98b8875a65972ed54b0d636d8236ea60b
11.919582
llama3.1
0
8
true
false
true
false
false
1.345674
0.400946
40.094616
0.398484
15.276579
0.008308
0.830816
0.284396
4.58613
0.365042
3.463542
0.165392
7.26581
false
2024-09-22
2024-09-23
1
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-DELLA (Merge)
gaverfraxz_Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gaverfraxz__Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES
80569e49b5aba960a5cd91281dd9eef92aeff9a3
20.986454
llama3.1
1
8
true
false
true
false
true
0.961357
0.455051
45.505149
0.504366
28.914235
0.129154
12.915408
0.266779
2.237136
0.37375
6.585417
0.367852
29.761377
false
2024-09-19
2024-09-19
1
gaverfraxz/Meta-Llama-3.1-8B-Instruct-HalfAbliterated-TIES (Merge)
gbueno86_Brinebreath-Llama-3.1-70B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gbueno86/Brinebreath-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gbueno86/Brinebreath-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gbueno86__Brinebreath-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gbueno86/Brinebreath-Llama-3.1-70B
c508ecf356167e8c498c6fa3937ba30a82208983
36.292756
llama3.1
1
70
true
false
true
false
true
10.559754
0.553295
55.329526
0.688056
55.463618
0.299849
29.984894
0.346477
12.863535
0.454063
17.491146
0.519614
46.623818
false
2024-08-23
2024-08-29
1
gbueno86/Brinebreath-Llama-3.1-70B (Merge)
gbueno86_Meta-LLama-3-Cat-Smaug-LLama-70b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gbueno86__Meta-LLama-3-Cat-Smaug-LLama-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b
2d73b7e1c7157df482555944d6a6b1362bc6c3c5
38.268137
llama3
1
70
true
false
true
false
true
10.902293
0.807185
80.718494
0.667431
51.508386
0.268127
26.812689
0.327181
10.290828
0.436823
15.002865
0.50748
45.275561
false
2024-05-24
2024-06-27
1
gbueno86/Meta-LLama-3-Cat-Smaug-LLama-70b (Merge)
ghost-x_ghost-8b-beta-1608_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ghost-x/ghost-8b-beta-1608" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ghost-x/ghost-8b-beta-1608</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ghost-x__ghost-8b-beta-1608-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ghost-x/ghost-8b-beta-1608
6d1b3853aab774af5a4db21ff9d5764918fb48f5
15.103135
other
28
8
true
true
true
false
true
0.848931
0.427274
42.727408
0.451655
23.463964
0.01284
1.283988
0.258389
1.118568
0.351583
1.58125
0.283993
20.443632
false
2024-08-18
2024-09-17
1
ghost-x/ghost-8b-beta
glaiveai_Reflection-Llama-3.1-70B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/glaiveai/Reflection-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">glaiveai/Reflection-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/glaiveai__Reflection-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
glaiveai/Reflection-Llama-3.1-70B
086bd2658e00345808b31758ebb8f7e2c6d9897c
29.924816
9
69
true
true
true
false
true
25.243776
0.599057
59.905717
0.568101
37.960486
0
0
0.314597
8.612975
0.438031
13.720573
0.634142
59.349143
false
2024-09-19
2024-10-07
0
glaiveai/Reflection-Llama-3.1-70B
google_codegemma-1.1-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/codegemma-1.1-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/codegemma-1.1-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__codegemma-1.1-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/codegemma-1.1-2b
9d69e500da236427eab5867552ffc87108964f4d
7.033163
gemma
17
2
true
true
true
false
false
0.949883
0.229363
22.936254
0.335342
7.551225
0.006798
0.679758
0.265101
2.013423
0.387146
5.926563
0.127826
3.091755
true
2024-04-30
2024-08-12
0
google/codegemma-1.1-2b
google_flan-t5-base_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/flan-t5-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/flan-t5-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/flan-t5-base
7bcac572ce56db69c1ea7c8af255c5d7c9672fc2
6.239408
apache-2.0
795
0
true
true
true
false
false
0.156621
0.189071
18.907056
0.352598
11.337694
0
0
0.238255
0
0.367115
3.222656
0.135721
3.969046
true
2022-10-21
2024-08-14
0
google/flan-t5-base
google_flan-t5-large_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/flan-t5-large" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/flan-t5-large</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-large-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/flan-t5-large
0613663d0d48ea86ba8cb3d7a44f0f65dc596a2a
9.418949
apache-2.0
601
0
true
true
true
false
false
0.233491
0.220095
22.00949
0.415312
17.510018
0
0
0.250839
0.111857
0.408323
9.007031
0.170878
7.875296
true
2022-10-21
2024-08-14
0
google/flan-t5-large
google_flan-t5-small_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/flan-t5-small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/flan-t5-small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/flan-t5-small
0fc9ddf78a1e988dac52e2dac162b0ede4fd74ab
6.003781
apache-2.0
272
0
true
true
true
false
false
0.14313
0.152426
15.242556
0.32829
6.363112
0
0
0.260906
1.454139
0.412292
10.369792
0.123338
2.593085
true
2022-10-21
2024-06-27
0
google/flan-t5-small
google_flan-t5-xl_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/flan-t5-xl" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/flan-t5-xl</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-xl-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/flan-t5-xl
7d6315df2c2fb742f0f5b556879d730926ca9001
11.59178
apache-2.0
468
2
true
true
true
false
false
0.348929
0.223742
22.374189
0.453106
22.695056
0.000755
0.075529
0.252517
0.33557
0.418094
11.328385
0.214678
12.741947
true
2022-10-21
2024-08-07
0
google/flan-t5-xl
google_flan-t5-xl_bfloat16
bfloat16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/flan-t5-xl" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/flan-t5-xl</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-xl-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/flan-t5-xl
7d6315df2c2fb742f0f5b556879d730926ca9001
11.587167
apache-2.0
468
2
true
true
true
false
false
0.285352
0.220694
22.069442
0.453722
22.837588
0.000755
0.075529
0.245805
0
0.422031
11.853906
0.214179
12.68654
true
2022-10-21
2024-08-07
0
google/flan-t5-xl
google_flan-t5-xxl_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/flan-t5-xxl" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/flan-t5-xxl</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-xxl-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/flan-t5-xxl
ae7c9136adc7555eeccc78cdd960dfd60fb346ce
13.485843
apache-2.0
1199
11
true
true
true
false
false
0.706477
0.220045
22.004504
0.506589
30.119256
0
0
0.270134
2.684564
0.42175
11.185417
0.234292
14.921321
true
2022-10-21
2024-09-06
0
google/flan-t5-xxl
google_flan-ul2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/flan-ul2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/flan-ul2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__flan-ul2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/flan-ul2
452d74ce28ac4a7f211d6ba3ef0717027f7a8074
13.550118
apache-2.0
553
19
true
true
true
false
false
0.559966
0.239254
23.925407
0.505374
30.02029
0.001511
0.151057
0.287752
5.033557
0.384354
5.577604
0.249335
16.59279
true
2023-03-03
2024-08-07
0
google/flan-ul2
google_gemma-1.1-2b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-1.1-2b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-1.1-2b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-1.1-2b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-1.1-2b-it
bf4924f313df5166dee1467161e886e55f2eb4d4
7.776435
gemma
151
2
true
true
true
false
true
0.329215
0.306748
30.674832
0.318463
5.862827
0.001511
0.151057
0.269295
2.572707
0.339396
2.024479
0.148354
5.37271
true
2024-03-26
2024-06-12
0
google/gemma-1.1-2b-it
google_gemma-1.1-7b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-1.1-7b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-1.1-7b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-1.1-7b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-1.1-7b-it
16128b0aeb50762ea96430c0c06a37941bf9f274
17.479586
gemma
263
8
true
true
true
false
true
0.578299
0.503911
50.391073
0.39353
15.934209
0.036254
3.625378
0.293624
5.816555
0.423021
11.510938
0.258394
17.599365
true
2024-03-26
2024-06-12
0
google/gemma-1.1-7b-it
google_gemma-2-27b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2-27b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2-27b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-27b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2-27b
938270f5272feb02779b55c2bb2fffdd0f53ff0c
23.850639
gemma
176
27
true
true
true
false
false
5.614249
0.247522
24.752213
0.564291
37.390737
0.161631
16.163142
0.350671
13.422819
0.439635
13.921094
0.437084
37.453827
true
2024-06-24
2024-08-24
0
google/gemma-2-27b
google_gemma-2-27b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2-27b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-27b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2-27b-it
f6c533e5eb013c7e31fc74ef042ac4f3fb5cf40b
32.322319
gemma
437
27
true
true
true
false
true
4.826211
0.797768
79.77677
0.645139
49.272842
0.007553
0.755287
0.375
16.666667
0.403302
9.11276
0.445146
38.349586
true
2024-06-24
2024-08-07
1
google/gemma-2-27b
google_gemma-2-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
InternLM2ForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2-2b
4d05c88d00441bf62bf87dcfd29e204c05089f36
10.129463
gemma
410
2
true
true
true
false
true
1.518796
0.199312
19.931227
0.365597
11.755808
0.028701
2.870091
0.262584
1.677852
0.423177
11.430469
0.218002
13.111333
true
2024-07-16
2024-07-31
0
google/gemma-2-2b
google_gemma-2-2b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2-2b
0738188b3055bc98daf0fe7211f0091357e5b979
10.334439
gemma
410
2
true
true
true
false
false
1.418257
0.20176
20.176022
0.370867
12.497306
0.028701
2.870091
0.262584
1.677852
0.421875
11.267708
0.221659
13.517657
true
2024-07-16
2024-08-04
0
google/gemma-2-2b
google_gemma-2-2b-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
InternLM2ForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2-2b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2-2b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2-2b-it
2b6ac3ff954ad896c115bbfa1b571cd93ea2c20f
17.046939
gemma
656
2
true
true
true
false
true
1.234743
0.566834
56.683378
0.419923
17.980793
0.000755
0.075529
0.274329
3.243848
0.392885
7.077344
0.254987
17.220745
true
2024-07-16
2024-07-31
1
google/gemma-2-2b
google_gemma-2-2b-jpn-it_float16
float16
🟢 pretrained
🟢
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2-2b-jpn-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2-2b-jpn-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-jpn-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2-2b-jpn-it
6b046bbc091084a1ec89fe03e58871fde10868eb
17.115406
gemma
130
2
true
true
true
false
false
1.011437
0.507783
50.778268
0.422557
18.525626
0.034743
3.47432
0.285235
4.697987
0.396385
7.68151
0.257813
17.534722
true
2024-09-25
2024-10-11
2
google/gemma-2-2b
google_gemma-2-2b-jpn-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2-2b-jpn-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2-2b-jpn-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-2b-jpn-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2-2b-jpn-it
6b046bbc091084a1ec89fe03e58871fde10868eb
15.885579
gemma
130
2
true
true
true
false
true
0.8544
0.52884
52.884014
0.417844
17.848086
0
0
0.275168
3.355705
0.37276
4.928385
0.246676
16.297281
true
2024-09-25
2024-10-14
2
google/gemma-2-2b
google_gemma-2-9b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2-9b
beb0c08e9eeb0548f3aca2ac870792825c357b7d
21.154934
gemma
582
9
true
true
true
false
false
5.663186
0.203983
20.398321
0.537737
34.096819
0.13142
13.141994
0.328859
10.514541
0.446115
14.297656
0.410322
34.480275
true
2024-06-24
2024-07-11
0
google/gemma-2-9b
google_gemma-2-9b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2-9b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2-9b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2-9b-it
1937c70277fcc5f7fb0fc772fc5bc69378996e71
28.86279
gemma
540
9
true
true
true
false
true
5.014497
0.743563
74.356264
0.599034
42.13662
0.002266
0.226586
0.360738
14.765101
0.407271
9.742188
0.38755
31.949985
true
2024-06-24
2024-07-11
1
google/gemma-2-9b
google_gemma-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2b
2ac59a5d7bf4e1425010f0d457dde7d146658953
7.358701
gemma
909
2
true
true
true
false
false
1.236251
0.203758
20.375825
0.338099
8.466713
0.030211
3.021148
0.255034
0.671141
0.397781
7.55599
0.136553
4.061392
true
2024-02-08
2024-06-12
0
google/gemma-2b
google_gemma-2b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-2b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-2b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-2b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-2b-it
de144fb2268dee1066f515465df532c05e699d48
7.221454
gemma
670
2
true
true
true
false
true
0.35295
0.26903
26.902951
0.315082
5.214303
0.004532
0.453172
0.278523
3.803132
0.334125
3.032292
0.135306
3.922872
true
2024-02-08
2024-06-12
0
google/gemma-2b-it
google_gemma-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-7b
a0eac5b80dba224e6ed79d306df50b1e92c2125d
15.455407
gemma
3,049
8
true
true
true
false
false
1.254914
0.265932
26.593217
0.436153
21.116099
0.074773
7.477341
0.286913
4.9217
0.40624
10.979948
0.294797
21.644134
true
2024-02-08
2024-06-08
0
google/gemma-7b
google_gemma-7b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/gemma-7b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-7b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__gemma-7b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/gemma-7b-it
18329f019fb74ca4b24f97371785268543d687d2
12.868142
gemma
1,137
8
true
true
true
false
true
1.099954
0.386832
38.683249
0.364558
11.880091
0.018127
1.812689
0.284396
4.58613
0.427427
12.528385
0.169465
7.718307
true
2024-02-13
2024-06-12
1
google/gemma-7b
google_mt5-base_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/mt5-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/mt5-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__mt5-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/mt5-base
2eb15465c5dd7f72a8f7984306ad05ebc3dd1e1f
3.565282
apache-2.0
189
0
true
true
true
false
false
0.20004
0.164516
16.451571
0.288316
1.298551
0
0
0.239094
0
0.367208
2.867708
0.106965
0.773862
true
2022-03-02
2024-09-06
0
google/mt5-base
google_mt5-small_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/mt5-small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/mt5-small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__mt5-small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/mt5-small
73fb5dbe4756edadc8fbe8c769b0a109493acf7a
4.255928
apache-2.0
104
0
true
true
true
false
false
0.180494
0.17181
17.180969
0.276584
1.070971
0
0
0.24245
0
0.38575
5.91875
0.112284
1.364879
true
2022-03-02
2024-09-06
0
google/mt5-small
google_mt5-xl_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/mt5-xl" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/mt5-xl</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__mt5-xl-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/mt5-xl
63fc6450d80515b48e026b69ef2fbbd426433e84
5.19142
apache-2.0
20
3
true
true
true
false
false
0.903767
0.195964
19.596449
0.304736
3.282462
0
0
0.264262
1.901566
0.379521
5.040104
0.111951
1.32794
true
2022-03-02
2024-09-06
0
google/mt5-xl
google_mt5-xxl_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/mt5-xxl" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/mt5-xxl</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__mt5-xxl-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/mt5-xxl
e07c395916dfbc315d4e5e48b4a54a1e8821b5c0
5.103077
apache-2.0
65
11
true
true
true
false
false
2.281939
0.235757
23.575668
0.295934
2.504711
0
0
0.241611
0
0.368948
3.551823
0.108876
0.986259
true
2022-03-02
2024-09-06
0
google/mt5-xxl
google_recurrentgemma-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
RecurrentGemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/recurrentgemma-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/recurrentgemma-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__recurrentgemma-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/recurrentgemma-2b
195f13c55b371fc721eda0662c00c64642c70e17
6.952186
gemma
91
2
true
true
true
false
false
3.692653
0.301703
30.170282
0.319736
4.820362
0.016616
1.661631
0.245805
0
0.344573
3.104948
0.117603
1.955895
true
2024-04-06
2024-06-13
0
google/recurrentgemma-2b
google_recurrentgemma-2b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
RecurrentGemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/recurrentgemma-2b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/recurrentgemma-2b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__recurrentgemma-2b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/recurrentgemma-2b-it
150248167d171fbdf4b02e7d28a4b3d749e570f6
7.945553
gemma
108
2
true
true
true
false
true
1.933036
0.294933
29.4933
0.333
7.978764
0.016616
1.661631
0.253356
0.447427
0.334063
3.624479
0.140209
4.467716
true
2024-04-08
2024-06-12
0
google/recurrentgemma-2b-it
google_recurrentgemma-9b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
RecurrentGemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/recurrentgemma-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/recurrentgemma-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__recurrentgemma-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/recurrentgemma-9b
7b0ed98fb889ba8bdfa7c690f08f2e57a7c48dae
13.684285
gemma
59
9
true
true
true
false
false
23.20619
0.311594
31.159435
0.395626
15.323369
0.064955
6.495468
0.285235
4.697987
0.38026
6.599219
0.260472
17.83023
true
2024-06-07
2024-07-04
0
google/recurrentgemma-9b
google_recurrentgemma-9b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
RecurrentGemmaForCausalLM
<a target="_blank" href="https://huggingface.co/google/recurrentgemma-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/recurrentgemma-9b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__recurrentgemma-9b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/recurrentgemma-9b-it
43e62f98c3d496a5469ef4b18c1b11e417d68d1d
19.230703
gemma
49
9
true
true
true
false
true
13.362608
0.501038
50.103836
0.436719
21.62158
0.067221
6.722054
0.270134
2.684564
0.437906
13.771615
0.284325
20.48057
true
2024-06-07
2024-07-05
0
google/recurrentgemma-9b-it
google_switch-base-8_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
SwitchTransformersForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/switch-base-8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/switch-base-8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__switch-base-8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/switch-base-8
92fe2d22b024d9937146fe097ba3d3a7ba146e1b
3.29595
apache-2.0
14
0
true
true
true
false
false
0.146703
0.158521
15.85205
0.287631
1.702478
0
0
0.25
0
0.35174
1.133333
0.109791
1.08784
true
2022-10-24
2024-09-06
0
google/switch-base-8
google_umt5-base_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
UMT5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/umt5-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/umt5-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__umt5-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/umt5-base
0de9394d54f8975e71838d309de1cb496c894ab9
3.441046
apache-2.0
12
-1
true
true
true
false
false
0.668046
0.174632
17.46322
0.278773
0.813553
0
0
0.254195
0.559284
0.338219
0.94401
0.107796
0.866209
true
2023-07-02
2024-09-06
0
google/umt5-base
gpt2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
6.536203
mit
2,350
0
true
true
true
false
false
0.125522
0.180777
18.077701
0.303571
2.674981
0.002266
0.226586
0.258389
1.118568
0.447052
15.348177
0.115941
1.771203
true
2022-03-02
2024-06-09
0
gpt2
gpt2_float16
float16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
5.977737
mit
2,350
0
true
true
true
false
false
0.039245
0.083333
8.333333
0.308333
9.199755
0
0
0.233333
0
0.433333
18.333333
0.1
0
true
2022-03-02
2024-06-26
0
gpt2
gradientai_Llama-3-8B-Instruct-Gradient-1048k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gradientai/Llama-3-8B-Instruct-Gradient-1048k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gradientai/Llama-3-8B-Instruct-Gradient-1048k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gradientai__Llama-3-8B-Instruct-Gradient-1048k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gradientai/Llama-3-8B-Instruct-Gradient-1048k
8697fb25cb77c852311e03b4464b8467471d56a4
18.24557
llama3
672
8
true
true
true
false
true
0.887164
0.445559
44.555889
0.43459
21.010529
0.05136
5.135952
0.277685
3.691275
0.42975
13.51875
0.294049
21.561022
true
2024-04-29
2024-06-12
0
gradientai/Llama-3-8B-Instruct-Gradient-1048k
grimjim_Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge
7a8d334dce0a2ce948f75612b8d3a61c53d094aa
20.887036
llama3
2
8
true
false
true
false
false
0.547548
0.427124
42.712447
0.496169
28.258015
0.102719
10.271903
0.290268
5.369128
0.404323
9.540365
0.362533
29.170361
false
2024-06-28
2024-06-29
1
grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge (Merge)
grimjim_Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge
8f4d460ea20e24e48914156af7def305c0cd347f
23.688475
llama3
2
8
true
false
true
false
true
0.616942
0.68059
68.058972
0.502173
29.073286
0.067976
6.797583
0.262584
1.677852
0.38851
6.697135
0.368434
29.82602
false
2024-06-28
2024-09-17
1
grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge (Merge)