eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Not_Merged | MoE | Flagged | Chat Template | CO₂ Emissions for Evaluation (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Maintainer's Highlight | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
EpistemeAI_Athena-gemma-2-2b-it_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Athena-gemma-2-2b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Athena-gemma-2-2b-it | 661c1dc6a1a096222e33416e099bd02b7b970405 | 14.294329 | apache-2.0 | 2 | 2 | true | true | true | false | false | 2.036798 | 0.313417 | 31.341729 | 0.426423 | 19.417818 | 0.033988 | 3.398792 | 0.268456 | 2.46085 | 0.435052 | 13.348177 | 0.242188 | 15.798611 | false | 2024-08-29 | 2024-09-06 | 2 | unsloth/gemma-2-9b-it-bnb-4bit |
EpistemeAI_Athena-gemma-2-2b-it-Philos_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Athena-gemma-2-2b-it-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Athena-gemma-2-2b-it-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athena-gemma-2-2b-it-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Athena-gemma-2-2b-it-Philos | dea2b35d496bd32ed3c88d42ff3022654153f2e1 | 15.122657 | apache-2.0 | 0 | 2 | true | true | true | false | true | 1.128593 | 0.462095 | 46.209502 | 0.379478 | 13.212088 | 0.004532 | 0.453172 | 0.28104 | 4.138702 | 0.431365 | 12.853906 | 0.224817 | 13.868573 | false | 2024-09-05 | 2024-09-05 | 1 | unsloth/gemma-2-2b-it-bnb-4bit |
EpistemeAI_Athene-codegemma-2-7b-it-alpaca-v1.3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GemmaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Athene-codegemma-2-7b-it-alpaca-v1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Athene-codegemma-2-7b-it-alpaca-v1.3 | 9c26e1242a11178b53937bc0e9a744ef6141e05a | 17.314022 | apache-2.0 | 0 | 7 | true | true | true | false | false | 0.971978 | 0.402994 | 40.299406 | 0.433192 | 20.873795 | 0.061934 | 6.193353 | 0.280201 | 4.026846 | 0.450302 | 14.854427 | 0.258727 | 17.636303 | false | 2024-09-06 | 2024-09-06 | 2 | Removed |
EpistemeAI_FineLlama3.1-8B-Instruct_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/EpistemeAI/FineLlama3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/FineLlama3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__FineLlama3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/FineLlama3.1-8B-Instruct | a8b0fc584b10e0110e04f9d21c7f10d24391c1d5 | 11.100787 | | 0 | 14 | false | true | true | false | false | 2.354961 | 0.08001 | 8.000993 | 0.455736 | 23.506619 | 0.026435 | 2.643505 | 0.280201 | 4.026846 | 0.348167 | 4.954167 | 0.311253 | 23.472592 | false | 2024-08-10 | | 0 | Removed |
EpistemeAI_Fireball-12B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-12B | e2ed12c3244f2502321fb20e76dfc72ad7817d6e | 15.509355 | apache-2.0 | 1 | 12 | true | true | true | false | false | 1.618521 | 0.18335 | 18.335018 | 0.511089 | 30.666712 | 0.039275 | 3.927492 | 0.261745 | 1.565996 | 0.423635 | 12.521094 | 0.334358 | 26.03982 | false | 2024-08-20 | 2024-08-21 | 2 | Removed |
EpistemeAI_Fireball-12B-v1.13a-philosophers_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-12B-v1.13a-philosophers" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-12B-v1.13a-philosophers</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-12B-v1.13a-philosophers-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-12B-v1.13a-philosophers | 7fa824d4a40abca3f8c75d432ea151dc0d1d67d6 | 14.440865 | apache-2.0 | 2 | 12 | true | true | true | false | false | 1.662663 | 0.087553 | 8.755325 | 0.51027 | 30.336233 | 0.044562 | 4.456193 | 0.301174 | 6.823266 | 0.408073 | 9.975781 | 0.336686 | 26.298389 | false | 2024-08-28 | 2024-09-03 | 1 | Removed |
EpistemeAI_Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Alpaca-Llama-3.1-8B-Philos-DPO-200 | 27d67626304954db71f21fec9e7fc516421274ec | 21.066974 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.922381 | 0.457724 | 45.772439 | 0.48384 | 26.377774 | 0.119335 | 11.933535 | 0.300336 | 6.711409 | 0.394458 | 6.907292 | 0.358295 | 28.699394 | false | 2024-09-16 | 2024-09-16 | 3 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-KTO-beta | 2851384717556dd6ac14c00ed87aac1f267eb263 | 25.179287 | apache-2.0 | 0 | 8 | true | true | true | false | true | 0.885645 | 0.727401 | 72.740107 | 0.486489 | 26.897964 | 0.148792 | 14.879154 | 0.280201 | 4.026846 | 0.361938 | 4.275521 | 0.354305 | 28.256132 | false | 2024-09-12 | 2024-09-14 | 4 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R2 | b19336101aa5f4807d1574f4c11eebc1c1a1c34e | 22.537889 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.811743 | 0.467316 | 46.731561 | 0.493203 | 28.247009 | 0.123112 | 12.311178 | 0.286074 | 4.809843 | 0.462365 | 16.995573 | 0.335189 | 26.132166 | false | 2024-09-14 | 2024-09-14 | 2 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K | b4a88fb5fb27fc5d8a503303cdb7aaeff373fd92 | 20.627168 | apache-2.0 | 3 | 8 | true | true | true | false | false | 0.814786 | 0.445734 | 44.573399 | 0.489732 | 28.025161 | 0.120846 | 12.084592 | 0.294463 | 5.928412 | 0.376229 | 4.895312 | 0.354305 | 28.256132 | false | 2024-09-26 | 2024-10-05 | 1 | Removed |
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code | 8e8f1569a8a01ed3d6588f2669c730d4993355b5 | 23.89695 | apache-2.0 | 2 | 8 | true | true | true | false | false | 0.854318 | 0.597533 | 59.753343 | 0.490419 | 28.171888 | 0.13142 | 13.141994 | 0.302013 | 6.935123 | 0.401031 | 8.46224 | 0.342254 | 26.91711 | false | 2024-10-04 | 2024-10-05 | 2 | Removed |
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds | 8b73dd02349f0544c48c581cc73ada5cac6ff946 | 22.993108 | llama3.1 | 2 | 8 | true | true | true | false | true | 1.716734 | 0.669099 | 66.90991 | 0.466807 | 24.462654 | 0.124622 | 12.462236 | 0.272651 | 3.020134 | 0.341781 | 4.55599 | 0.33893 | 26.547725 | false | 2024-10-14 | 2024-10-15 | 4 | Removed |
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto | f18598c62a783bcc0d436a35df0c8a335e8ee5d7 | 23.083598 | apache-2.0 | 5 | 8 | true | true | true | false | true | 1.526689 | 0.705894 | 70.58937 | 0.464178 | 23.750555 | 0.117069 | 11.706949 | 0.264262 | 1.901566 | 0.342094 | 3.728385 | 0.341423 | 26.824764 | false | 2024-10-21 | 2024-10-29 | 1 | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-ds-auto (Merge) |
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.004-128K-code-COT | bb90c19dc7c4a509e7bd73f4620dca818b58be25 | 20.832251 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.839037 | 0.457824 | 45.782413 | 0.476052 | 25.820865 | 0.136707 | 13.670695 | 0.293624 | 5.816555 | 0.388135 | 6.45026 | 0.347074 | 27.452719 | false | 2024-10-11 | 2024-10-11 | 3 | Removed |
EpistemeAI_Fireball-Meta-Llama-3.1-8B-Instruct-Math_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.1-8B-Instruct-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Meta-Llama-3.1-8B-Instruct-Math | 677c97b4f92bfc330d4fae628e9a1df1ef606dcc | 20.545341 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.910272 | 0.462296 | 46.22956 | 0.498295 | 28.959344 | 0.107251 | 10.725076 | 0.291107 | 5.480984 | 0.364073 | 5.975781 | 0.333112 | 25.9013 | false | 2024-09-23 | 2024-09-23 | 2 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI_Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Meta-Llama-3.2-8B-Instruct-agent-003-128k-code-DPO | b3c0fce7daa359cd8ed5be6595dd1a76ca2cfea2 | 21.205445 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.833576 | 0.461097 | 46.109656 | 0.480101 | 26.317878 | 0.120091 | 12.009063 | 0.300336 | 6.711409 | 0.399823 | 8.077865 | 0.352061 | 28.006797 | false | 2024-10-08 | 2024-10-09 | 3 | Removed |
EpistemeAI_Fireball-Mistral-Nemo-Base-2407-v1-DPO2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Fireball-Mistral-Nemo-Base-2407-v1-DPO2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Fireball-Mistral-Nemo-Base-2407-v1-DPO2 | 2cf732fbffefdf37341b946edd7995f14d3f9487 | 15.2764 | apache-2.0 | 0 | 12 | true | true | true | false | false | 1.771269 | 0.186073 | 18.607295 | 0.496777 | 28.567825 | 0.032477 | 3.247734 | 0.291946 | 5.592841 | 0.40401 | 9.501302 | 0.335273 | 26.141401 | false | 2024-08-19 | 2024-08-19 | 1 | Removed |
EpistemeAI_Llama-3.2-3B-Agent007-Coder_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Llama-3.2-3B-Agent007-Coder" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Llama-3.2-3B-Agent007-Coder</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Llama-3.2-3B-Agent007-Coder-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Llama-3.2-3B-Agent007-Coder | 7ff4e77796b6d308e96d0150e1a01081c0b82e01 | 18.901974 | apache-2.0 | 0 | 3 | true | true | true | false | false | 0.710816 | 0.539956 | 53.995621 | 0.430376 | 19.025809 | 0.110272 | 11.02719 | 0.25755 | 1.006711 | 0.366802 | 7.783594 | 0.285156 | 20.572917 | false | 2024-10-08 | 2024-10-08 | 2 | meta-llama/Llama-3.2-3B-Instruct |
EpistemeAI_Mistral-Nemo-Instruct-12B-Philosophy-Math_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI__Mistral-Nemo-Instruct-12B-Philosophy-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI/Mistral-Nemo-Instruct-12B-Philosophy-Math | 1ac4205f8da109326b4a5cf173e5491a20087d76 | 16.566232 | apache-2.0 | 0 | 12 | true | true | true | false | false | 1.363607 | 0.069468 | 6.94679 | 0.536493 | 33.835811 | 0.093656 | 9.365559 | 0.331376 | 10.850112 | 0.429219 | 12.885677 | 0.329621 | 25.513446 | false | 2024-09-15 | 2024-09-26 | 1 | unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit |
EpistemeAI2_Athene-codegemma-2-7b-it-alpaca-v1.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GemmaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Athene-codegemma-2-7b-it-alpaca-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Athene-codegemma-2-7b-it-alpaca-v1.2 | 21b31062334a316b50680e8c3a141a72e4c30b61 | 15.693215 | apache-2.0 | 0 | 7 | true | true | true | false | false | 0.969635 | 0.435118 | 43.511771 | 0.417542 | 18.97137 | 0.040785 | 4.07855 | 0.270973 | 2.796421 | 0.416969 | 10.38776 | 0.229721 | 14.413416 | false | 2024-08-26 | 2024-08-26 | 2 | Removed |
EpistemeAI2_Fireball-12B-v1.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-12B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-12B-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-12B-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-12B-v1.2 | 57af42edf8232189ee99e9a21e33a0c306e3f561 | 15.162522 | apache-2.0 | 1 | 12 | true | true | true | false | false | 1.872565 | 0.135539 | 13.553926 | 0.501858 | 29.776014 | 0.039275 | 3.927492 | 0.298658 | 6.487696 | 0.417313 | 11.264062 | 0.333693 | 25.965943 | false | 2024-08-27 | 2024-08-28 | 1 | Removed |
EpistemeAI2_Fireball-Alpaca-Llama3.1-8B-Philos_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Alpaca-Llama3.1-8B-Philos | 3dcca4cf9bdd9003c8dc91f5c78cefef1d4ae0d7 | 22.539085 | apache-2.0 | 1 | 8 | true | true | true | false | false | 0.848332 | 0.49864 | 49.864027 | 0.497758 | 29.259226 | 0.117825 | 11.782477 | 0.292785 | 5.704698 | 0.427667 | 11.891667 | 0.340592 | 26.732417 | false | 2024-08-29 | 2024-08-29 | 2 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI2_Fireball-Alpaca-Llama3.1.01-8B-Philos_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.01-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Alpaca-Llama3.1.01-8B-Philos | f97293ed5cec7fb9482b16600259967c6c923e4b | 21.567144 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.870572 | 0.421179 | 42.117914 | 0.495611 | 28.628475 | 0.135952 | 13.595166 | 0.288591 | 5.145414 | 0.437062 | 13.432813 | 0.338348 | 26.483082 | false | 2024-09-03 | 2024-09-03 | 2 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI2_Fireball-Alpaca-Llama3.1.03-8B-Philos_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.03-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Alpaca-Llama3.1.03-8B-Philos | 6e60f783f80f7d126b8e4f2b417e14dea63d2c4f | 20.29975 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.797523 | 0.388081 | 38.80814 | 0.495087 | 27.992549 | 0.129909 | 12.990937 | 0.278523 | 3.803132 | 0.42801 | 12.034635 | 0.335522 | 26.169105 | false | 2024-09-04 | 2024-09-04 | 2 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI2_Fireball-Alpaca-Llama3.1.04-8B-Philos_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.04-8B-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Alpaca-Llama3.1.04-8B-Philos | efd0c251373e1a2fa2bc8cead502c03ff6dc7c8b | 21.031577 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.765248 | 0.40844 | 40.843961 | 0.493001 | 27.963798 | 0.116314 | 11.63142 | 0.290268 | 5.369128 | 0.437219 | 13.685677 | 0.340259 | 26.695479 | false | 2024-09-05 | 2024-09-05 | 2 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI2_Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Alpaca-Llama3.1.06-8B-Philos-dpo | 3e76f190b505b515479cc25e92f8229c2b05159f | 21.829867 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.934774 | 0.486576 | 48.657562 | 0.488077 | 27.207177 | 0.128399 | 12.839879 | 0.297819 | 6.375839 | 0.393188 | 6.848437 | 0.361453 | 29.05031 | false | 2024-09-09 | 2024-09-09 | 4 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI2_Fireball-Alpaca-Llama3.1.07-8B-Philos-Math_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.07-8B-Philos-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Alpaca-Llama3.1.07-8B-Philos-Math | 0b2842bddfa6c308f67eb5a20daf04536a4e6d1a | 21.870165 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.90203 | 0.507908 | 50.790791 | 0.484702 | 26.901201 | 0.114048 | 11.404834 | 0.296141 | 6.152125 | 0.406302 | 7.854427 | 0.353059 | 28.117612 | false | 2024-09-10 | 2024-09-10 | 3 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-C-R1-KTO-Reflection | dc900138b4406353b7e84251bc8649d70c16f13f | 20.882037 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.883974 | 0.395226 | 39.522578 | 0.495531 | 27.571611 | 0.123867 | 12.386707 | 0.299497 | 6.599553 | 0.404813 | 10.401563 | 0.359292 | 28.81021 | false | 2024-09-16 | 2024-09-16 | 5 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI2_Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Alpaca-Llama3.1.08-8B-Philos-C-R1 | c57c786426123635baf6c8b4d30638d2053f4565 | 22.410483 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.909759 | 0.531638 | 53.163828 | 0.482793 | 26.763685 | 0.117825 | 11.782477 | 0.29698 | 6.263982 | 0.410302 | 8.454427 | 0.352311 | 28.034501 | false | 2024-09-13 | 2024-09-13 | 3 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI2_Fireball-Llama-3.1-8B-Philos-Reflection_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Llama-3.1-8B-Philos-Reflection-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Llama-3.1-8B-Philos-Reflection | 4b0b75d9235886e8a947c45b94f87c5a65a81467 | 20.389309 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.894943 | 0.359605 | 35.960474 | 0.489769 | 27.769796 | 0.129154 | 12.915408 | 0.307886 | 7.718121 | 0.395729 | 9.632813 | 0.355053 | 28.339243 | false | 2024-09-17 | 2024-09-17 | 4 | unsloth/Meta-Llama-3.1-8B |
EpistemeAI2_Fireball-MathMistral-Nemo-Base-2407-v2dpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-MathMistral-Nemo-Base-2407-v2dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-MathMistral-Nemo-Base-2407-v2dpo | 6b7d851c66359f39d16da6fbcf810b816dc6e4bc | 11.332218 | apache-2.0 | 1 | 11 | true | true | true | false | true | 1.881426 | 0.30972 | 30.972043 | 0.432764 | 21.145528 | 0.034743 | 3.47432 | 0.263423 | 1.789709 | 0.402958 | 8.969792 | 0.114777 | 1.641918 | false | 2024-08-21 | 2024-08-24 | 2 | unsloth/Mistral-Nemo-Base-2407-bnb-4bit |
EpistemeAI2_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.003-128K-code-math | aa21037cf0984cb293facb69c41895e7fccb1340 | 22.677605 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.791683 | 0.551547 | 55.154656 | 0.480756 | 26.743767 | 0.132175 | 13.217523 | 0.30453 | 7.270694 | 0.36925 | 6.789583 | 0.342005 | 26.889406 | false | 2024-10-11 | 2024-10-12 | 3 | Removed |
EpistemeAI2_Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Meta-Llama-3.1-8B-Instruct-Agent-0.005-128K-code-COT | cf8b99d4aa00c18fdaebfb24fa3c674ee6defa1a | 20.999994 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.800818 | 0.46332 | 46.331955 | 0.479083 | 26.400992 | 0.114804 | 11.480363 | 0.312081 | 8.277405 | 0.377438 | 5.013021 | 0.356466 | 28.496232 | false | 2024-10-11 | 2024-10-11 | 3 | Removed |
EpistemeAI2_Fireball-Phi-3-medium-4k-inst-Philos_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EpistemeAI2__Fireball-Phi-3-medium-4k-inst-Philos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | EpistemeAI2/Fireball-Phi-3-medium-4k-inst-Philos | 147715051102034fac98091e2a0cae6cade15ae0 | 29.172842 | apache-2.0 | 0 | 13 | true | true | true | false | true | 0.771814 | 0.531288 | 53.128809 | 0.617784 | 46.208873 | 0.140483 | 14.048338 | 0.332215 | 10.961969 | 0.413906 | 10.704948 | 0.459857 | 39.984116 | false | 2024-09-19 | 2024-09-20 | 1 | unsloth/phi-3-medium-4k-instruct-bnb-4bit |
Eric111_CatunaMayo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Eric111/CatunaMayo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eric111/CatunaMayo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Eric111/CatunaMayo | 23337893381293975cbcc35f75b634954fbcefaf | 21.299155 | apache-2.0 | 0 | 7 | true | false | true | false | false | 0.550825 | 0.407416 | 40.741566 | 0.524364 | 33.299426 | 0.086103 | 8.610272 | 0.291946 | 5.592841 | 0.45399 | 15.348698 | 0.317819 | 24.202128 | false | 2024-02-15 | 2024-07-03 | 0 | Eric111/CatunaMayo |
Eric111_CatunaMayo-DPO_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Eric111/CatunaMayo-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eric111/CatunaMayo-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eric111__CatunaMayo-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Eric111/CatunaMayo-DPO | 6bdbe06c10d57d152dd8a79a71edd8e30135b689 | 21.255121 | apache-2.0 | 0 | 7 | true | false | true | false | false | 0.554023 | 0.421454 | 42.145396 | 0.522399 | 33.089952 | 0.079305 | 7.930514 | 0.291946 | 5.592841 | 0.445031 | 14.66224 | 0.316988 | 24.109781 | false | 2024-02-21 | 2024-06-27 | 0 | Eric111/CatunaMayo-DPO |
Etherll_Chocolatine-3B-Instruct-DPO-Revised-Ties_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Chocolatine-3B-Instruct-DPO-Revised-Ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties | 8a9c3d745e0805e769b544622b3f5c039abc9b07 | 24.402767 | 0 | 3 | false | true | true | false | false | 0.635497 | 0.372469 | 37.246949 | 0.541065 | 35.583343 | 0.128399 | 12.839879 | 0.323826 | 9.8434 | 0.464938 | 17.817187 | 0.397773 | 33.085845 | false | 2024-10-28 | 0 | Removed |
Etherll_Chocolatine-3B-Instruct-DPO-Revised-Ties-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Chocolatine-3B-Instruct-DPO-Revised-Ties-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Chocolatine-3B-Instruct-DPO-Revised-Ties-v2 | 121b0831361743558e1a56fd89ae3d3c03272cc4 | 24.428163 | 0 | 3 | false | true | true | false | false | 0.631296 | 0.373993 | 37.399323 | 0.541065 | 35.583343 | 0.128399 | 12.839879 | 0.323826 | 9.8434 | 0.464938 | 17.817187 | 0.397773 | 33.085845 | false | 2024-10-29 | 0 | Removed |
Etherll_Herplete-LLM-Llama-3.1-8b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Herplete-LLM-Llama-3.1-8b | b3829cf437216f099c031a9ab5e4c8ec974766dd | 19.588708 | 5 | 8 | false | true | true | false | true | 0.973685 | 0.467191 | 46.71915 | 0.501343 | 28.952591 | 0.027946 | 2.794562 | 0.286074 | 4.809843 | 0.386 | 6.683333 | 0.348155 | 27.572769 | false | 2024-08-24 | 2024-08-29 | 1 | Etherll/Herplete-LLM-Llama-3.1-8b (Merge) |
Etherll_Herplete-LLM-Llama-3.1-8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Herplete-LLM-Llama-3.1-8b | d1383d993fad005d515be4d815797019601c679f | 26.260139 | 5 | 8 | false | true | true | false | false | 0.854807 | 0.610598 | 61.059766 | 0.534725 | 33.206608 | 0.154834 | 15.483384 | 0.314597 | 8.612975 | 0.399052 | 8.614844 | 0.375249 | 30.583259 | false | 2024-08-24 | 2024-10-18 | 1 | Etherll/Herplete-LLM-Llama-3.1-8b (Merge) |
Etherll_Herplete-LLM-Llama-3.1-8b-Ties_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Herplete-LLM-Llama-3.1-8b-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Herplete-LLM-Llama-3.1-8b-Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Herplete-LLM-Llama-3.1-8b-Ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Herplete-LLM-Llama-3.1-8b-Ties | 26.571056 | 0 | 8 | false | true | true | false | false | 0.862201 | 0.616368 | 61.63679 | 0.533798 | 33.07089 | 0.162387 | 16.238671 | 0.317114 | 8.948546 | 0.401719 | 8.948177 | 0.375249 | 30.583259 | false | 2024-10-03 | 2024-10-17 | 1 | Etherll/Herplete-LLM-Llama-3.1-8b-Ties (Merge) |
Etherll_Qwen2.5-7B-della-test_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Qwen2.5-7B-della-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Qwen2.5-7B-della-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Qwen2.5-7B-della-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Qwen2.5-7B-della-test | c2b2ffc38627e68e7b43a1b596dc16ee93c1c63b | 31.829957 | 0 | 7 | false | true | true | false | false | 0.686843 | 0.629532 | 62.953186 | 0.557457 | 36.850547 | 0.305891 | 30.589124 | 0.317114 | 8.948546 | 0.435667 | 12.891667 | 0.44872 | 38.746676 | false | 2024-11-01 | 2024-11-03 | 1 | Etherll/Qwen2.5-7B-della-test (Merge) |
Etherll_Qwen2.5-Coder-7B-Instruct-Ties_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Qwen2.5-Coder-7B-Instruct-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Qwen2.5-Coder-7B-Instruct-Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Qwen2.5-Coder-7B-Instruct-Ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Qwen2.5-Coder-7B-Instruct-Ties | d8c1624a2fa60f05030e34a128af391b5d8be332 | 24.474445 | 0 | 7 | false | true | true | false | false | 1.197181 | 0.500539 | 50.053857 | 0.489514 | 28.008294 | 0.169184 | 16.918429 | 0.329698 | 10.626398 | 0.437281 | 13.426823 | 0.350316 | 27.812869 | false | 2024-09-30 | 2024-10-28 | 1 | Etherll/Qwen2.5-Coder-7B-Instruct-Ties (Merge) |
Etherll_Replete-LLM-V3-Llama-3.1-8b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/Replete-LLM-V3-Llama-3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/Replete-LLM-V3-Llama-3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__Replete-LLM-V3-Llama-3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/Replete-LLM-V3-Llama-3.1-8b | e79849d72f70ef74677ed81a8885403973b2470c | 17.927882 | 5 | 8 | false | true | true | false | true | 0.789329 | 0.526292 | 52.629246 | 0.454338 | 22.902455 | 0.000755 | 0.075529 | 0.268456 | 2.46085 | 0.351646 | 2.055729 | 0.346991 | 27.443484 | false | 2024-08-24 | 2024-08-26 | 1 | Etherll/Replete-LLM-V3-Llama-3.1-8b (Merge) |
Etherll_SuperHermes_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Etherll/SuperHermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Etherll/SuperHermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Etherll__SuperHermes-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Etherll/SuperHermes | 7edd56cb37722d09b0334826e0532b223d334939 | 26.604602 | 1 | 8 | false | true | true | false | false | 0.750015 | 0.545902 | 54.590154 | 0.528953 | 32.840317 | 0.146526 | 14.652568 | 0.323826 | 9.8434 | 0.440042 | 14.938542 | 0.394864 | 32.762633 | false | 2024-10-27 | 2024-10-27 | 1 | Etherll/SuperHermes (Merge) |
Eurdem_Defne-llama3.1-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Eurdem/Defne-llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Eurdem/Defne-llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Eurdem__Defne-llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Eurdem/Defne-llama3.1-8B | 7832ba3066636bf4dab3e7d658c0b3ded12491ae | 25.095429 | llama3.1 | 2 | 8 | true | true | true | false | false | 1.7203 | 0.503612 | 50.361153 | 0.532098 | 32.822381 | 0.15861 | 15.861027 | 0.296141 | 6.152125 | 0.433094 | 13.536719 | 0.386553 | 31.83917 | false | 2024-07-29 | 2024-08-14 | 0 | Eurdem/Defne-llama3.1-8B |
FallenMerick_Chewy-Lemon-Cookie-11B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/FallenMerick/Chewy-Lemon-Cookie-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FallenMerick/Chewy-Lemon-Cookie-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FallenMerick__Chewy-Lemon-Cookie-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FallenMerick/Chewy-Lemon-Cookie-11B | 0f5d0d6d218b3ef034f58eba32d6fe7ac4c237ae | 22.018549 | cc-by-4.0 | 0 | 10 | true | false | true | false | false | 0.857274 | 0.487524 | 48.752421 | 0.525112 | 33.0143 | 0.05287 | 5.287009 | 0.279362 | 3.914989 | 0.454552 | 15.952344 | 0.326712 | 25.190233 | false | 2024-06-06 | 2024-06-27 | 1 | FallenMerick/Chewy-Lemon-Cookie-11B (Merge) |
Felladrin_Llama-160M-Chat-v1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Felladrin/Llama-160M-Chat-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Llama-160M-Chat-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Llama-160M-Chat-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Felladrin/Llama-160M-Chat-v1 | e7f50665676821867ee7dfad32d0ca9fb68fc6bc | 4.101061 | apache-2.0 | 15 | 0 | true | true | true | false | true | 0.181581 | 0.157546 | 15.754642 | 0.303608 | 3.166756 | 0 | 0 | 0.25755 | 1.006711 | 0.366125 | 3.165625 | 0.113614 | 1.512633 | false | 2023-12-20 | 2024-07-23 | 1 | JackFram/llama-160m |
Felladrin_Minueza-32M-UltraChat_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Felladrin/Minueza-32M-UltraChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Minueza-32M-UltraChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Minueza-32M-UltraChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Felladrin/Minueza-32M-UltraChat | 28506b99c5902d2215eb378ec91d4226a7396c49 | 3.848727 | apache-2.0 | 4 | 0 | true | true | true | false | true | 0.168067 | 0.137563 | 13.756278 | 0.294148 | 2.43729 | 0 | 0 | 0.255872 | 0.782998 | 0.374187 | 4.640104 | 0.113281 | 1.475694 | false | 2024-02-27 | 2024-07-23 | 1 | Felladrin/Minueza-32M-Base |
FuJhen_ft-openhermes-25-mistral-7b-irca-dpo-pairs_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | <a target="_blank" href="https://huggingface.co/FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__ft-openhermes-25-mistral-7b-irca-dpo-pairs-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs | 24c0bea14d53e6f67f1fbe2eca5bfe7cae389b33 | 19.615525 | apache-2.0 | 0 | 14 | true | true | true | false | true | 1.002048 | 0.542004 | 54.20041 | 0.477303 | 26.596861 | 0.001511 | 0.151057 | 0.278523 | 3.803132 | 0.417375 | 11.205208 | 0.295628 | 21.73648 | false | 2024-09-12 | 2024-09-12 | 1 | FuJhen/ft-openhermes-25-mistral-7b-irca-dpo-pairs (Merge) |
FuJhen_mistral-instruct-7B-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | <a target="_blank" href="https://huggingface.co/FuJhen/mistral-instruct-7B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral-instruct-7B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral-instruct-7B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FuJhen/mistral-instruct-7B-DPO | e0bc86c23ce5aae1db576c8cca6f06f1f73af2db | 19.016943 | apache-2.0 | 0 | 14 | true | true | true | false | true | 1.009647 | 0.496842 | 49.684171 | 0.462391 | 24.925827 | 0.037764 | 3.776435 | 0.277685 | 3.691275 | 0.401563 | 9.428646 | 0.303358 | 22.595301 | false | 2024-09-12 | 2024-09-12 | 1 | FuJhen/mistral-instruct-7B-DPO (Merge) |
FuJhen_mistral_7b_v0.1_structedData_e2e_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_e2e" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_e2e</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_e2e-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FuJhen/mistral_7b_v0.1_structedData_e2e | 7231864981174d9bee8c7687c24c8344414eae6b | 10.871547 | apache-2.0 | 0 | 7 | true | true | true | false | false | 1.080246 | 0.172684 | 17.268403 | 0.411391 | 18.062424 | 0.002266 | 0.226586 | 0.279362 | 3.914989 | 0.372292 | 5.636458 | 0.281084 | 20.12042 | false | 2024-09-13 | 2024-09-13 | 1 | FuJhen/mistral_7b_v0.1_structedData_e2e (Merge) |
FuJhen_mistral_7b_v0.1_structedData_viggo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/FuJhen/mistral_7b_v0.1_structedData_viggo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FuJhen/mistral_7b_v0.1_structedData_viggo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FuJhen__mistral_7b_v0.1_structedData_viggo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | FuJhen/mistral_7b_v0.1_structedData_viggo | 7231864981174d9bee8c7687c24c8344414eae6b | 12.352466 | apache-2.0 | 0 | 14 | true | true | true | false | false | 1.076114 | 0.178329 | 17.832906 | 0.452386 | 23.960172 | 0.023414 | 2.34139 | 0.283557 | 4.474273 | 0.373813 | 3.926563 | 0.294215 | 21.579492 | false | 2024-09-13 | 2024-09-13 | 1 | FuJhen/mistral_7b_v0.1_structedData_viggo (Merge) |
GalrionSoftworks_MN-LooseCannon-12B-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/GalrionSoftworks/MN-LooseCannon-12B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GalrionSoftworks/MN-LooseCannon-12B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GalrionSoftworks__MN-LooseCannon-12B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GalrionSoftworks/MN-LooseCannon-12B-v1 | 21.885253 | 7 | 12 | false | true | true | false | true | 1.52902 | 0.541779 | 54.177915 | 0.512818 | 29.976062 | 0.070997 | 7.099698 | 0.285235 | 4.697987 | 0.413844 | 10.963802 | 0.319564 | 24.396055 | false | 2024-08-09 | 2024-09-05 | 1 | GalrionSoftworks/MN-LooseCannon-12B-v1 (Merge) |
GalrionSoftworks_MagnusIntellectus-12B-v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/GalrionSoftworks/MagnusIntellectus-12B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GalrionSoftworks/MagnusIntellectus-12B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GalrionSoftworks__MagnusIntellectus-12B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GalrionSoftworks/MagnusIntellectus-12B-v1 | fc83cb3eec2f8328448c5fe3cb830fc77983a6b9 | 21.622238 | apache-2.0 | 4 | 12 | true | false | true | false | true | 1.624264 | 0.442137 | 44.213686 | 0.532301 | 33.262254 | 0.055891 | 5.589124 | 0.284396 | 4.58613 | 0.442802 | 15.183594 | 0.342088 | 26.898641 | false | 2024-08-13 | 2024-09-05 | 1 | GalrionSoftworks/MagnusIntellectus-12B-v1 (Merge) |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v1 | eca7edeba61e894597e9940348e8d90817c1ad79 | 15.294146 | apache-2.0 | 4 | 1 | true | true | true | false | true | 0.783381 | 0.476858 | 47.685807 | 0.418601 | 18.306013 | 0.019637 | 1.963746 | 0.243289 | 0 | 0.36749 | 4.002865 | 0.278258 | 19.806442 | false | 2024-09-20 | 2024-09-28 | 1 | Qwen/Qwen2.5-1.5B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v2 | ff4a6eff69adb015dfcfbff7a2d2dc43b34afe89 | 13.665944 | apache-2.0 | 1 | 1 | true | true | true | false | true | 0.719243 | 0.421554 | 42.15537 | 0.404189 | 16.499503 | 0.01284 | 1.283988 | 0.239933 | 0 | 0.376854 | 4.706771 | 0.25615 | 17.35003 | false | 2024-09-28 | 2024-09-28 | 2 | Qwen/Qwen2.5-1.5B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-1.5B-Instruct-abliterated-v3 | 03ffa6f7a6ada9d63d838707c597297f048d409b | 13.540924 | apache-2.0 | 1 | 1 | true | true | true | false | true | 0.706201 | 0.425251 | 42.525056 | 0.405345 | 16.439712 | 0.007553 | 0.755287 | 0.243289 | 0 | 0.370187 | 4.240104 | 0.255568 | 17.285387 | false | 2024-09-28 | 2024-09-28 | 3 | Qwen/Qwen2.5-1.5B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-14B-Instruct-abliterated-v4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-14B-Instruct-abliterated-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-14B-Instruct-abliterated-v4 | 00afd27eef16e835fcb0d8e687435dca3c185bdf | 33.511798 | apache-2.0 | 6 | 14 | true | true | true | false | true | 1.747117 | 0.829167 | 82.916661 | 0.635564 | 48.05227 | 0 | 0 | 0.342282 | 12.304251 | 0.428667 | 13.15 | 0.501828 | 44.647606 | false | 2024-10-21 | 2024-10-23 | 2 | Qwen/Qwen2.5-14B |
Goekdeniz-Guelmez_Josiefied-Qwen2.5-7B-Instruct-abliterated-v2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__Josiefied-Qwen2.5-7B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2 | ecf4024048ea1be2f0840a50080fb79b88aacde9 | 27.763763 | apache-2.0 | 4 | 7 | true | true | true | false | true | 1.201506 | 0.781381 | 78.138118 | 0.530967 | 33.333986 | 0 | 0 | 0.298658 | 6.487696 | 0.435396 | 13.957813 | 0.411985 | 34.664967 | false | 2024-09-20 | 2024-10-08 | 1 | Qwen/Qwen2.5-7B |
Goekdeniz-Guelmez_j.o.s.i.e.v4o-1.5b-dpo-stage1-v1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Goekdeniz-Guelmez/j.o.s.i.e.v4o-1.5b-dpo-stage1-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Goekdeniz-Guelmez/j.o.s.i.e.v4o-1.5b-dpo-stage1-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Goekdeniz-Guelmez__j.o.s.i.e.v4o-1.5b-dpo-stage1-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Goekdeniz-Guelmez/j.o.s.i.e.v4o-1.5b-dpo-stage1-v1 | d5ddad290d83b1ba8a7612a6c1cfad6fb4346fe4 | 13.567474 | apache-2.0 | 1 | 1 | true | true | true | false | true | 0.791153 | 0.418831 | 41.883092 | 0.412421 | 17.748017 | 0.029456 | 2.945619 | 0.250839 | 0.111857 | 0.352854 | 1.440104 | 0.255485 | 17.276152 | false | 2024-10-07 | 2024-10-08 | 2 | Qwen/Qwen2.5-1.5B |
GreenNode_GreenNode-small-9B-it_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/GreenNode/GreenNode-small-9B-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GreenNode/GreenNode-small-9B-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GreenNode__GreenNode-small-9B-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GreenNode/GreenNode-small-9B-it | 1ba4ce8e2267c7fcc820961a9bfc13ab80150866 | 28.286651 | 0 | 9 | false | true | true | false | true | 2.645944 | 0.743613 | 74.36125 | 0.599384 | 41.899926 | 0 | 0 | 0.319631 | 9.284116 | 0.420417 | 11.652083 | 0.392703 | 32.522533 | false | 2024-10-14 | 0 | Removed |
GritLM_GritLM-7B-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/GritLM/GritLM-7B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GritLM/GritLM-7B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GritLM__GritLM-7B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GritLM/GritLM-7B-KTO | b5c48669508c1de18c698460c187f64e90e7df44 | 19.172954 | apache-2.0 | 4 | 7 | true | true | true | false | true | 0.639864 | 0.531013 | 53.101327 | 0.485294 | 27.904318 | 0.023414 | 2.34139 | 0.297819 | 6.375839 | 0.371021 | 6.644271 | 0.268035 | 18.670582 | false | 2024-04-16 | 2024-08-04 | 0 | GritLM/GritLM-7B-KTO |
GritLM_GritLM-8x7B-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/GritLM/GritLM-8x7B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">GritLM/GritLM-8x7B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/GritLM__GritLM-8x7B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | GritLM/GritLM-8x7B-KTO | 938913477064fcc498757c5136d9899bb6e713ed | 25.838485 | apache-2.0 | 3 | 46 | true | true | true | false | true | 4.604463 | 0.571405 | 57.140498 | 0.58203 | 40.826162 | 0.098187 | 9.818731 | 0.296141 | 6.152125 | 0.421656 | 11.673698 | 0.364777 | 29.419696 | false | 2024-04-17 | 2024-08-04 | 0 | GritLM/GritLM-8x7B-KTO |
Gryphe_Pantheon-RP-1.0-8b-Llama-3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.0-8b-Llama-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.0-8b-Llama-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.0-8b-Llama-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-1.0-8b-Llama-3 | 70a6df202c9df9abdc6928bec5a5ab47f2667aee | 16.772417 | apache-2.0 | 46 | 8 | true | true | true | false | true | 0.720836 | 0.393252 | 39.325213 | 0.453908 | 23.631915 | 0.057402 | 5.740181 | 0.276007 | 3.467562 | 0.38324 | 5.504948 | 0.306682 | 22.964687 | false | 2024-05-08 | 2024-06-27 | 1 | meta-llama/Meta-Llama-3-8B |
Gryphe_Pantheon-RP-1.5-12b-Nemo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.5-12b-Nemo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.5-12b-Nemo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.5-12b-Nemo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-1.5-12b-Nemo | 00107381f05f69666772d88a1b11affe77c94a47 | 21.311159 | apache-2.0 | 27 | 12 | true | true | true | false | true | 1.685583 | 0.476308 | 47.630842 | 0.519582 | 31.750144 | 0.048338 | 4.833837 | 0.272651 | 3.020134 | 0.442031 | 15.053906 | 0.330203 | 25.578088 | false | 2024-07-25 | 2024-08-04 | 1 | mistralai/Mistral-Nemo-Base-2407 |
Gryphe_Pantheon-RP-1.6-12b-Nemo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.6-12b-Nemo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.6-12b-Nemo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.6-12b-Nemo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-1.6-12b-Nemo | 60cf38ae0367baf314e3cce748d9a199adfea557 | 20.365189 | apache-2.0 | 11 | 12 | true | true | true | false | true | 1.737253 | 0.448057 | 44.805671 | 0.520401 | 31.687344 | 0.033988 | 3.398792 | 0.277685 | 3.691275 | 0.42876 | 12.928385 | 0.331117 | 25.679669 | false | 2024-08-18 | 2024-08-31 | 1 | mistralai/Mistral-Nemo-Base-2407 |
Gryphe_Pantheon-RP-1.6-12b-Nemo-KTO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-1.6-12b-Nemo-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-1.6-12b-Nemo-KTO | 6cb6d8d9a7352d71f539ab5053987e058c090443 | 21.407541 | apache-2.0 | 4 | 12 | true | true | true | false | true | 1.682026 | 0.463619 | 46.361875 | 0.527698 | 33.0322 | 0.043807 | 4.380665 | 0.295302 | 6.040268 | 0.424792 | 12.165625 | 0.338182 | 26.464613 | false | 2024-08-28 | 2024-08-31 | 1 | mistralai/Mistral-Nemo-Base-2407 |
Gryphe_Pantheon-RP-Pure-1.6.2-22b-Small_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gryphe__Pantheon-RP-Pure-1.6.2-22b-Small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small | d031830dcb3bc5ad9634374db4dd15b3ef6ebe0f | 27.823932 | other | 10 | 22 | true | true | true | false | true | 1.45332 | 0.693104 | 69.31043 | 0.530454 | 31.683163 | 0.183535 | 18.353474 | 0.328859 | 10.514541 | 0.376479 | 4.393229 | 0.394199 | 32.688756 | false | 2024-10-13 | 2024-10-15 | 1 | mistralai/Mistral-Small-Instruct-2409 |
Gunulhona_Gemma-Ko-Merge_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/Gunulhona/Gemma-Ko-Merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gunulhona/Gemma-Ko-Merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gunulhona__Gemma-Ko-Merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gunulhona/Gemma-Ko-Merge | ca6b0eb1405f21db6a7a9cce3b112d21fcfdde97 | 25.935394 | 0 | 10 | false | true | true | false | true | 3.137248 | 0.641572 | 64.157214 | 0.581303 | 38.787197 | 0.001511 | 0.151057 | 0.33557 | 11.409396 | 0.404698 | 9.120573 | 0.387882 | 31.986924 | false | 2024-09-04 | 2024-10-23 | 1 | Gunulhona/Gemma-Ko-Merge (Merge) |
Gunulhona_Gemma-Ko-Merge-PEFT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/Gunulhona/Gemma-Ko-Merge-PEFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gunulhona/Gemma-Ko-Merge-PEFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gunulhona__Gemma-Ko-Merge-PEFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gunulhona/Gemma-Ko-Merge-PEFT | ca6b0eb1405f21db6a7a9cce3b112d21fcfdde97 | 18.169495 | 0 | 20 | false | true | true | false | false | 5.876477 | 0.288039 | 28.803907 | 0.515409 | 30.186273 | 0 | 0 | 0.324664 | 9.955257 | 0.40801 | 8.767969 | 0.381732 | 31.303561 | false | 2024-09-30 | 2024-10-17 | 0 | Gunulhona/Gemma-Ko-Merge-PEFT |
Gunulhona_Gemma-Ko-Merge-PEFT_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | <a target="_blank" href="https://huggingface.co/Gunulhona/Gemma-Ko-Merge-PEFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Gunulhona/Gemma-Ko-Merge-PEFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Gunulhona__Gemma-Ko-Merge-PEFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Gunulhona/Gemma-Ko-Merge-PEFT | ca6b0eb1405f21db6a7a9cce3b112d21fcfdde97 | 18.06624 | 0 | 20 | false | true | true | false | true | 9.394334 | 0.444135 | 44.41349 | 0.486299 | 26.015069 | 0 | 0 | 0.307047 | 7.606264 | 0.398583 | 7.05625 | 0.309757 | 23.306368 | false | 2024-09-30 | 2024-10-23 | 0 | Gunulhona/Gemma-Ko-Merge-PEFT |
HPAI-BSC_Llama3-Aloe-8B-Alpha_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HPAI-BSC/Llama3-Aloe-8B-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HPAI-BSC/Llama3-Aloe-8B-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HPAI-BSC__Llama3-Aloe-8B-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HPAI-BSC/Llama3-Aloe-8B-Alpha | f0bce5c1fee5ea2a6679bb3dc9de8548e7262c9e | 20.104566 | cc-by-nc-4.0 | 52 | 8 | true | true | true | false | true | 0.795245 | 0.508107 | 50.810738 | 0.483085 | 27.145978 | 0.053625 | 5.362538 | 0.294463 | 5.928412 | 0.367271 | 5.875521 | 0.329538 | 25.504211 | false | 2024-04-26 | 2024-10-29 | 0 | HPAI-BSC/Llama3-Aloe-8B-Alpha |
HPAI-BSC_Llama3.1-Aloe-Beta-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HPAI-BSC/Llama3.1-Aloe-Beta-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HPAI-BSC/Llama3.1-Aloe-Beta-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HPAI-BSC__Llama3.1-Aloe-Beta-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HPAI-BSC/Llama3.1-Aloe-Beta-8B | 3f2f0bbfb03cb0a8310efa50659688c1f2c02da0 | 23.754809 | llama3.1 | 6 | 8 | true | true | true | false | true | 1.398697 | 0.725328 | 72.532769 | 0.509276 | 30.369625 | 0.016616 | 1.661631 | 0.268456 | 2.46085 | 0.383458 | 6.832292 | 0.358045 | 28.67169 | false | 2024-10-30 | 2024-11-07 | 0 | HPAI-BSC/Llama3.1-Aloe-Beta-8B |
Hastagaras_Zabuza-8B-Llama-3.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Hastagaras/Zabuza-8B-Llama-3.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Hastagaras/Zabuza-8B-Llama-3.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Hastagaras__Zabuza-8B-Llama-3.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Hastagaras/Zabuza-8B-Llama-3.1 | 57ffa92f229b8308916aae1d64d8f0dc9baa0a34 | 19.711829 | llama3.1 | 0 | 8 | true | false | true | false | true | 0.675287 | 0.626534 | 62.653426 | 0.453892 | 23.220321 | 0.042296 | 4.229607 | 0.264262 | 1.901566 | 0.356792 | 4.898958 | 0.292304 | 21.367095 | false | 2024-11-05 | 2024-11-05 | 1 | Hastagaras/Zabuza-8B-Llama-3.1 (Merge) |
HiroseKoichi_Llama-Salad-4x8B-V3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/HiroseKoichi/Llama-Salad-4x8B-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HiroseKoichi/Llama-Salad-4x8B-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HiroseKoichi__Llama-Salad-4x8B-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HiroseKoichi/Llama-Salad-4x8B-V3 | a343915429779efbd1478f01ba1f7fd9d8d226c0 | 24.93529 | llama3 | 5 | 24 | true | false | false | false | true | 2.137695 | 0.665352 | 66.535238 | 0.524465 | 31.928849 | 0.096677 | 9.667674 | 0.302852 | 7.04698 | 0.374031 | 6.453906 | 0.351812 | 27.979093 | false | 2024-06-17 | 2024-06-26 | 0 | HiroseKoichi/Llama-Salad-4x8B-V3 |
HuggingFaceH4_zephyr-7b-alpha_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceH4/zephyr-7b-alpha | 2ce2d025864af849b3e5029e2ec9d568eeda892d | 18.571864 | mit | 1,098 | 7 | true | true | true | false | true | 0.795675 | 0.519148 | 51.914808 | 0.458786 | 23.955291 | 0.017372 | 1.73716 | 0.297819 | 6.375839 | 0.394958 | 7.503125 | 0.279505 | 19.944962 | true | 2023-10-09 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
HuggingFaceH4_zephyr-7b-beta_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceH4/zephyr-7b-beta | b70e0c9a2d9e14bd1e812d3c398e5f313e93b473 | 17.767061 | mit | 1,600 | 7 | true | true | true | false | true | 0.555023 | 0.495043 | 49.504315 | 0.431582 | 21.487542 | 0.02719 | 2.719033 | 0.290268 | 5.369128 | 0.392542 | 7.734375 | 0.278092 | 19.787973 | true | 2023-10-26 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
HuggingFaceH4_zephyr-7b-gemma-v0.1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | GemmaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-gemma-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-gemma-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-gemma-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceH4/zephyr-7b-gemma-v0.1 | 03b3427d0ed07d2e0f86c0a7e53d82d4beef9540 | 15.929338 | other | 121 | 8 | true | true | true | false | true | 1.481775 | 0.336374 | 33.637415 | 0.462374 | 23.751163 | 0.075529 | 7.55287 | 0.294463 | 5.928412 | 0.373969 | 4.179427 | 0.284741 | 20.526743 | true | 2024-03-01 | 2024-06-12 | 2 | google/gemma-7b |
HuggingFaceH4_zephyr-orpo-141b-A35b-v0.1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-orpo-141b-A35b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1 | a3be084543d278e61b64cd600f28157afc79ffd3 | 34.063023 | apache-2.0 | 260 | 140 | true | true | true | false | true | 42.067786 | 0.651089 | 65.108911 | 0.629044 | 47.503796 | 0.200906 | 20.090634 | 0.378356 | 17.114094 | 0.446521 | 14.715104 | 0.45861 | 39.845597 | true | 2024-04-10 | 2024-06-12 | 1 | mistral-community/Mixtral-8x22B-v0.1 |
HuggingFaceTB_SmolLM-1.7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-1.7B | 673a07602ca1191e5bc2ddda428e2f608a0a14c0 | 5.425399 | apache-2.0 | 159 | 1 | true | true | true | false | false | 0.324307 | 0.236157 | 23.615673 | 0.318052 | 4.411128 | 0.007553 | 0.755287 | 0.241611 | 0 | 0.342094 | 2.128385 | 0.114777 | 1.641918 | false | 2024-07-14 | 2024-07-18 | 0 | HuggingFaceTB/SmolLM-1.7B |
HuggingFaceTB_SmolLM-1.7B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-1.7B-Instruct | 0ad161e59935a9a691dfde2818df8b98786f30a7 | 5.138222 | apache-2.0 | 102 | 1 | true | true | true | false | true | 0.317023 | 0.234783 | 23.47826 | 0.288511 | 2.080374 | 0 | 0 | 0.260067 | 1.342282 | 0.348667 | 2.083333 | 0.116606 | 1.84508 | false | 2024-07-15 | 2024-07-18 | 1 | HuggingFaceTB/SmolLM-1.7B |
HuggingFaceTB_SmolLM-135M_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-135M | eec6e461571fba3e197a57c298f60b75422eae02 | 6.838197 | apache-2.0 | 171 | 0 | true | true | true | false | false | 0.343378 | 0.212476 | 21.247623 | 0.304605 | 3.2854 | 0.006798 | 0.679758 | 0.258389 | 1.118568 | 0.436604 | 13.342188 | 0.112201 | 1.355644 | false | 2024-07-14 | 2024-07-18 | 0 | HuggingFaceTB/SmolLM-135M |
HuggingFaceTB_SmolLM-135M-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-135M-Instruct | 8ca7af58e27777cae460ad8ca3ab9db15f5c160d | 3.564171 | apache-2.0 | 96 | 0 | true | true | true | false | true | 0.467805 | 0.121401 | 12.140122 | 0.301508 | 2.692958 | 0 | 0 | 0.259228 | 1.230425 | 0.363458 | 3.365625 | 0.117603 | 1.955895 | false | 2024-07-15 | 2024-10-12 | 1 | HuggingFaceTB/SmolLM-135M |
HuggingFaceTB_SmolLM-360M_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-360M | 318cc630b73730bfd712e5873063156ffb8936b5 | 6.147596 | apache-2.0 | 58 | 0 | true | true | true | false | false | 0.36526 | 0.213351 | 21.335058 | 0.306452 | 3.284915 | 0.004532 | 0.453172 | 0.267617 | 2.348993 | 0.401781 | 8.089323 | 0.112367 | 1.374113 | false | 2024-07-14 | 2024-07-18 | 0 | HuggingFaceTB/SmolLM-360M |
HuggingFaceTB_SmolLM-360M-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM-360M-Instruct | 8e951de8c220295ea4f85d078c4e320df7137535 | 4.706784 | apache-2.0 | 76 | 0 | true | true | true | false | true | 0.366501 | 0.195165 | 19.516549 | 0.288511 | 2.080374 | 0 | 0 | 0.264262 | 1.901566 | 0.347177 | 2.897135 | 0.116606 | 1.84508 | false | 2024-07-15 | 2024-08-20 | 1 | HuggingFaceTB/SmolLM-360M |
HuggingFaceTB_SmolLM2-1.7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-1.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-1.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-1.7B | 4fa12cab4f5f53670b05125fb9d2873af587d231 | 9.495504 | apache-2.0 | 63 | 1 | true | true | true | false | false | 0.325026 | 0.244 | 24.400036 | 0.345259 | 9.301788 | 0.021148 | 2.114804 | 0.279362 | 3.914989 | 0.348542 | 4.601042 | 0.213763 | 12.640366 | false | 2024-10-30 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-1.7B |
HuggingFaceTB_SmolLM2-1.7B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-1.7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-1.7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-1.7B-Instruct | d1bb90bcfbe0f211109880f4da18da66f229c4f6 | 14.745339 | apache-2.0 | 297 | 1 | true | true | true | false | true | 0.324961 | 0.536784 | 53.678351 | 0.359862 | 10.917989 | 0.041541 | 4.154079 | 0.279362 | 3.914989 | 0.342125 | 4.098958 | 0.205369 | 11.707668 | false | 2024-10-31 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-1.7B-Instruct |
HuggingFaceTB_SmolLM2-135M_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-135M | 28e66ca6931668447a3bac213f23d990ad3b0e2b | 5.557677 | apache-2.0 | 27 | 0 | true | true | true | false | false | 0.333905 | 0.1833 | 18.330031 | 0.304423 | 3.708078 | 0.002266 | 0.226586 | 0.248322 | 0 | 0.411177 | 10.030469 | 0.109458 | 1.050901 | false | 2024-10-31 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-135M |
HuggingFaceTB_SmolLM2-135M-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-135M-Instruct | 5a33ba103645800d7b3790c4448546c1b73efc71 | 6.467365 | apache-2.0 | 51 | 0 | true | true | true | false | true | 0.338376 | 0.288314 | 28.83139 | 0.312432 | 4.720808 | 0.003021 | 0.302115 | 0.235738 | 0 | 0.366219 | 3.677344 | 0.111453 | 1.272533 | false | 2024-10-31 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-135M-Instruct |
HuggingFaceTB_SmolLM2-360M_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HuggingFaceTB/SmolLM2-360M | 3ce05f63c246c44616da500b47b01f082f4d3bcc | 6.100225 | apache-2.0 | 20 | 0 | true | true | true | false | false | 0.386658 | 0.211452 | 21.145228 | 0.323348 | 5.543603 | 0.003021 | 0.302115 | 0.245805 | 0 | 0.395427 | 7.728385 | 0.116938 | 1.882018 | false | 2024-10-31 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-360M |
HumanLLMs_Humanish-LLama3-8B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-LLama3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-LLama3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-LLama3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HumanLLMs/Humanish-LLama3-8B-Instruct | 42f73ada2b7fb16f18a75404d72b7911bf1e65ce | 22.564911 | llama3 | 1 | 8 | true | true | true | false | true | 0.748278 | 0.64979 | 64.979033 | 0.496771 | 28.012477 | 0.095921 | 9.592145 | 0.255872 | 0.782998 | 0.358156 | 2.002865 | 0.37018 | 30.019947 | false | 2024-10-04 | 2024-10-05 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
HumanLLMs_Humanish-Mistral-Nemo-Instruct-2407_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-Mistral-Nemo-Instruct-2407-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407 | 45b80bdce8d447ef494af06751904afcc607eb37 | 23.0069 | apache-2.0 | 3 | 12 | true | true | true | false | true | 1.620283 | 0.545127 | 54.512693 | 0.526178 | 32.709613 | 0.083837 | 8.383686 | 0.287752 | 5.033557 | 0.39676 | 9.395052 | 0.352061 | 28.006797 | false | 2024-10-06 | 2024-10-06 | 2 | mistralai/Mistral-Nemo-Base-2407 |
HumanLLMs_Humanish-Qwen2.5-7B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-Qwen2.5-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-Qwen2.5-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HumanLLMs/Humanish-Qwen2.5-7B-Instruct | 7d2c71d926832d6e257ad2776011494dbac2d151 | 26.665374 | apache-2.0 | 2 | 7 | true | true | true | false | true | 1.193393 | 0.728425 | 72.842502 | 0.536368 | 34.478998 | 0 | 0 | 0.298658 | 6.487696 | 0.398063 | 8.424479 | 0.439827 | 37.75857 | false | 2024-10-05 | 2024-10-05 | 2 | Qwen/Qwen2.5-7B |
IDEA-CCNL_Ziya-LLaMA-13B-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IDEA-CCNL/Ziya-LLaMA-13B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IDEA-CCNL__Ziya-LLaMA-13B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | IDEA-CCNL/Ziya-LLaMA-13B-v1 | 64d931f346e1a49ea3bbca07a83137075bab1c66 | 3.906425 | gpl-3.0 | 272 | 13 | true | true | true | false | false | 1.108257 | 0.169686 | 16.968643 | 0.287703 | 1.463617 | 0 | 0 | 0.249161 | 0 | 0.375052 | 3.88151 | 0.110123 | 1.124778 | true | 2023-05-16 | 2024-06-12 | 0 | IDEA-CCNL/Ziya-LLaMA-13B-v1 |
Infinirc_Infinirc-Llama3-8B-2G-Release-v1.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Infinirc__Infinirc-Llama3-8B-2G-Release-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0 | 9c542d9ec3f86e145ae445c200c6ebe9066e8cd6 | 13.087133 | llama3 | 1 | 8 | true | true | true | false | false | 1.818723 | 0.202434 | 20.243399 | 0.435074 | 20.831165 | 0.012085 | 1.208459 | 0.299497 | 6.599553 | 0.460938 | 16.750521 | 0.216007 | 12.889702 | false | 2024-06-26 | 2024-09-29 | 0 | Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0 |
Intel_neural-chat-7b-v3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Intel/neural-chat-7b-v3 | fc679274dfcd28a8b6087634f71af7ed2a0659c4 | 17.943646 | apache-2.0 | 66 | 7 | true | true | true | false | false | 0.48929 | 0.277797 | 27.779736 | 0.504832 | 30.205692 | 0.021903 | 2.190332 | 0.291946 | 5.592841 | 0.50549 | 23.019531 | 0.269864 | 18.873744 | true | 2023-10-25 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
Intel_neural-chat-7b-v3-1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Intel/neural-chat-7b-v3-1 | c0d379a49c1c0579529d5e6f2e936ddb759552a8 | 21.004986 | apache-2.0 | 545 | 7 | true | true | true | false | false | 0.563692 | 0.46869 | 46.868974 | 0.505157 | 29.739752 | 0.031722 | 3.172205 | 0.290268 | 5.369128 | 0.497896 | 22.236979 | 0.267786 | 18.642878 | true | 2023-11-14 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
Intel_neural-chat-7b-v3-2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Intel/neural-chat-7b-v3-2 | 0d8f77647810d21d935ea90c66d6339b85e65a75 | 21.433647 | apache-2.0 | 56 | 7 | true | true | true | false | false | 0.560441 | 0.49884 | 49.883975 | 0.503223 | 30.237458 | 0.045317 | 4.531722 | 0.290268 | 5.369128 | 0.489521 | 20.056771 | 0.266705 | 18.522828 | true | 2023-11-21 | 2024-06-12 | 0 | Intel/neural-chat-7b-v3-2 |
Intel_neural-chat-7b-v3-3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Intel/neural-chat-7b-v3-3 | bdd31cf498d13782cc7497cba5896996ce429f91 | 19.99112 | apache-2.0 | 75 | 7 | true | true | true | false | false | 0.559524 | 0.476259 | 47.625855 | 0.487662 | 27.753851 | 0.006798 | 0.679758 | 0.28943 | 5.257271 | 0.485958 | 20.578125 | 0.262467 | 18.051862 | true | 2023-12-09 | 2024-06-12 | 2 | mistralai/Mistral-7B-v0.1 |