Uploaded finetuned model
- Developed by: bralynn
- License: apache-2.0
- Finetuned from model: huihui-ai/Llama-3.2-3B-Instruct-abliterated

This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
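A minimal usage sketch with the `transformers` library (the prompt and generation settings below are illustrative assumptions, not settings shipped with this card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bralynn/omnim"

# Load the finetuned checkpoint from the Hub; device_map="auto" requires accelerate.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Llama 3.2 Instruct checkpoints ship a chat template; apply it to the prompt.
messages = [{"role": "user", "content": "Summarize the MMLU benchmark in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```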
Evaluation settings (lm-evaluation-harness): `hf (pretrained=bralynn/omnim,trust_remote_code=True), gen_kwargs: (None), limit: None, num_fewshot: 0, batch_size: auto (16)`
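The scores below can be reproduced with the harness's Python API; this is a sketch assuming a recent `lm_eval` release (the automatically resolved batch size may differ from the `auto (16)` recorded above):

```python
import lm_eval

# Run zero-shot MMLU against the uploaded checkpoint, mirroring the
# recorded settings (num_fewshot=0, automatic batch sizing).
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=bralynn/omnim,trust_remote_code=True",
    tasks=["mmlu"],
    num_fewshot=0,
    batch_size="auto",
)

# Print the aggregate accuracy for the mmlu group.
print(results["results"]["mmlu"])
```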
| Tasks | Version | Filter | n-shot | Metric | | Value | | Stderr |
|---|---|---|---|---|---|---|---|---|
| mmlu | 2 | none | | acc | ↑ | 0.6000 | ± | 0.0040 |
| - humanities | 2 | none | | acc | ↑ | 0.5768 | ± | 0.0070 |
| - formal_logic | 1 | none | 0 | acc | ↑ | 0.4206 | ± | 0.0442 |
| - high_school_european_history | 1 | none | 0 | acc | ↑ | 0.7273 | ± | 0.0348 |
| - high_school_us_history | 1 | none | 0 | acc | ↑ | 0.7304 | ± | 0.0311 |
| - high_school_world_history | 1 | none | 0 | acc | ↑ | 0.7890 | ± | 0.0266 |
| - international_law | 1 | none | 0 | acc | ↑ | 0.7107 | ± | 0.0414 |
| - jurisprudence | 1 | none | 0 | acc | ↑ | 0.6296 | ± | 0.0467 |
| - logical_fallacies | 1 | none | 0 | acc | ↑ | 0.7239 | ± | 0.0351 |
| - moral_disputes | 1 | none | 0 | acc | ↑ | 0.6532 | ± | 0.0256 |
| - moral_scenarios | 1 | none | 0 | acc | ↑ | 0.5196 | ± | 0.0167 |
| - philosophy | 1 | none | 0 | acc | ↑ | 0.6592 | ± | 0.0269 |
| - prehistory | 1 | none | 0 | acc | ↑ | 0.6636 | ± | 0.0263 |
| - professional_law | 1 | none | 0 | acc | ↑ | 0.4518 | ± | 0.0127 |
| - world_religions | 1 | none | 0 | acc | ↑ | 0.7544 | ± | 0.0330 |
| - other | 2 | none | | acc | ↑ | 0.6662 | ± | 0.0082 |
| - business_ethics | 1 | none | 0 | acc | ↑ | 0.5500 | ± | 0.0500 |
| - clinical_knowledge | 1 | none | 0 | acc | ↑ | 0.6415 | ± | 0.0295 |
| - college_medicine | 1 | none | 0 | acc | ↑ | 0.5954 | ± | 0.0374 |
| - global_facts | 1 | none | 0 | acc | ↑ | 0.4000 | ± | 0.0492 |
| - human_aging | 1 | none | 0 | acc | ↑ | 0.5964 | ± | 0.0329 |
| - management | 1 | none | 0 | acc | ↑ | 0.7670 | ± | 0.0419 |
| - marketing | 1 | none | 0 | acc | ↑ | 0.8462 | ± | 0.0236 |
| - medical_genetics | 1 | none | 0 | acc | ↑ | 0.7000 | ± | 0.0461 |
| - miscellaneous | 1 | none | 0 | acc | ↑ | 0.7586 | ± | 0.0153 |
| - nutrition | 1 | none | 0 | acc | ↑ | 0.6732 | ± | 0.0269 |
| - professional_accounting | 1 | none | 0 | acc | ↑ | 0.4752 | ± | 0.0298 |
| - professional_medicine | 1 | none | 0 | acc | ↑ | 0.7757 | ± | 0.0253 |
| - virology | 1 | none | 0 | acc | ↑ | 0.4639 | ± | 0.0388 |
| - social sciences | 2 | none | | acc | ↑ | 0.6640 | ± | 0.0083 |
| - econometrics | 1 | none | 0 | acc | ↑ | 0.3947 | ± | 0.0460 |
| - high_school_geography | 1 | none | 0 | acc | ↑ | 0.7121 | ± | 0.0323 |
| - high_school_government_and_politics | 1 | none | 0 | acc | ↑ | 0.7668 | ± | 0.0305 |
| - high_school_macroeconomics | 1 | none | 0 | acc | ↑ | 0.5744 | ± | 0.0251 |
| - high_school_microeconomics | 1 | none | 0 | acc | ↑ | 0.6218 | ± | 0.0315 |
| - high_school_psychology | 1 | none | 0 | acc | ↑ | 0.7835 | ± | 0.0177 |
| - human_sexuality | 1 | none | 0 | acc | ↑ | 0.6412 | ± | 0.0421 |
| - professional_psychology | 1 | none | 0 | acc | ↑ | 0.5866 | ± | 0.0199 |
| - public_relations | 1 | none | 0 | acc | ↑ | 0.6455 | ± | 0.0458 |
| - security_studies | 1 | none | 0 | acc | ↑ | 0.6367 | ± | 0.0308 |
| - sociology | 1 | none | 0 | acc | ↑ | 0.7861 | ± | 0.0290 |
| - us_foreign_policy | 1 | none | 0 | acc | ↑ | 0.8200 | ± | 0.0386 |
| - stem | 2 | none | | acc | ↑ | 0.5068 | ± | 0.0086 |
| - abstract_algebra | 1 | none | 0 | acc | ↑ | 0.2700 | ± | 0.0446 |
| - anatomy | 1 | none | 0 | acc | ↑ | 0.6370 | ± | 0.0415 |
| - astronomy | 1 | none | 0 | acc | ↑ | 0.6579 | ± | 0.0386 |
| - college_biology | 1 | none | 0 | acc | ↑ | 0.7222 | ± | 0.0375 |
| - college_chemistry | 1 | none | 0 | acc | ↑ | 0.4100 | ± | 0.0494 |
| - college_computer_science | 1 | none | 0 | acc | ↑ | 0.4300 | ± | 0.0498 |
| - college_mathematics | 1 | none | 0 | acc | ↑ | 0.3000 | ± | 0.0461 |
| - college_physics | 1 | none | 0 | acc | ↑ | 0.3627 | ± | 0.0478 |
| - computer_security | 1 | none | 0 | acc | ↑ | 0.6600 | ± | 0.0476 |
| - conceptual_physics | 1 | none | 0 | acc | ↑ | 0.5064 | ± | 0.0327 |
| - electrical_engineering | 1 | none | 0 | acc | ↑ | 0.5448 | ± | 0.0415 |
| - elementary_mathematics | 1 | none | 0 | acc | ↑ | 0.4233 | ± | 0.0254 |
| - high_school_biology | 1 | none | 0 | acc | ↑ | 0.7194 | ± | 0.0256 |
| - high_school_chemistry | 1 | none | 0 | acc | ↑ | 0.5567 | ± | 0.0350 |
| - high_school_computer_science | 1 | none | 0 | acc | ↑ | 0.5800 | ± | 0.0496 |
| - high_school_mathematics | 1 | none | 0 | acc | ↑ | 0.3630 | ± | 0.0293 |
| - high_school_physics | 1 | none | 0 | acc | ↑ | 0.4238 | ± | 0.0403 |
| - high_school_statistics | 1 | none | 0 | acc | ↑ | 0.4444 | ± | 0.0339 |
| - machine_learning | 1 | none | 0 | acc | ↑ | 0.4821 | ± | 0.0474 |

| Groups | Version | Filter | n-shot | Metric | | Value | | Stderr |
|---|---|---|---|---|---|---|---|---|
| mmlu | 2 | none | | acc | ↑ | 0.6000 | ± | 0.0040 |
| - humanities | 2 | none | | acc | ↑ | 0.5768 | ± | 0.0070 |
| - other | 2 | none | | acc | ↑ | 0.6662 | ± | 0.0082 |
| - social sciences | 2 | none | | acc | ↑ | 0.6640 | ± | 0.0083 |
| - stem | 2 | none | | acc | ↑ | 0.5068 | ± | 0.0086 |