SmolLM2-360M-Merged
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the TIES merge method, with HuggingFaceTB/SmolLM2-360M as the base.
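For intuition, TIES merging trims each fine-tuned model's task vector (its delta from the base) to the largest-magnitude entries, elects a per-parameter sign, and averages the trimmed deltas that agree with that sign. The snippet below is a minimal, illustrative sketch of that idea for a single tensor, assuming PyTorch; the function name, `density` parameter, and overall structure are mine for illustration and are not mergekit's API, and the actual merge was produced by mergekit, not this code.

```python
import torch

def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor],
               weights: list[float], density: float = 0.5) -> torch.Tensor:
    """Trim, elect sign, and merge task vectors onto a base tensor (sketch)."""
    deltas = []
    for ft, w in zip(finetuned, weights):
        delta = (ft - base) * w                       # task vector, scaled by its weight
        k = max(1, int(density * delta.numel()))      # keep top-k entries by magnitude (trim)
        threshold = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        delta = torch.where(delta.abs() >= threshold, delta, torch.zeros_like(delta))
        deltas.append(delta)
    stacked = torch.stack(deltas)
    # Elect a sign per parameter from the summed trimmed deltas.
    sign = torch.sign(stacked.sum(dim=0))
    agree = (torch.sign(stacked) == sign) & (stacked != 0)
    # Merge: average only the deltas that agree with the elected sign.
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + merged
```

With a single merged model at weight 1, as in the configuration below, the sign election is trivially satisfied and the procedure reduces to adding the trimmed task vector back onto the base weights.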
Models Merged
The following models were included in the merge:
- HuggingFaceTB/SmolLM2-360M-Instruct
Configuration
The following YAML configuration was used to produce this model:
models:
  - model: HuggingFaceTB/SmolLM2-360M-Instruct
    parameters:
      weight: 1
merge_method: ties
base_model: HuggingFaceTB/SmolLM2-360M
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
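The merge can be reproduced by passing this configuration to mergekit's `mergekit-yaml` command; the resulting checkpoint then loads like any other causal language model. Below is a minimal loading sketch with transformers, assuming the model id published on this card and an arbitrary prompt and generation length:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vonjack/SmolLM2-360M-Merged"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# bfloat16 matches the dtype used for the merge itself.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```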
Open LLM Leaderboard Evaluation Results
Detailed results can be found here
| Metric | Value |
|---|---|
| Avg. | 7.13 |
| IFEval (0-Shot) | 32.06 |
| BBH (3-Shot) | 4.74 |
| MATH Lvl 5 (4-Shot) | 0.76 |
| GPQA (0-shot) | 0.78 |
| MuSR (0-shot) | 3.36 |
| MMLU-PRO (5-shot) | 1.09 |
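For a local sanity check against these numbers, the harness behind the Open LLM Leaderboard (lm-evaluation-harness) can be invoked from Python. This is a hedged sketch only: the `leaderboard_*` task names and harness settings are assumptions and are not taken from this card, so they may not match the official evaluation setup exactly.

```python
import lm_eval

# Assumed task names from the lm-evaluation-harness leaderboard task group.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=vonjack/SmolLM2-360M-Merged,dtype=bfloat16",
    tasks=["leaderboard_ifeval", "leaderboard_bbh", "leaderboard_mmlu_pro"],
    batch_size=8,
)
print(results["results"])
```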