merge

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the SCE merge method, with DreadPoor/Aspire-8B-model_stock as the base.

Models Merged

The following models were included in the merge:

- vicgalle/Configurable-Llama-3.1-8B-Instruct
- khoantap/llama-linear-0.5-1-0.5-merge
- Triangle104/Distilled-DarkPlanet-Allades-8B
- johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9
- DreadPoor/LemonP-8B-Model_Stock

Configuration

The following YAML configuration was used to produce this model:

models:
  - model: vicgalle/Configurable-Llama-3.1-8B-Instruct #High IFEval score 
  - model: khoantap/llama-linear-0.5-1-0.5-merge #high BBH score
  - model: Triangle104/Distilled-DarkPlanet-Allades-8B #high MATH score
# - model: here is where i would put a model with high GPQA, if i had one. they all suck.
  - model: johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9 #high MuSR score
  - model: DreadPoor/LemonP-8B-Model_Stock #high MMLU score, i am shocked at this
merge_method: sce
base_model: DreadPoor/Aspire-8B-model_stock #reference baseline, the output is a modified version of it
parameters:
  select_topk: 0.33 #for each model, the top 33% of differences relative to the baseline are considered. results can be detrimental.
dtype: bfloat16
int8_mask: true
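
To give a feel for what `select_topk` does, here is a simplified, hypothetical sketch of the selection step in NumPy. It only illustrates the "select top fraction of task-vector differences" idea; mergekit's actual SCE implementation additionally computes variance-based fusion weights and erases sign-conflicting elements, which are omitted here. The function names are illustrative, not mergekit APIs.

```python
import numpy as np

def topk_mask(delta, frac=0.33):
    """Zero out all but the largest-magnitude `frac` of entries in a
    task vector (model weights minus base weights)."""
    k = max(1, int(round(frac * delta.size)))
    # Magnitude threshold: the k-th largest absolute difference.
    thresh = np.sort(np.abs(delta).ravel())[-k]
    return np.where(np.abs(delta) >= thresh, delta, 0.0)

def sce_like_merge(base, models, frac=0.33):
    """Toy merge: average each model's top-k filtered delta onto the base.
    (Real SCE also weights deltas by variance and resolves sign conflicts.)"""
    deltas = [topk_mask(m - base, frac) for m in models]
    return base + np.mean(deltas, axis=0)
```

With `frac=0.33`, roughly two thirds of each model's differences are discarded before averaging, which is why aggressive values "can be detrimental": useful small-magnitude changes get zeroed along with noise.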
Model size: 8.03B params (Safetensors, BF16)
