First Version of Mistral_Sunair
This is a merge of pre-trained language models created using mergekit.
This model was merged with the DARE TIES merge method, using unsloth/Mistral-Nemo-Instruct-2407 as the base model.
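As a rough intuition for what DARE TIES does (this is a toy sketch on flat NumPy vectors, not mergekit's actual implementation): DARE randomly drops each fine-tuned model's delta ("task vector") entries, keeping roughly a `density` fraction and rescaling survivors by 1/density so the merge stays unbiased; TIES then elects a majority sign per parameter and averages only the contributions that agree with it. The function name and the weight normalization below are illustrative assumptions.

```python
import numpy as np

def dare_ties_merge(base, task_vectors, densities, weights, seed=0):
    """Toy sketch of DARE-TIES on flat parameter vectors:
    1. DARE: randomly drop entries of each task vector, keeping a
       `density` fraction, and rescale survivors by 1/density.
    2. TIES: elect a majority sign per parameter and keep only the
       contributions that agree with it.
    3. Add the weight-normalized merged task vector to the base."""
    rng = np.random.default_rng(seed)

    sparse = []
    for tv, density in zip(task_vectors, densities):
        mask = rng.random(tv.shape) < density
        sparse.append(np.where(mask, tv / density, 0.0))  # drop-and-rescale

    weighted = [w * s for w, s in zip(weights, sparse)]
    elected_sign = np.sign(sum(weighted))  # majority sign per parameter

    num = np.zeros_like(base)
    den = np.zeros_like(base)
    for w, s in zip(weights, sparse):
        agree = (np.sign(s) == elected_sign) & (s != 0)
        num += np.where(agree, w * s, 0.0)
        den += np.where(agree, w, 0.0)

    # Normalize by the total weight of agreeing contributions
    # (loosely mirrors `normalize: true` in the config above).
    merged_tv = np.where(den > 0, num / np.maximum(den, 1e-12), 0.0)
    return base + merged_tv
```

With `density: 0.5` each delta entry survives about half the time but is doubled when it does, so the expected contribution of each model is unchanged while the sparsified deltas interfere with each other far less.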
The following models were included in the merge:

* nbeerbower/Lyra4-Gutenberg2-12B
* nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
* anthracite-org/magnum-v4-12b
The following YAML configuration was used to produce this model:

```yaml
models:
  - model: unsloth/Mistral-Nemo-Instruct-2407
    # no parameters necessary for base model
  - model: nbeerbower/Lyra4-Gutenberg2-12B
    parameters:
      density: 0.5
      weight: 0.5
  - model: nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
    parameters:
      density: 0.25
      weight: 0.25
  - model: anthracite-org/magnum-v4-12b
    parameters:
      density: 0.25
      weight: 0.25
merge_method: dare_ties
base_model: unsloth/Mistral-Nemo-Instruct-2407
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
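To reproduce the merge, the configuration above can be saved to a file and passed to mergekit's `mergekit-yaml` entry point (the file name and output directory below are arbitrary; requires `pip install mergekit` and enough disk space for four 12B checkpoints):

```shell
# Hypothetical file/output names; --cuda uses the GPU for the merge math.
mergekit-yaml dare_ties_config.yaml ./Mistral_Sunair --cuda
```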