---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the della_linear merge method, with /workspace/text-generation-webui/models/mistralai_Mistral-Large-Instruct-2407 as the base model.

### Models Merged

The following models were included in the merge:
* /workspace/text-generation-webui/models/FluffyKaeloky_Luminum-v0.1-123B
* /workspace/text-generation-webui/models/anthracite-org_magnum-v2-123b
* /workspace/text-generation-webui/models/migtissera_Tess-3-Mistral-Large-2-123B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: /workspace/text-generation-webui/models/anthracite-org_magnum-v2-123b
    parameters:
      weight: 0.25
      density: 0.9
  - model: /workspace/text-generation-webui/models/FluffyKaeloky_Luminum-v0.1-123B
    parameters:
      weight: 0.25
      density: 0.9
  - model: /workspace/text-generation-webui/models/migtissera_Tess-3-Mistral-Large-2-123B
    parameters:
      weight: 0.3
      density: 0.9
merge_method: della_linear
base_model: /workspace/text-generation-webui/models/mistralai_Mistral-Large-Instruct-2407
parameters:
  epsilon: 0.05
  lambda: 1
  int8_mask: true
dtype: bfloat16
```
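To reproduce a merge like this, the configuration above can be saved to a file and run either with the `mergekit-yaml` CLI or through mergekit's Python API. The sketch below is a minimal, non-authoritative example assuming a recent mergekit version that exposes `MergeConfiguration`, `MergeOptions`, and `run_merge`; the `merge.yml` and `./merged` paths are placeholders, and the source model paths must exist locally as written in the YAML.

```python
# Minimal sketch: running the merge config above via mergekit's Python API.
# Paths here are placeholders; adjust to your environment.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "merge.yml"   # the YAML configuration above, saved to disk
OUTPUT_PATH = "./merged"   # directory where the merged model is written

# Parse and validate the merge configuration (pydantic model).
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the della_linear merge; this streams tensors shard by shard,
# so it does not need to hold all four 123B models in memory at once.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU for tensor math if present
        copy_tokenizer=True,             # copy the base model's tokenizer to the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```

Equivalently, the `mergekit-yaml` command-line tool takes the config path and an output directory and performs the same merge.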