---
base_model:
- Aleteian/Zmey-Gorynich
- win10/Mistral-Nemo-abliterated-Nemo-Pro-v2
- DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS
- DavidAU/MN-Dark-Planet-TITAN-12B
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [win10/Mistral-Nemo-abliterated-Nemo-Pro-v2](https://huggingface.co/win10/Mistral-Nemo-abliterated-Nemo-Pro-v2) as the base model.

### Models Merged

The following models were included in the merge:
* [Aleteian/Zmey-Gorynich](https://huggingface.co/Aleteian/Zmey-Gorynich)
* [DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS](https://huggingface.co/DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS)
* [DavidAU/MN-Dark-Planet-TITAN-12B](https://huggingface.co/DavidAU/MN-Dark-Planet-TITAN-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: DavidAU/MN-Dark-Planet-TITAN-12B
    parameters:
      density: 1
      weight: 1
  - model: DavidAU/MN-GRAND-Gutenberg-Lyra4-Lyra-12B-DARKNESS
    parameters:
      density: 1
      weight: 1
  - model: Aleteian/Zmey-Gorynich
    parameters:
      density: 1
      weight: 0.5
merge_method: ties
base_model: win10/Mistral-Nemo-abliterated-Nemo-Pro-v2
dtype: float32
chat_template: "chatml"
tokenizer_source: union
```
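To reproduce the merge, the config above can be run either with mergekit's `mergekit-yaml` CLI (`mergekit-yaml config.yml ./merged`) or through its Python API. Below is a minimal sketch of the latter; the `config.yml` and `./merged` paths are placeholders, and the exact `MergeOptions` fields may differ between mergekit versions:

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Placeholder path to a file containing the YAML config shown above.
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Placeholder output directory; the merged weights and tokenizer land here.
run_merge(
    merge_config,
    out_path="./merged",
    options=MergeOptions(cuda=torch.cuda.is_available()),  # merge on GPU if one is available
)
```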
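The merged model is a standard `transformers` checkpoint. A minimal inference sketch follows, assuming the merge output was saved to `./merged` (swap in the Hub repo id once published); because the config sets `chat_template: "chatml"`, the saved tokenizer should carry a ChatML template that `apply_chat_template` can use directly:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "./merged"  # placeholder: local merge output or a Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge ran in float32; bf16 is sufficient for inference
    device_map="auto",
)

# chat_template: "chatml" in the merge config means the tokenizer ships with
# a ChatML template, so apply_chat_template formats the prompt directly.
messages = [{"role": "user", "content": "Write the opening line of a dark fantasy tale."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```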