---
base_model:
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/human_sexuality
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/professional_medicine
- tannedbum/L3-Nymeria-v2-8B
- Azazelle/ANJIR-ADAPTER-128
- tannedbum/L3-Nymeria-v2-8B
- Azazelle/Llama-3-8B-Abomination-LORA
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/biology
- Sao10K/L3-8B-Stheno-v3.2
- grimjim/Llama-3-Instruct-abliteration-LoRA-8B
- tannedbum/L3-Nymeria-v2-8B
- BeastGokul/Bio-Medical-MultiModal-Llama-3-8B-Finetuned
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/formal_logic
- tannedbum/L3-Nymeria-v2-8B
- kik41/lora-type-descriptive-llama-3-8b-v2
- tannedbum/L3-Nymeria-v2-8B
- kik41/lora-length-long-llama-3-8b-v2
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/sociology
- tannedbum/L3-Nymeria-v2-8B
- ResplendentAI/Smarts_Llama3
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/anatomy
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/health
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/psychology
- tannedbum/L3-Nymeria-v2-8B
- Azazelle/Nimue-8B
- tannedbum/L3-Nymeria-v2-8B
- surya-narayanan/professional_psychology
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [Sao10K/L3-8B-Stheno-v3.2](https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2) + [grimjim/Llama-3-Instruct-abliteration-LoRA-8B](https://huggingface.co/grimjim/Llama-3-Instruct-abliteration-LoRA-8B) as the base.

### Models Merged

The following models were included in the merge:
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/human_sexuality](https://huggingface.co/surya-narayanan/human_sexuality)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/professional_medicine](https://huggingface.co/surya-narayanan/professional_medicine)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [Azazelle/ANJIR-ADAPTER-128](https://huggingface.co/Azazelle/ANJIR-ADAPTER-128)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [Azazelle/Llama-3-8B-Abomination-LORA](https://huggingface.co/Azazelle/Llama-3-8B-Abomination-LORA)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/biology](https://huggingface.co/surya-narayanan/biology)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [BeastGokul/Bio-Medical-MultiModal-Llama-3-8B-Finetuned](https://huggingface.co/BeastGokul/Bio-Medical-MultiModal-Llama-3-8B-Finetuned)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/formal_logic](https://huggingface.co/surya-narayanan/formal_logic)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [kik41/lora-type-descriptive-llama-3-8b-v2](https://huggingface.co/kik41/lora-type-descriptive-llama-3-8b-v2)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [kik41/lora-length-long-llama-3-8b-v2](https://huggingface.co/kik41/lora-length-long-llama-3-8b-v2)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/sociology](https://huggingface.co/surya-narayanan/sociology)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [ResplendentAI/Smarts_Llama3](https://huggingface.co/ResplendentAI/Smarts_Llama3)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/anatomy](https://huggingface.co/surya-narayanan/anatomy)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/health](https://huggingface.co/surya-narayanan/health)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/psychology](https://huggingface.co/surya-narayanan/psychology)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [Azazelle/Nimue-8B](https://huggingface.co/Azazelle/Nimue-8B)
* [tannedbum/L3-Nymeria-v2-8B](https://huggingface.co/tannedbum/L3-Nymeria-v2-8B) + [surya-narayanan/professional_psychology](https://huggingface.co/surya-narayanan/professional_psychology)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: tannedbum/L3-Nymeria-v2-8B+Azazelle/ANJIR-ADAPTER-128
  - model: tannedbum/L3-Nymeria-v2-8B+Azazelle/Nimue-8B
  - model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/formal_logic
  - model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/sociology
  - model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/health
  - model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/professional_medicine
  - model: tannedbum/L3-Nymeria-v2-8B+BeastGokul/Bio-Medical-MultiModal-Llama-3-8B-Finetuned
  - model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/biology
  - model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/psychology
  - model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/professional_psychology
  - model: tannedbum/L3-Nymeria-v2-8B+ResplendentAI/Smarts_Llama3
  - model: tannedbum/L3-Nymeria-v2-8B+Azazelle/Llama-3-8B-Abomination-LORA
  - model: tannedbum/L3-Nymeria-v2-8B+kik41/lora-type-descriptive-llama-3-8b-v2
  - model: tannedbum/L3-Nymeria-v2-8B+kik41/lora-length-long-llama-3-8b-v2
  - model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/anatomy
  - model: tannedbum/L3-Nymeria-v2-8B+surya-narayanan/human_sexuality
merge_method: model_stock
base_model: Sao10K/L3-8B-Stheno-v3.2+grimjim/Llama-3-Instruct-abliteration-LoRA-8B
dtype: bfloat16
```
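### Usage

Since the card declares `library_name: transformers` and the merge targets a Llama 3 8B architecture in `bfloat16`, the result should load like any other Llama 3 checkpoint. The sketch below is a minimal, illustrative example only; the repository ID `path/to/merged-model` is a placeholder for the mergekit output directory or the published repo, not a name taken from this card.

```python
# Minimal usage sketch (assumes the merged weights are available at a local
# path or Hugging Face repo; "path/to/merged-model" is a placeholder).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/merged-model"  # placeholder, substitute the actual location
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",
)

# Llama 3 Instruct derivatives ship a chat template, so format the prompt with it.
messages = [{"role": "user", "content": "Briefly explain what a model merge is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```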