---
base_model:
- cgato/TheSpice-7b-v0.1.1
- ABX-AI/Laymonade-7B
library_name: transformers
tags:
- mergekit
- merge
- not-for-all-audiences
license: other
---

GGUF: https://huggingface.co/ABX-AI/Spicy-Laymonade-7B-GGUF-IQ-Imatrix

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65d936ad52eca001fdcd3245/bMW7mRqBS_xQJBXn-szWS.png)

# Spicy-Laymonade-7B

Well, we have Laymonade, so why not spice it up? This merge is an intermediate step toward creating a new 9B model. That said, I did try it out on its own, and it seems to work quite well.

## Merge Details

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

### Merge Method

This model was merged using the SLERP merge method, with [ABX-AI/Laymonade-7B](https://huggingface.co/ABX-AI/Laymonade-7B) as the base model.

### Models Merged

The following models were included in the merge:
* [cgato/TheSpice-7b-v0.1.1](https://huggingface.co/cgato/TheSpice-7b-v0.1.1)
* [ABX-AI/Laymonade-7B](https://huggingface.co/ABX-AI/Laymonade-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: cgato/TheSpice-7b-v0.1.1
        layer_range: [0, 32]
      - model: ABX-AI/Laymonade-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: ABX-AI/Laymonade-7B
parameters:
  t:
    - filter: self_attn
      value: [0.7, 0.3, 0.6, 0.2, 0.5]
    - filter: mlp
      value: [0.3, 0.7, 0.4, 0.8, 0.5]
    - value: 0.5
dtype: bfloat16
```
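For readers unfamiliar with SLERP (spherical linear interpolation): instead of averaging two weight tensors along a straight line, it interpolates along the arc between them, with `t` controlling how far the result sits from the base model (`t=0` keeps the base, `t=1` takes the other model). As I understand the config above, the five-value lists define a gradient of `t` across the 32 layers, applied separately to the attention and MLP tensors via the `filter` entries. Below is a minimal PyTorch sketch of the core SLERP operation on a single pair of tensors; it is an illustration of the technique, not mergekit's exact implementation:

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats each tensor as one flat vector; t=0 returns v0, t=1 returns v1.
    """
    # Angle between the two tensors, computed on normalized copies
    v0_n = v0 / (v0.norm() + eps)
    v1_n = v1 / (v1.norm() + eps)
    dot = torch.clamp((v0_n * v1_n).sum(), -1.0, 1.0)
    theta = torch.acos(dot)

    # Nearly colinear tensors: fall back to plain linear interpolation
    if theta.abs() < eps:
        return (1.0 - t) * v0 + t * v1

    sin_theta = torch.sin(theta)
    return (torch.sin((1.0 - t) * theta) / sin_theta) * v0 + (
        torch.sin(t * theta) / sin_theta
    ) * v1
```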
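For anyone who wants to try the non-GGUF weights directly, a standard transformers loading snippet would look like the following. The repo id `ABX-AI/Spicy-Laymonade-7B` is assumed from the GGUF link above, and the prompt is just a placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ABX-AI/Spicy-Laymonade-7B"  # assumed repo id, inferred from the GGUF link
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

prompt = "Tell me a story about a spicy glass of lemonade."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```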