---
base_model:
- KatyTheCutie/LemonadeRP-4.5.3
library_name: transformers
tags:
- mergekit
- merge
---
# Mytho-Lemon-11B
Just a simple 11B frankenmerge of LemonadeRP and MythoMist, which was used in matchaaaaa/Chaifighter-20B-v2.
I didn't have to merge the models like this in Chaifighter, but I already had this lying around from a previous attempt, so I just went with it. It's nothing special, but here it is!
## Merge Details
This is a merge of pre-trained language models created using mergekit.
### Merge Method
This model was merged using the passthrough merge method.
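Roughly speaking, passthrough doesn't blend any weights; it just stacks the selected layer slices from each source model on top of each other to make one deeper network. Here's a quick sketch of the arithmetic, assuming both sources are standard 32-layer Mistral-7B-class models (this is just my illustration, not mergekit code):

```python
# Rough sketch of what the passthrough merge below produces (not mergekit code):
# the chosen layer slices from the two source models are stacked into one deeper model.
lemonade_slice = range(0, 24)    # layers 0-23 of LemonadeRP-4.5.3
mythomist_slice = range(8, 32)   # layers 8-31 of MythoMist-7B
total_layers = len(lemonade_slice) + len(mythomist_slice)
print(total_layers)  # 48 layers instead of 32, which is roughly where the "11B" size comes from
```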
### Models Merged
The following models were included in the merge:

* KatyTheCutie/LemonadeRP-4.5.3
* Gryphe/MythoMist-7B
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
  - model: KatyTheCutie/LemonadeRP-4.5.3
    layer_range: [0, 24]
- sources:
  - model: Gryphe/MythoMist-7B
    layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```
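If you just want to run the merged model, something along these lines should work with transformers. The repo id below is an assumption, so point it at wherever the weights actually live, and `device_map="auto"` needs accelerate installed:

```python
# Minimal sketch for loading and sampling the merged model with transformers.
# The repo id is an assumption; swap in the actual repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "matchaaaaa/Mytho-Lemon-11B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",           # requires accelerate
)

prompt = "Write a short in-character greeting."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

And if you want to reproduce the merge itself, mergekit's `mergekit-yaml` entry point takes a config like the one above plus an output directory.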
Anyway, have a great day!