Qwen2.5-Mavapy-b-7B / mergekit_config.yml
models:
  - model: Locutusque/StockQwen-2.5-7B
    parameters:
      weight: 0.8
      density: 0.5
  - model: fblgit/cybertron-v4-qw7B-MGS
    parameters:
      weight: 1
      density: 0.8
merge_method: della
base_model: mergekit-community/Qwen2.5-Mavapy-7B
parameters:
  epsilon: 0.05
  lambda: 1
dtype: float32
out_dtype: bfloat16
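For scripted inspection, the same DELLA merge settings can be mirrored as a plain Python dict and sanity-checked before a merge run. This is a minimal sketch: the field names follow the YAML above, but the validation rules (density in (0, 1], positive weights, epsilon smaller than every density) are illustrative assumptions of mine, not checks mergekit itself performs.

```python
# The mergekit_config.yml above, mirrored as a Python dict for inspection.
config = {
    "merge_method": "della",
    "base_model": "mergekit-community/Qwen2.5-Mavapy-7B",
    "models": [
        {"model": "Locutusque/StockQwen-2.5-7B",
         "parameters": {"weight": 0.8, "density": 0.5}},
        {"model": "fblgit/cybertron-v4-qw7B-MGS",
         "parameters": {"weight": 1, "density": 0.8}},
    ],
    "parameters": {"epsilon": 0.05, "lambda": 1},
    "dtype": "float32",
    "out_dtype": "bfloat16",
}

def check(cfg):
    """Return a list of problems found in a della-style merge config.

    These thresholds are assumptions for illustration, not mergekit's
    own validation logic.
    """
    problems = []
    for entry in cfg["models"]:
        p = entry["parameters"]
        # density is the fraction of delta parameters kept; keep it in (0, 1]
        if not 0 < p["density"] <= 1:
            problems.append(f"bad density for {entry['model']}")
        if p["weight"] <= 0:
            problems.append(f"non-positive weight for {entry['model']}")
    # epsilon widens per-parameter drop probabilities around each density,
    # so it should stay well below the smallest density
    if not 0 <= cfg["parameters"]["epsilon"] < min(
            e["parameters"]["density"] for e in cfg["models"]):
        problems.append("epsilon too large relative to density")
    return problems

print(check(config))  # → []
```

An empty list means the config passes these (assumed) sanity checks; the actual merge would still be run through mergekit's own tooling.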