---
license: apache-2.0
---

# Seraph-7B

This is the model for Seraph-7B. I used [mergekit](https://github.com/arcee-ai/mergekit) to merge the two models listed in the config below; a reproduction sketch follows the config.

## YAML Config

```yaml
slices:
  - sources:
      - model: Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp
        layer_range: [0, 32]
      - model: Q-bert/MetaMath-Cybertron-Starling
        layer_range: [0, 32]
merge_method: slerp
base_model: mistralai/Mistral-7B-v0.1
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
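In this config, `t` is the spherical-interpolation weight between the two source models; the five-element lists define a gradient across the 32 layers for the attention and MLP tensors, and the bare `value: 0.5` is the fallback for all remaining tensors.

The merge can be reproduced with mergekit's Python API (the `mergekit-yaml` CLI is equivalent). This is a minimal sketch, assuming the config above is saved as `seraph.yml` and a mergekit version that exposes `MergeConfiguration` and `run_merge` as shown:

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the slerp config shown above (assumed saved as seraph.yml).
with open("seraph.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the merged weights to ./Seraph-7B.
run_merge(
    merge_config,
    out_path="./Seraph-7B",
    options=MergeOptions(
        cuda=False,           # set True to merge on GPU
        copy_tokenizer=True,  # copy the tokenizer into the output folder
    ),
)
```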
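Once merged, or when using the published checkpoint, the model loads like any other Mistral-architecture causal LM. A minimal sketch with `transformers`, assuming the Hub repo id `Weyaxi/Seraph-7B`:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id assumed from this model card; use a local path if you ran the merge yourself.
model_id = "Weyaxi/Seraph-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```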