HermesX2 / mergekit_moe_config.yml
# mergekit-moe config: build a Mixtral-style MoE from two copies of Hermes-3.
base_model: NousResearch/Hermes-3-Llama-3.1-8B  # supplies the shared non-expert weights (attention, embeddings, norms)
gate_mode: random        # router (gate) weights are randomly initialized rather than derived from prompts
architecture: mixtral    # emit a Mixtral-architecture checkpoint
dtype: bfloat16          # output tensor dtype
experts:                 # each entry contributes one expert's MLP weights
  - source_model: NousResearch/Hermes-3-Llama-3.1-8B
  - source_model: NousResearch/Hermes-3-Llama-3.1-8B
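
The config above produces a Mixtral-architecture checkpoint whose two experts are copies of Hermes-3-Llama-3.1-8B with a randomly initialized router. A minimal sketch of loading the merged output with Hugging Face transformers, assuming the merge has already been run (e.g. with mergekit's mergekit-moe tool) and written to a local ./HermesX2 directory (hypothetical path):

# Sketch: load and sample from the merged checkpoint produced by the config above.
# "./HermesX2" is a hypothetical output directory, not part of the original config.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./HermesX2"  # assumed merge output directory

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches dtype: bfloat16 in the config
    device_map="auto",
)

prompt = "Explain what a mixture-of-experts model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))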