Falcon3-MoE-2x7B-Insruct / mergekit_moe_config.yml
base_model: tiiuae/Falcon3-7B-Instruct
gate_mode: random
architecture: mixtral
dtype: bfloat16
experts:
- source_model: tiiuae/Falcon3-7B-Instruct
- source_model: tiiuae/Falcon3-7B-Instruct
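This config builds a Mixtral-style mixture-of-experts from two identical copies of tiiuae/Falcon3-7B-Instruct. Because `gate_mode: random` initializes the router weights randomly rather than deriving them from prompt hidden states, no `positive_prompts` entries are needed under each expert. A minimal usage sketch, assuming mergekit is installed and this file is saved as `mergekit_moe_config.yml` (the output directory name is illustrative):

```shell
# Install mergekit, which provides the mergekit-moe entry point.
pip install mergekit

# Build the 2-expert MoE from the config; the merged weights and
# model config are written to the given output directory.
mergekit-moe mergekit_moe_config.yml ./Falcon3-MoE-2x7B-Insruct
```

Note that the resulting model has two experts with identical weights at merge time; with a random gate, they only differentiate through subsequent fine-tuning.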