
DraftReasoner-2x7B-MoE-v0.1

An experimental 2-expert Mixture-of-Experts (MoE) merge using mlabonne/Marcoro14-7B-slerp as the base model.

Notes

Please evaluate this model before using it in any application pipeline. The math expert is activated by prompts containing words such as 'math', 'reason', 'solve', and 'count'.
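
Below is a minimal usage sketch for loading the merged model with the Hugging Face transformers library. The repository id placeholder, dtype, and generation settings are assumptions and should be adapted to your environment.

```python
# Minimal sketch: load the MoE merge and prompt the math expert.
# Assumes `transformers` and `torch` are installed; the repo id below is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DraftReasoner-2x7B-MoE-v0.1"  # replace with the full hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",
)

# Prompts containing words like "math", "reason", "solve", or "count"
# are intended to route through the math expert.
prompt = "Solve step by step: what is 17 * 24?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```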

Model size: 12.9B params (Safetensors, BF16)