CollAIborate4x7B is a Mixture-of-Experts (MoE) model built from a mix of domain-agnostic fine-tuned models derived from the Mistral base model.
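
A minimal usage sketch, assuming the checkpoint loads through the standard `transformers` `AutoModelForCausalLM`/`AutoTokenizer` API as a Mixtral-style MoE in BF16; the prompt and generation settings are illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "collaiborate-tech/CollAIborate4x7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 tensor type listed for this checkpoint
    device_map="auto",           # spread the 24.2B-parameter model across available devices
)

prompt = "Explain mixture-of-experts language models in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```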

Model size: 24.2B params (Safetensors)
Tensor type: BF16
