22B AWQ
Collection · 2 items
These models are selected for their compatibility with two small 12 GB GPUs or one medium 24 GB GPU.
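For reference, AWQ checkpoints in this size class can typically be loaded through `transformers` (with `autoawq` installed) and spread across one or two GPUs via `device_map="auto"`. A minimal sketch follows; the repo id used here is a placeholder assumption, not a confirmed artifact name:

```python
# Minimal sketch: loading a 22B AWQ checkpoint across available GPUs.
# Assumes `pip install transformers autoawq` and that an AWQ build of
# the model exists; the repo id below is hypothetical.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vezora/Mistral-22B-v0.1-AWQ"  # hypothetical AWQ repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shards layers across 1x24GB or 2x12GB GPUs
    torch_dtype="auto",
)
```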
This model is not an MoE; it is in fact a 22B-parameter dense model!
Just one day after the release of Mixtral-8x22B, we are excited to introduce our handcrafted experimental model, Mistral-22B-v0.1. It distills knowledge equally from all of the experts into a single dense 22B model. This is not a single extracted expert; rather, it is a compressed MoE model, converted into a dense 22B model. This is the first working MoE-to-dense model conversion.
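The card does not spell out the distillation procedure. Purely as an illustration of the general idea of collapsing expert weights into one dense feed-forward block, here is a toy sketch that uniformly averages the expert FFN matrices; this is an assumption about one simple approach, not the authors' actual method:

```python
import torch

def merge_experts(expert_weights: list[torch.Tensor]) -> torch.Tensor:
    """Toy illustration: collapse N expert FFN weight matrices of the
    same shape into one dense matrix by uniform averaging. NOT the
    authors' documented method, just one simple way to seed a dense
    model from an MoE before further distillation/fine-tuning."""
    return torch.stack(expert_weights).mean(dim=0)

# Example: eight experts with Mixtral-like FFN up-projection shapes.
experts = [torch.randn(14336, 4096) for _ in range(8)]
dense_up_proj = merge_experts(experts)
print(dense_up_proj.shape)  # torch.Size([14336, 4096])
```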
GUANACO PROMPT FORMAT
You must use the Guanaco prompt format shown below. Not using this prompt format will lead to suboptimal results.
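The template itself is cut off in this excerpt. As a minimal sketch, assuming the conventional Guanaco-style `### Human:` / `### Assistant:` turn markers, a prompt can be assembled like this:

```python
def build_guanaco_prompt(user_message: str, system: str | None = None) -> str:
    """Assemble a Guanaco-style prompt. Assumes the conventional
    '### Human:' / '### Assistant:' markers; the exact template on the
    original card may differ (e.g. an added '### System:' turn)."""
    parts = []
    if system:
        parts.append(f"### System: {system}")
    parts.append(f"### Human: {user_message}")
    parts.append("### Assistant:")
    return "\n".join(parts)

print(build_guanaco_prompt("Explain AWQ quantization in one sentence."))
```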
Base model: Vezora/Mistral-22B-v0.1