---
license: cc-by-nc-4.0
language:
- en
quantized_by: TheMelonGod
pipeline_tag: text-generation
tags:
- quantized
- safetensors
- exllamav2
- mistral
base_model:
- Sao10K/MN-12B-Lyra-v4
base_model_relation: quantized
---
**Original Model by:** [Sao10K](https://huggingface.co/Sao10K)
**Original Model:** [MN-12B-Lyra-v4](https://huggingface.co/Sao10K/MN-12B-Lyra-v4)
**ExLlamaV2 Quantizations:**
**8.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-8.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-8.0bpw)
**7.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-7.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-7.5bpw)
**7.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-7.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-7.0bpw)
**6.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-6.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-6.5bpw)
**6.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-6.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-6.0bpw)
**5.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-5.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-5.5bpw)
**5.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-5.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-5.0bpw)
**4.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-4.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-4.5bpw)
**4.25bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-4.25bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-4.25bpw)
**4.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-4.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-4.0bpw)
**3.75bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-3.75bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-3.75bpw)
**3.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-3.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-3.5bpw)
**3.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-3.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-3.0bpw)
**2.75bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.75bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-2.75bpw)
**2.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-2.5bpw)
**2.25bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.25bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-2.25bpw)
**2.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-2.0bpw)
[Measurement File](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/blob/main/MN-12B-Lyra-v4-measurement.json) _(Default/built-in calibration dataset was used)_
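Each branch above follows the naming pattern `<head bits>hb-<bits per weight>bpw` (e.g. `8hb-6.0bpw`). As a minimal sketch, the helper below builds a branch name from those two values; the commented `snapshot_download` call (from the `huggingface_hub` package, which you would need to install separately) shows how it could be used to fetch one quantization:

```python
def exl2_branch(head_bits: int, bpw: float) -> str:
    """Build a branch name in this repo's scheme: '<head bits>hb-<bpw>bpw'."""
    return f"{head_bits}hb-{bpw}bpw"

# Example: download a single quantization branch
# (requires `pip install huggingface_hub`):
#
# from huggingface_hub import snapshot_download
# local_dir = snapshot_download(
#     repo_id="TheMelonGod/MN-12B-Lyra-v4-exl2",
#     revision=exl2_branch(8, 6.0),  # -> "8hb-6.0bpw"
# )
```

Downloading only the branch you need avoids pulling every quantization in the repo.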
If you need a specific model quantized or a particular bits-per-weight setting, please let me know. I'm happy to help quantize lesser-known models.
Your feedback and suggestions are always welcome! They help me improve and make quantizations better for everyone. Special thanks to [turboderp](https://huggingface.co/turboderp) for developing the tools that made these quantizations possible. Your contributions are greatly appreciated!