---
license: cc-by-nc-4.0
language:
- en
quantized_by: TheMelonGod
pipeline_tag: text-generation
tags:
- quantized
- safetensors
- exllamav2
- mistral
base_model:
- Sao10K/MN-12B-Lyra-v4
base_model_relation: quantized
---
**Original Model by:** [Sao10K](https://huggingface.co/Sao10K)  
**Original Model:** [MN-12B-Lyra-v4](https://huggingface.co/Sao10K/MN-12B-Lyra-v4)

**ExLlamaV2 Quantizations:**  
**8.0bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-8.0bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-8.0bpw)  
**7.5bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-7.5bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-7.5bpw)  
**7.0bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-7.0bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-7.0bpw)  
**6.5bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-6.5bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-6.5bpw)  
**6.0bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-6.0bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-6.0bpw)  
**5.5bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-5.5bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-5.5bpw)  
**5.0bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-5.0bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-5.0bpw)  
**4.5bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-4.5bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-4.5bpw)  
**4.25bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-4.25bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-4.25bpw)  
**4.0bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-4.0bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-4.0bpw)  
**3.75bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-3.75bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-3.75bpw)  
**3.5bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-3.5bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-3.5bpw)  
**3.0bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-3.0bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-3.0bpw)  
**2.75bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.75bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-2.75bpw)  
**2.5bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.5bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-2.5bpw)  
**2.25bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.25bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-2.25bpw)  
**2.0bpw**:  [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.0bpw)   | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6hb-2.0bpw)  
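
Each quantization above lives on its own branch, named `<head-bits>hb-<bpw>bpw` (e.g. `8hb-6.5bpw` is 6.5 bits per weight with an 8-bit head). As a small sketch of that naming convention, the helper functions below (hypothetical, not part of the repo) build the branch name and its URL:

```python
# Sketch of the branch naming convention used by the links above:
# "<head_bits>hb-<bpw>bpw", one branch per quantization.

REPO_URL = "https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2"


def quant_branch(bpw: float, head_bits: int) -> str:
    """Return the branch name for a bits-per-weight / head-bits combo."""
    return f"{head_bits}hb-{bpw}bpw"


def quant_url(bpw: float, head_bits: int) -> str:
    """Return the browsable URL for that quantization's branch."""
    return f"{REPO_URL}/tree/{quant_branch(bpw, head_bits)}"


print(quant_url(6.5, 8))
# → https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-6.5bpw
```

The same branch name works as the `revision` argument to `huggingface_hub.snapshot_download` or the `--revision` flag of `huggingface-cli download` if you want to fetch one quant locally.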

[Measurement File](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/blob/main/MN-12B-Lyra-v4-measurement.json) _(Default/built-in calibration dataset was used)_
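
If you want to produce a bpw that isn't listed, the measurement file can be reused with ExLlamaV2's `convert.py` so the slow measurement pass is skipped. A rough sketch, assuming a local clone of the exllamav2 repo, the original model weights, and hypothetical local paths:

```shell
# Hypothetical paths. -i: original (unquantized) model, -o: scratch dir,
# -m: reuse the measurement file (skips the measurement pass),
# -b: target bits per weight, -hb: head bits.
python convert.py \
  -i ./MN-12B-Lyra-v4 \
  -o ./work \
  -m ./MN-12B-Lyra-v4-measurement.json \
  -b 5.75 -hb 8
```

Check the exllamav2 repo for the current flag names before running; they have changed between versions.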

If you need a specific model quantized, or a particular bits-per-weight setting, please let me know. I’m happy to help quantize lesser-known models.

Your feedback and suggestions are always welcome! They help me improve and make quantizations better for everyone. Special thanks to [turboderp](https://huggingface.co/turboderp) for developing the tools that made these quantizations possible. Your contributions are greatly appreciated!