Commit 0604be8 (verified) by TheMelonGod · Parent: 2434dbe

Update README.md
---
license: cc-by-nc-4.0
language:
- en
quantized_by: TheMelonGod
pipeline_tag: text-generation
tags:
- quantized
- safetensors
- exllamav2
- mistral
base_model:
- Sao10K/MN-12B-Lyra-v4
base_model_relation: quantized
---
**Original Model by:** [Sao10K](https://huggingface.co/Sao10K)

**Original Model:** [MN-12B-Lyra-v4](https://huggingface.co/Sao10K/MN-12B-Lyra-v4)

**ExLlamaV2 Quantizations:**
- **8.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-8.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8.0bpw)
- **7.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-7.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/7.5bpw)
- **7.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-7.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/7.0bpw)
- **6.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-6.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6.5bpw)
- **6.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-6.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/6.0bpw)
- **5.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-5.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/5.5bpw)
- **5.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-5.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/5.0bpw)
- **4.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-4.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/4.5bpw)
- **4.25bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-4.25bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/4.25bpw)
- **4.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-4.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/4.0bpw)
- **3.75bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-3.75bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/3.75bpw)
- **3.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-3.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/3.5bpw)
- **3.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-3.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/3.0bpw)
- **2.75bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.75bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/2.75bpw)
- **2.5bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.5bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/2.5bpw)
- **2.25bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.25bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/2.25bpw)
- **2.0bpw**: [8hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/8hb-2.0bpw) | [6hb](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/tree/2.0bpw)

[Measurement File](https://huggingface.co/TheMelonGod/MN-12B-Lyra-v4-exl2/blob/main/MN-12B-Lyra-v4-measurement.json)
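
Each quantization above lives on its own branch of this repo (the part after `tree/` in the links): 8hb variants carry an `8hb-` prefix, while 6hb variants use the bare bpw value. As a minimal sketch, assuming the `huggingface_hub` package, a specific variant can be fetched by passing the branch name as `revision`; the `quant_branch` helper below is purely illustrative, not part of any library:

```python
def quant_branch(bpw: str, head_bits: int = 6) -> str:
    """Return the repo branch name for a given bpw / head-bits combination.

    Illustrative helper matching the branch naming used in the links above:
    8-bit-head variants are prefixed "8hb-", 6-bit-head variants are not.
    """
    return f"8hb-{bpw}bpw" if head_bits == 8 else f"{bpw}bpw"


# Example (requires `pip install huggingface_hub`; downloads several GB):
# from huggingface_hub import snapshot_download
# snapshot_download("TheMelonGod/MN-12B-Lyra-v4-exl2",
#                   revision=quant_branch("6.5", head_bits=8))
```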

If you need a specific model quantization or a particular bits per weight, please let me know. I’m happy to help quantize lesser-known models.

Your feedback and suggestions are always welcome! They help me improve and make quantizations better for everyone.

Special thanks to [turboderp](https://huggingface.co/turboderp) for developing the tools that made these quantizations possible. Your contributions are greatly appreciated!