---
license: apache-2.0
language:
- en
quantized_by: TheMelonGod
pipeline_tag: text-generation
tags:
- quantized
- safetensors
- exllamav2
- mistral
base_model:
- qingy2019/NaturalLM
base_model_relation: quantized
---
ExLlamaV2 quantizations of: [qingy2019 - NaturalLM](https://huggingface.co/qingy2019/NaturalLM)

Quantizations (6hb):
- [8.0bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/8.0bpw)
- [7.5bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/7.5bpw)
- [7.0bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/7.0bpw)
- [6.5bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/6.5bpw)
- [6.0bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/6.0bpw)
- [5.5bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/5.5bpw)
- [5.0bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/5.0bpw)
- [4.5bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/4.5bpw)
- [4.25bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/4.25bpw)
- [4.0bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/4.0bpw)
- [3.75bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/3.75bpw)
- [3.5bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/3.5bpw)
- [3.0bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/3.0bpw)
- [2.75bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/2.75bpw)
- [2.5bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/2.5bpw)
- [2.25bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/2.25bpw)
- [2.0bpw](https://huggingface.co/TheMelonGod/NaturalLM-exl2/tree/2.0bpw)
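When choosing a bits-per-weight (bpw) branch, a rough size estimate helps: the quantized weights occupy roughly `parameter_count × bpw / 8` bytes, ignoring the higher-precision output layer and runtime overhead such as the KV cache. A minimal sketch of that arithmetic — the parameter count below is a placeholder assumption for illustration, not the actual size of NaturalLM:

```python
def quant_size_gb(n_params: float, bpw: float) -> float:
    """Approximate on-disk/VRAM size of the quantized weights in gigabytes:
    n_params weights at bpw bits each, 8 bits per byte, 1e9 bytes per GB."""
    return n_params * bpw / 8 / 1e9

# Assumed parameter count purely for illustration.
params = 12e9

for bpw in (8.0, 6.5, 4.0, 2.0):
    print(f"{bpw}bpw -> ~{quant_size_gb(params, bpw):.1f} GB")
```

Pick the highest bpw whose estimate fits comfortably in your available VRAM alongside context.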

If you need a specific model quantization or a particular bits per weight, please let me know. I'm happy to help quantize lesser-known models.

If you have any suggestions for improvements or feedback, feel free to reach out. Your input is greatly appreciated and helps me make quantizations better for everyone.

Special thanks to [turboderp](https://huggingface.co/turboderp) for developing the tools that made these quantizations possible. Your contributions are greatly appreciated!