TheMelonGod committed on
Commit
0210df1
1 Parent(s): bebbdc1

Update README.md

Files changed (1)
  1. README.md +29 -29
README.md CHANGED
@@ -13,34 +13,34 @@ base_model:
   - cognitivecomputations/dolphin-2.9.3-mistral-7B-32k
  base_model_relation: quantized
  ---
- ExLlamaV2 quantizations of: [Cognitive Computations - dolphin-2.9.3-mistral-7B-32k](https://huggingface.co/cognitivecomputations/dolphin-2.9.3-mistral-7B-32k)
-
-
- Quantizations (6hb)
- [8.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8.0bpw)
- [7.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/7.5bpw)
- [7.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/7.0bpw)
- [6.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/6.5bpw)
- [6.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/6.0bpw)
- [5.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/5.5bpw)
- [5.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/5.0bpw)
- [4.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/4.5bpw)
- [4.25bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/4.25bpw)
- [4.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/4.0bpw)
- [3.75bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/3.75bpw)
- [3.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/3.5bpw)
- [3.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/3.0bpw)
- [2.75bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/2.75bpw)
- [2.5bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/2.5bpw)
- [2.25bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/2.25bpw)
- [2.0bpw](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/2.0bpw)
-
- [Measurement File](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/blob/main/dolphin-2.9.3-mistral-7B-32k-measurement.json)
-
- If you need a specific model quantization or a particular bits per weight, please let me know. I’m happy to help quantize lesser known models.
-
-
- If you have any suggestions for improvements or feedback, feel free to reach out. Your input is greatly appreciated and helps me make quantizations better for everyone.
-
+ **Original Model by:** [Cognitive Computations](https://huggingface.co/cognitivecomputations)
+ **Original Model:** [dolphin-2.9.3-mistral-7B-32k](https://huggingface.co/cognitivecomputations/dolphin-2.9.3-mistral-7B-32k)
+
+ For more information about the model, I highly recommend checking out the original model page, and the creator's other work while you're at it.
+
+ **ExLlamaV2 Quantizations:**
+ **8.0bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-8.0bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8.0bpw)
+ **7.5bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-7.5bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/7.5bpw)
+ **7.0bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-7.0bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/7.0bpw)
+ **6.5bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-6.5bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/6.5bpw)
+ **6.0bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-6.0bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/6.0bpw)
+ **5.5bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-5.5bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/5.5bpw)
+ **5.0bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-5.0bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/5.0bpw)
+ **4.5bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-4.5bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/4.5bpw)
+ **4.25bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-4.25bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/4.25bpw)
+ **4.0bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-4.0bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/4.0bpw)
+ **3.75bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-3.75bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/3.75bpw)
+ **3.5bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-3.5bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/3.5bpw)
+ **3.0bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-3.0bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/3.0bpw)
+ **2.75bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-2.75bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/2.75bpw)
+ **2.5bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-2.5bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/2.5bpw)
+ **2.25bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-2.25bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/2.25bpw)
+ **2.0bpw**: [8hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/8hb-2.0bpw) | [6hb](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/tree/2.0bpw)
+
+ [Measurement File](https://huggingface.co/TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2/blob/main/dolphin-2.9.3-mistral-7B-32k-measurement.json) _(the default/built-in calibration dataset was used)_
+
+ If you need a specific model quantized or a particular bits-per-weight setting, please let me know. I’m happy to help.
+
+ Your feedback and suggestions are always welcome! They help me improve and make quantizations better for everyone.
 
  Special thanks to [turboderp](https://huggingface.co/turboderp) for developing the tools that made these quantizations possible. Your contributions are greatly appreciated!
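Each quantization in the list above lives on its own branch of the repo, following the naming visible in the links: 6hb branches are named `<bpw>bpw` and 8hb branches `8hb-<bpw>bpw`. A minimal sketch of fetching one of them with `huggingface_hub` (the repo ID and branch names come from the links above; `branch_name` and `download_quant` are hypothetical helper names, not part of any library):

```python
REPO_ID = "TheMelonGod/dolphin-2.9.3-mistral-7B-32k-exl2"


def branch_name(bpw: str, head_bits: int = 6) -> str:
    """Build the branch (revision) name for a given bpw and header-bit setting.

    Follows the naming pattern used in the README's links:
    6hb -> "4.0bpw", 8hb -> "8hb-4.0bpw".
    """
    if head_bits == 8:
        return f"8hb-{bpw}bpw"
    return f"{bpw}bpw"


def download_quant(bpw: str, head_bits: int = 6) -> str:
    """Download one quantization branch; returns the local snapshot path.

    Requires `pip install huggingface_hub` and network access.
    """
    from huggingface_hub import snapshot_download

    return snapshot_download(repo_id=REPO_ID, revision=branch_name(bpw, head_bits))


# Example: download_quant("4.0", head_bits=8) fetches the 8hb-4.0bpw branch.
```

The download call is wrapped in a function rather than run at import time, so the branch-name logic can be reused or tested without touching the network.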