Add 6.5 link
README.md CHANGED
@@ -25,6 +25,8 @@ Conversion was done using the default calibration dataset.
 Default arguments used except when the bits per weight is above 6.0, at that point the lm_head layer is quantized at 8 bits per weight instead of the default 6.
 
 Original model: https://huggingface.co/cognitivecomputations/dolphin-2.7-mixtral-8x7b
+
+<a href="https://huggingface.co/bartowski/dolphin-2.7-mixtral-8x7b-exl2/tree/6_5">6.5 bits per weight</a>
 
 ## Download instructions
 