Update README.md
README.md CHANGED
@@ -3,7 +3,7 @@ EXL2 quants of [Mixtral 8x7B Instruct v0.1](https://huggingface.co/mistralai/Mix
 Supported in ExLlamaV2 0.0.11 and up
 
 [2.40 bits per weight](https://huggingface.co/turboderp/Mixtral-8x7B-instruct-exl2/tree/2.4bpw)
-[2.50 bits per weight](https://huggingface.co/turboderp/Mixtral-8x7B-instruct-exl2/tree/2.5bpw)
+[2.50 bits per weight](https://huggingface.co/turboderp/Mixtral-8x7B-instruct-exl2/tree/2.5bpw)
 [2.70 bits per weight](https://huggingface.co/turboderp/Mixtral-8x7B-instruct-exl2/tree/2.7bpw)
 [3.00 bits per weight](https://huggingface.co/turboderp/Mixtral-8x7B-instruct-exl2/tree/3.0bpw) (still uploading)
 [3.50 bits per weight](https://huggingface.co/turboderp/Mixtral-8x7B-instruct-exl2/tree/3.5bpw)
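
Each bitrate above lives on its own branch of the turboderp/Mixtral-8x7B-instruct-exl2 repo, so you download the branch you want and point ExLlamaV2 (0.0.11 or newer) at the resulting folder. Below is a minimal sketch, not part of the README diff itself: the repo ID and branch names come from the links above, while the `snapshot_download` call and the ExLlamaV2 loading/generation calls follow the huggingface_hub API and ExLlamaV2's bundled example scripts, so they may need adjusting for your version and hardware.

```python
# Sketch: fetch one quant branch and run a short generation with ExLlamaV2.
# Repo ID and branch come from the links above; the rest is an assumption
# based on ExLlamaV2's example scripts.
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Each bitrate is a separate branch, so pass it as the revision.
model_dir = snapshot_download(
    "turboderp/Mixtral-8x7B-instruct-exl2",
    revision="2.5bpw",
)

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
model.load()  # optionally pass a gpu_split to spread the weights across GPUs

tokenizer = ExLlamaV2Tokenizer(config)
cache = ExLlamaV2Cache(model)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8

# Mixtral Instruct uses the [INST] ... [/INST] prompt format.
print(generator.generate_simple("[INST] Hello! [/INST]", settings, 128))
```

Swapping `revision="2.5bpw"` for any other branch listed above pulls that bitrate instead.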