legraphista committed
Commit c6bfd28 • 1 Parent: 3f0bcf7
Upload README.md with huggingface_hub
README.md CHANGED
@@ -69,7 +69,7 @@ Link: [here](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/
 | [xLAM-8x7b-r.Q6_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q6_K.gguf) | Q6_K | 38.38GB | ✅ Available | ⚪ Static | 📦 No |
 | [xLAM-8x7b-r.Q4_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q4_K.gguf) | Q4_K | 28.45GB | ✅ Available | 🟢 IMatrix | 📦 No |
 | [xLAM-8x7b-r.Q3_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q3_K.gguf) | Q3_K | 22.55GB | ✅ Available | 🟢 IMatrix | 📦 No |
-| xLAM-8x7b-r.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [xLAM-8x7b-r.Q2_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q2_K.gguf) | Q2_K | 17.31GB | ✅ Available | 🟢 IMatrix | 📦 No |


 ### All Quants
@@ -92,7 +92,7 @@ Link: [here](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/
 | xLAM-8x7b-r.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | xLAM-8x7b-r.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | - |
 | xLAM-8x7b-r.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | - |
-| xLAM-8x7b-r.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | - |
+| [xLAM-8x7b-r.Q2_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q2_K.gguf) | Q2_K | 17.31GB | ✅ Available | 🟢 IMatrix | 📦 No |
 | xLAM-8x7b-r.Q2_K_S | Q2_K_S | - | ⏳ Processing | 🟢 IMatrix | - |
 | xLAM-8x7b-r.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | - |
 | xLAM-8x7b-r.IQ2_S | IQ2_S | - | ⏳ Processing | 🟢 IMatrix | - |