legraphista committed
Commit • 6848e1f
1 Parent(s): 1cf2526
Upload README.md with huggingface_hub
README.md CHANGED
@@ -83,7 +83,7 @@ Link: [here](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/
| [xLAM-8x7b-r.Q5_K_S.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q5_K_S.gguf) | Q5_K_S | 32.23GB | ✅ Available | ⚪ Static | 📦 No |
| [xLAM-8x7b-r.Q4_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q4_K.gguf) | Q4_K | 28.45GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [xLAM-8x7b-r.Q4_K_S.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q4_K_S.gguf) | Q4_K_S | 26.75GB | ✅ Available | 🟢 IMatrix | 📦 No |
- | xLAM-8x7b-r.IQ4_NL | IQ4_NL | - | ⏳ Processing | 🟢 IMatrix | - |
+ | [xLAM-8x7b-r.IQ4_NL.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.IQ4_NL.gguf) | IQ4_NL | 26.51GB | ✅ Available | 🟢 IMatrix | 📦 No |
| xLAM-8x7b-r.IQ4_XS | IQ4_XS | - | ⏳ Processing | 🟢 IMatrix | - |
| [xLAM-8x7b-r.Q3_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q3_K.gguf) | Q3_K | 22.55GB | ✅ Available | 🟢 IMatrix | 📦 No |
| [xLAM-8x7b-r.Q3_K_L.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q3_K_L.gguf) | Q3_K_L | 24.17GB | ✅ Available | 🟢 IMatrix | 📦 No |
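
For reference, any quant marked ✅ Available in the table above (including the IQ4_NL file added by this commit) can be fetched with `huggingface_hub`, the library named in the commit message. A minimal sketch; the file name and repo id are taken from the table, everything else is just illustrative:

```python
# Minimal sketch: download the IQ4_NL quant listed as Available above.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="legraphista/xLAM-8x7b-r-IMat-GGUF",
    filename="xLAM-8x7b-r.IQ4_NL.gguf",  # 26.51GB per the table above
)
print(local_path)  # local path to the cached GGUF file
```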