legraphista committed
Commit: c6bfd28 • Parent: 3f0bcf7

Upload README.md with huggingface_hub
Files changed (1): README.md +2 -2
README.md CHANGED
@@ -69,7 +69,7 @@ Link: [here](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/
  | [xLAM-8x7b-r.Q6_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q6_K.gguf) | Q6_K | 38.38GB | ✅ Available | ⚪ Static | 📦 No
  | [xLAM-8x7b-r.Q4_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q4_K.gguf) | Q4_K | 28.45GB | ✅ Available | 🟢 IMatrix | 📦 No
  | [xLAM-8x7b-r.Q3_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q3_K.gguf) | Q3_K | 22.55GB | ✅ Available | 🟢 IMatrix | 📦 No
- | xLAM-8x7b-r.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | -
+ | [xLAM-8x7b-r.Q2_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q2_K.gguf) | Q2_K | 17.31GB | ✅ Available | 🟢 IMatrix | 📦 No


  ### All Quants
@@ -92,7 +92,7 @@ Link: [here](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/
  | xLAM-8x7b-r.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | -
  | xLAM-8x7b-r.IQ3_XS | IQ3_XS | - | ⏳ Processing | 🟢 IMatrix | -
  | xLAM-8x7b-r.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | -
- | xLAM-8x7b-r.Q2_K | Q2_K | - | ⏳ Processing | 🟢 IMatrix | -
+ | [xLAM-8x7b-r.Q2_K.gguf](https://huggingface.co/legraphista/xLAM-8x7b-r-IMat-GGUF/blob/main/xLAM-8x7b-r.Q2_K.gguf) | Q2_K | 17.31GB | ✅ Available | 🟢 IMatrix | 📦 No
  | xLAM-8x7b-r.Q2_K_S | Q2_K_S | - | ⏳ Processing | 🟢 IMatrix | -
  | xLAM-8x7b-r.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | -
  | xLAM-8x7b-r.IQ2_S | IQ2_S | - | ⏳ Processing | 🟢 IMatrix | -
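The change above makes xLAM-8x7b-r.Q2_K.gguf downloadable. A minimal sketch of building a direct-download URL for any quant in this repo, assuming the standard Hugging Face pattern of swapping `blob` for `resolve` in the table's links (the diff itself only shows `blob` page URLs):

```python
# Build a direct-download ("resolve") URL for a GGUF file in this repo.
# Assumption: Hugging Face serves raw files at /resolve/main/, mirroring
# the /blob/main/ browse links shown in the table above.
REPO = "legraphista/xLAM-8x7b-r-IMat-GGUF"

def gguf_url(filename: str) -> str:
    """Return the raw-file URL for a quant file in REPO."""
    return f"https://huggingface.co/{REPO}/resolve/main/{filename}"

# The newly available Q2_K quant from this commit:
print(gguf_url("xLAM-8x7b-r.Q2_K.gguf"))
```

The same helper works for the other available quants (Q6_K, Q4_K, Q3_K); files still marked ⏳ Processing will 404 until uploaded.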