mradermacher committed
Commit
f071213
1 Parent(s): 2576b84

auto-patch README.md

Files changed (1):
  1. README.md +3 -1
README.md CHANGED
@@ -42,8 +42,10 @@ more details, including on how to concatenate multi-part files.
 |:-----|:-----|--------:|:------|
 | [PART 1](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q2_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q2_K.gguf.part2of2) | Q2_K | 80.4 | |
 | [PART 1](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q3_K_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q3_K_S.gguf.part2of2) | Q3_K_S | 94.0 | |
+| [PART 1](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.IQ3_S.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.IQ3_S.gguf.part2of2) | IQ3_S | 94.3 | beats Q3_K* |
 | [PART 1](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q3_K_M.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q3_K_M.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q3_K_M.gguf.part3of3) | Q3_K_M | 105.2 | lower quality |
 | [PART 1](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q3_K_L.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q3_K_L.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q3_K_L.gguf.part3of3) | Q3_K_L | 114.9 | |
+| [PART 1](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.IQ4_XS.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.IQ4_XS.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.IQ4_XS.gguf.part3of3) | IQ4_XS | 117.5 | |
 | [PART 1](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q4_K_S.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q4_K_S.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q4_K_S.gguf.part3of3) | Q4_K_S | 123.8 | fast, recommended |
 | [PART 1](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q4_K_M.gguf.part1of3) [PART 2](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q4_K_M.gguf.part2of3) [PART 3](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q4_K_M.gguf.part3of3) | Q4_K_M | 130.2 | fast, recommended |
 | [PART 1](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q5_K_S.gguf.part1of4) [PART 2](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q5_K_S.gguf.part2of4) [PART 3](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q5_K_S.gguf.part3of4) [PART 4](https://huggingface.co/mradermacher/Mistral-Large-218B-Instruct-GGUF/resolve/main/Mistral-Large-218B-Instruct.Q5_K_S.gguf.part4of4) | Q5_K_S | 150.1 | |
@@ -68,6 +70,6 @@ questions you might have and/or if you want some other model quantized.
 
 I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
 me use its servers and providing upgrades to my workstation to enable
-this work in my free time.
+this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
 
 <!-- end -->
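The rows added by this patch all point at split GGUF files, and the README section the hunk touches refers to instructions for concatenating multi-part files. A minimal sketch of that reassembly step, using small dummy files in place of the large `.partXofY` downloads (the `model.gguf` filenames here are illustrative, not from the repository):

```shell
# Sketch: a split GGUF is reassembled by concatenating its parts in order.
# Dummy part files stand in for the real multi-gigabyte downloads.
printf 'first-half-' > model.gguf.part1of2
printf 'second-half' > model.gguf.part2of2

# Order matters: part1of2 must come before part2of2.
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf

cat model.gguf
```

With the real files, the same `cat part1ofN ... partNofN > file.gguf` pattern applies; the parts can usually be deleted afterwards to reclaim disk space.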