---
base_model: google/gemma-2-27b-it
extra_gated_button_content: Acknowledge license
extra_gated_heading: Access Gemma on Hugging Face
extra_gated_prompt: To access Gemma on Hugging Face, you’re required to review and
  agree to Google’s usage license. To do this, please ensure you’re logged in to Hugging
  Face and click below. Requests are processed immediately.
language:
- en
library_name: transformers
license: gemma
quantized_by: mradermacher
---
## About

<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type:  -->
<!-- ### tags:  -->
static quants of https://huggingface.co/google/gemma-2-27b-it

<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/gemma-2-27b-it-i1-GGUF

## Usage

If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including how to concatenate multi-part files.
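
As a quick illustration, here is a minimal sketch that loads one of these quants with the llama-cpp-python bindings. The file name, context size, and GPU layer count are assumptions; adapt them to the quant you downloaded and your hardware.

```python
# Minimal sketch using llama-cpp-python (pip install llama-cpp-python).
# model_path, n_ctx, and n_gpu_layers are assumptions; adjust them to
# the quant you downloaded and the VRAM/RAM you have available.
from llama_cpp import Llama

llm = Llama(
    model_path="gemma-2-27b-it.Q4_K_M.gguf",  # any quant from the table below
    n_ctx=4096,        # context window; Gemma 2 supports up to 8192
    n_gpu_layers=-1,   # offload all layers if built with GPU support
)

# Gemma 2 is instruction-tuned, so use the chat completion API, which
# applies the chat template stored in the GGUF metadata.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF in one sentence."}]
)
print(out["choices"][0]["message"]["content"])
```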

## Provided Quants

(sorted by size, not necessarily quality; IQ-quants are often preferable to similar-sized non-IQ quants)

| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q2_K.gguf) | Q2_K | 10.5 |  |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.IQ3_XS.gguf) | IQ3_XS | 11.7 |  |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.IQ3_S.gguf) | IQ3_S | 12.3 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q3_K_S.gguf) | Q3_K_S | 12.3 |  |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.IQ3_M.gguf) | IQ3_M | 12.6 |  |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q3_K_M.gguf) | Q3_K_M | 13.5 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q3_K_L.gguf) | Q3_K_L | 14.6 |  |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.IQ4_XS.gguf) | IQ4_XS | 15.0 |  |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q4_K_S.gguf) | Q4_K_S | 15.8 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q4_K_M.gguf) | Q4_K_M | 16.7 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q5_K_S.gguf) | Q5_K_S | 19.0 |  |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q5_K_M.gguf) | Q5_K_M | 19.5 |  |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q6_K.gguf) | Q6_K | 22.4 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/gemma-2-27b-it-GGUF/resolve/main/gemma-2-27b-it.Q8_0.gguf) | Q8_0 | 29.0 | fast, best quality |
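
If you prefer to fetch a single quant programmatically rather than cloning the whole repo, a sketch along these lines works with the huggingface_hub library; the chosen filename is just one entry from the table above.

```python
# Download one quant file from this repo with huggingface_hub
# (pip install huggingface_hub). Swap the filename for any entry
# in the table above.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="mradermacher/gemma-2-27b-it-GGUF",
    filename="gemma-2-27b-it.Q4_K_M.gguf",
)
print(path)  # local cache path of the downloaded GGUF file
```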

Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png)

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9

## FAQ / Model Request

See https://huggingface.co/mradermacher/model_requests for answers to
common questions, or if you would like another model quantized.

## Thanks

I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.

<!-- end -->