# MiquMaid-v1-70B IQ2

## Description

2-bit imatrix GGUF quants of [NeverSleep/MiquMaid-v1-70B](https://huggingface.co/NeverSleep/MiquMaid-v1-70B).

The [imatrix](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF/blob/main/Imatrix/imatrix-MiquMaid-c2000-ctx500-wikitext.dat) was generated from a q8 quant of MiquMaid, using 2000 chunks at a context length of 500. The dataset was Wikitext.

These quants take a while to produce, so please leave a like or a comment on the repo so that I know there is interest.

## Other quants:

EXL2: [3.5bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3.5bpw-exl2), [3bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-3bpw-exl2), [2.4bpw](https://huggingface.co/Kooten/MiquMaid-v1-70B-2.4bpw-exl2)

GGUF: [2bit Imatrix GGUF](https://huggingface.co/Kooten/MiquMaid-v1-70B-IQ2-GGUF): [XS](), [XXS]()

### Custom format:

```
### Instruction:
{system prompt}

### Input:
{input}

### Response:
{reply}
```

## Contact

Kooten on discord

[ko-fi.com/kooten](https://ko-fi.com/kooten)
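
The custom format above can be assembled into a single prompt string before sending it to the model. A minimal sketch in Python; the `build_prompt` helper is an illustration, not part of this repo or any loader library:

```python
# Hypothetical helper: assemble a prompt in the Alpaca-style
# custom format described above (### Instruction / Input / Response).
def build_prompt(system_prompt: str, user_input: str) -> str:
    return (
        f"### Instruction:\n{system_prompt}\n\n"
        f"### Input:\n{user_input}\n\n"
        f"### Response:\n"
    )

# The model is expected to continue generating after "### Response:".
prompt = build_prompt("You are a helpful assistant.", "Say hello.")
print(prompt)
```

Frontends such as SillyTavern or text-generation-webui typically let you configure this template directly instead of building the string by hand.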