Any chance anyone is quantizing this into a 2.4bpw EXL2 version for those of us with a single 24GB video card?
#30 opened by clevnumb
That would be very cool! I would love to use this in Text-Generation-WebUI.
Thanks in advance if you are!
LoneStriker has it at that quant level: https://huggingface.co/LoneStriker/miqu-1-70b-sf-2.4bpw-h6-exl2
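If it helps, here's a rough sketch of pulling that quant into the webui's models folder with huggingface_hub (the local path is just a placeholder for wherever your Text-Generation-WebUI models directory lives, so adjust to your setup):

```python
# Minimal sketch, assuming huggingface_hub is installed and you have enough free disk.
# Downloads LoneStriker's 2.4bpw EXL2 quant; afterwards, select it in the webui's
# Model tab and load it with an ExLlamav2-based loader.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="LoneStriker/miqu-1-70b-sf-2.4bpw-h6-exl2",
    # Placeholder path: point this at your own text-generation-webui/models folder.
    local_dir="text-generation-webui/models/miqu-1-70b-sf-2.4bpw-h6-exl2",
)
```

A 2.4bpw quant of a 70B model should fit on a single 24GB card, though you may still need to keep the context length modest to leave room for the cache.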