Update README.md
#1 by wolfram · opened
README.md CHANGED
@@ -16,9 +16,9 @@ tags:
 
 ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6303ca537373aacccd85d8a7/LxO9j7OykuabKLYQHIodG.jpeg)
 
-- HF: wolfram/miqu-1-103b
-- GGUF:
-- EXL2:
+- HF: [wolfram/miqu-1-103b](https://huggingface.co/wolfram/miqu-1-103b)
+- GGUF: mradermacher's [static quants](https://huggingface.co/mradermacher/miqu-1-103b-GGUF) | [weighted/imatrix quants](https://huggingface.co/mradermacher/miqu-1-103b-i1-GGUF)
+- EXL2: LoneStriker's [2.4bpw](https://huggingface.co/LoneStriker/miqu-1-103b-2.4bpw-h6-exl2) | [3.0bpw](https://huggingface.co/LoneStriker/miqu-1-103b-3.0bpw-h6-exl2) | [3.5bpw](https://huggingface.co/LoneStriker/miqu-1-103b-3.5bpw-h6-exl2)
 
 This is a 103b frankenmerge of [miqu-1-70b](https://huggingface.co/miqudev/miqu-1-70b) created by interleaving layers of [miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) with itself using [mergekit](https://github.com/cg123/mergekit).
 
@@ -26,7 +26,7 @@ Inspired by [Midnight-Rose-103B-v2.0.3](https://huggingface.co/sophosympatheia/M
 
 Thanks for the support, [CopilotKit](https://github.com/CopilotKit/CopilotKit) - the open-source platform for building in-app AI Copilots into any product, with any LLM model. Check out their GitHub.
 
-Thanks for the
+Thanks for the quants, [Michael Radermacher](https://huggingface.co/mradermacher) and [Lone Striker](https://huggingface.co/LoneStriker)!
 
 Also available:
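
For context on the unchanged "frankenmerge" line above: interleaved self-merges like this are typically built with a mergekit passthrough config that stacks overlapping layer slices of the same model. The sketch below is only illustrative; the layer ranges, dtype, and output path are assumptions, since the actual miqu-1-103b recipe is not part of this diff.

```yaml
# Illustrative mergekit passthrough config for a self-interleaved merge.
# NOTE: the layer_range values and dtype here are assumptions for demonstration;
# the real miqu-1-103b merge recipe is not shown in this PR.
slices:
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [0, 40]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [20, 60]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [40, 80]
merge_method: passthrough
dtype: float16
```

A config like this would be run with mergekit's CLI, e.g. `mergekit-yaml config.yml ./output-model-directory` (output directory name here is hypothetical).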