Upload README.md with huggingface_hub
README.md CHANGED
@@ -247,26 +247,38 @@ tags:
- gguf
---

# Supa-AI/Ministral-8B-Instruct-2410-gguf
This model was converted to GGUF format from [`mistralai/Ministral-8B-Instruct-2410`](https://huggingface.co/mistralai/Ministral-8B-Instruct-2410) using llama.cpp.
Refer to the [original model card](https://huggingface.co/mistralai/Ministral-8B-Instruct-2410) for more details on the model.
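
The exact conversion commands are not recorded in this repo, but with llama.cpp this is typically a two-step process (Hugging Face checkpoint → full-precision GGUF → quantized GGUF). The sketch below is an assumption about that workflow, not a record of how these files were produced; script names, flags, and binary paths can differ between llama.cpp versions.

```bash
# Hypothetical conversion workflow (assumed, not taken from this repo's history).
# Requires a local clone of llama.cpp with its Python requirements installed.
huggingface-cli download mistralai/Ministral-8B-Instruct-2410 --local-dir Ministral-8B-Instruct-2410

# 1. Convert the Hugging Face checkpoint to a full-precision GGUF file.
python llama.cpp/convert_hf_to_gguf.py Ministral-8B-Instruct-2410 \
  --outtype f16 --outfile Ministral-8B-Instruct-2410.f16.gguf

# 2. Quantize it, e.g. to q4_K_M; repeat with other types for the versions listed below.
#    (The llama-quantize binary path depends on how llama.cpp was built.)
llama.cpp/llama-quantize Ministral-8B-Instruct-2410.f16.gguf \
  Ministral-8B-Instruct-2410.q4_k_m.gguf q4_K_M
```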

## Available Versions
- `Ministral-8B-Instruct-2410.q4_0.gguf` (q4_0)
- `Ministral-8B-Instruct-2410.q4_1.gguf` (q4_1)
- `Ministral-8B-Instruct-2410.q5_0.gguf` (q5_0)
- `Ministral-8B-Instruct-2410.q5_1.gguf` (q5_1)
- `Ministral-8B-Instruct-2410.q8_0.gguf` (q8_0)
- `Ministral-8B-Instruct-2410.q3_k_s.gguf` (q3_K_S)
- `Ministral-8B-Instruct-2410.q3_k_m.gguf` (q3_K_M)
- `Ministral-8B-Instruct-2410.q3_k_l.gguf` (q3_K_L)
- `Ministral-8B-Instruct-2410.q4_k_s.gguf` (q4_K_S)
- `Ministral-8B-Instruct-2410.q4_k_m.gguf` (q4_K_M)
- `Ministral-8B-Instruct-2410.q5_k_s.gguf` (q5_K_S)
- `Ministral-8B-Instruct-2410.q5_k_m.gguf` (q5_K_M)
- `Ministral-8B-Instruct-2410.q6_k.gguf` (q6_K)
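
To fetch one of these files locally (for example, to pass it to `llama-cli -m` instead of streaming it from the Hub), the Hugging Face CLI can be used; the choice of the q4_k_m file below is just an example.

```bash
# Download a single quantization into the current directory (q4_k_m chosen as an example).
huggingface-cli download Supa-AI/Ministral-8B-Instruct-2410-gguf \
  Ministral-8B-Instruct-2410.q4_k_m.gguf --local-dir .
```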

## Use with llama.cpp
Replace `FILENAME` with one of the above filenames.

### CLI:
```bash
llama-cli --hf-repo Supa-AI/Ministral-8B-Instruct-2410-gguf --hf-file FILENAME -p "Your prompt here"
```
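
For a concrete invocation against a locally downloaded file rather than `--hf-repo`/`--hf-file`, a sketch (assuming the q4_k_m file from the download example above):

```bash
# Run against a local GGUF file; any of the versions listed above works the same way.
llama-cli -m ./Ministral-8B-Instruct-2410.q4_k_m.gguf -p "Your prompt here"
```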

### Server:
```bash
llama-server --hf-repo Supa-AI/Ministral-8B-Instruct-2410-gguf --hf-file FILENAME -c 2048
```
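
Once running, `llama-server` exposes an OpenAI-compatible HTTP API; a minimal request is sketched below, assuming the default port 8080.

```bash
# Send a chat request to the local server (default port assumed; adjust if you pass --port).
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Your prompt here"}]}'
```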

## Model Details
- **Original Model:** [mistralai/Ministral-8B-Instruct-2410](https://huggingface.co/mistralai/Ministral-8B-Instruct-2410)
- **Format:** GGUF