Update README.md
README.md CHANGED
@@ -49,6 +49,10 @@ If you use llama-cli to run GGUF, it is recommended to use the latest version of
 
 [Q4_K_M-GGUF](https://huggingface.co/huihui-ai/Huihui-gpt-oss-120b-BF16-abliterated/tree/main/Q4_K_M-GGUF)
 
+[Q8_0-GGUF](https://huggingface.co/huihui-ai/Huihui-gpt-oss-120b-BF16-abliterated/tree/main/Q8_0-GGUF)
+
+[f16-GGUF](https://huggingface.co/huihui-ai/Huihui-gpt-oss-120b-BF16-abliterated/tree/main/f16-GGUF)
+
 
 ```
 huggingface-cli download huihui-ai/Huihui-gpt-oss-120b-BF16-abliterated --local-dir ./huihui-ai/Huihui-gpt-oss-120b-BF16-abliterated --token xxx
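For reference, here is a minimal sketch of loading one of the downloaded quantizations with llama-cli, which the README recommends running with the latest llama.cpp. The .gguf filename inside the Q4_K_M-GGUF folder, the context size, and the GPU-offload count are assumptions for illustration; check the folder listing for the actual file name.

```
# Minimal sketch (assumed filename; check the Q4_K_M-GGUF folder for the real .gguf file).
# -c sets the context size and -ngl the number of layers offloaded to the GPU; both values are illustrative.
llama-cli \
  -m ./huihui-ai/Huihui-gpt-oss-120b-BF16-abliterated/Q4_K_M-GGUF/Huihui-gpt-oss-120b-BF16-abliterated-Q4_K_M.gguf \
  -c 4096 -ngl 99 \
  -p "Hello"
```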