Update README.md
README.md
@@ -61,6 +61,9 @@ It seems like 1.71 koboldcpp can't run GGUFs of llama-3.1 MoE models yet, or per
If anyone has a similar problem, run the model directly from llama.cpp; here's a simple [open source GUI (Windows)](https://huggingface.co/xxx777xxxASD/LlamaCpp-Server-Ui) you can use if the console is your worst enemy.

UPDATE 28.07.2024

Try [this koboldcpp version](https://github.com/Nexesenex/kobold.cpp/releases/tag/v1.71013_b3455%2B9)
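For the llama.cpp route, the model can also be served without any GUI. A minimal sketch, assuming llama.cpp has already been built and the GGUF sits at `./model.gguf` (both the path and the port are illustrative, not from the original post):

```shell
# Serve the GGUF over llama.cpp's built-in HTTP server.
# -m   path to the GGUF file (hypothetical name here)
# -c   context size in tokens
# -ngl number of layers to offload to the GPU (if one is available)
./llama-server -m ./model.gguf -c 8192 -ngl 99 --port 8080

# Then point a frontend (or curl) at http://localhost:8080
```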
## Models used

- [NeverSleep/Lumimaid-v0.2-8B](https://huggingface.co/NeverSleep/Lumimaid-v0.2-8B)