Update README.md
README.md CHANGED
@@ -41,7 +41,7 @@ This repo contains GGML format model files for [Stability AI's StableBeluga 2](h
 These 70B Llama 2 GGML files currently only support CPU inference. They are known to work with:
 * [llama.cpp](https://github.com/ggerganov/llama.cpp)
 * [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most popular web UI.
-* [KoboldCpp](https://github.com/LostRuins/koboldcpp), version 1.37 and later. A powerful GGML web UI
+* [KoboldCpp](https://github.com/LostRuins/koboldcpp), version 1.37 and later. A powerful GGML web UI, especially good for story telling.
 * [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), version 0.1.77 and later. A Python library with GPU accel, LangChain support, and OpenAI-compatible API server.
 
 ## Repositories available
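For context on the llama-cpp-python entry in the hunk above, here is a minimal sketch of loading one of these 70B GGML files from Python. The model filename and prompt are placeholders, and the `n_gqa=8` setting reflects the grouped-query-attention option that llama.cpp-era tooling generally required for 70B Llama 2 GGML models; treat the exact parameters as assumptions rather than this repo's documented invocation.

```python
# Minimal sketch: loading a StableBeluga 2 70B GGML file with llama-cpp-python (>= 0.1.77).
from llama_cpp import Llama

llm = Llama(
    model_path="./stablebeluga2-70b.ggmlv3.q4_K_M.bin",  # placeholder filename (assumption)
    n_ctx=2048,  # context window
    n_gqa=8,     # grouped-query attention; 70B Llama 2 GGML models generally need this
)

# StableBeluga-style prompt format (system / user / assistant sections).
prompt = (
    "### System:\nYou are StableBeluga, a helpful assistant.\n\n"
    "### User:\nWrite a haiku about llamas.\n\n"
    "### Assistant:\n"
)

output = llm(prompt, max_tokens=128, stop=["### User:"], echo=False)
print(output["choices"][0]["text"])
```

The same library can also expose an OpenAI-compatible endpoint via `python -m llama_cpp.server --model <file>`, which is presumably how the LangChain and API-server use cases mentioned in the list are served.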