Update README.md #13 by pavben (opened)
README.md CHANGED

@@ -84,6 +84,7 @@ These Mixtral GGUFs are known to work in:
 * KoboldCpp 1.52 and later
 * LM Studio 0.2.9 and later
 * llama-cpp-python 0.2.23 and later
+* Sanctum 1.1.2 and later
 
 Other clients/libraries, not listed above, may not yet work.
 
@@ -160,6 +161,7 @@ The following clients/libraries will automatically download models for you, prov
 * LM Studio
 * LoLLMS Web UI
 * Faraday.dev
+* Sanctum
 
 ### In `text-generation-webui`
 