Update README_TEMPLATE.md
README_TEMPLATE.md (+9 -1)
@@ -33,11 +33,19 @@ Via pip: `pip install llm-rs`
 from llm_rs import AutoModel
 
 #Load the model, define any model you like from the list above as the `model_file`
-model = AutoModel.from_pretrained("rustformers/redpajama-ggml",model_file="RedPajama-INCITE-Base-3B-v1-q4_0-ggjt.bin")
+model = AutoModel.from_pretrained("rustformers/redpajama-3b-ggml",model_file="RedPajama-INCITE-Base-3B-v1-q4_0-ggjt.bin")
 
 #Generate
 print(model.generate("The meaning of life is"))
 ```
+### GUI via [local.ai](https://github.com/louisgv/local.ai)
+
+#### Installation
+Download the installer via: [www.localai.app](https://www.localai.app/)
+
+#### Run inference
+Simply download the model you want to use and place it into your "models" directory.
+Then you can start a chat session with your model.
 
 ### Rust via [Rustformers/llm](https://github.com/rustformers/llm):
 
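The "Run inference" step added above boils down to getting the quantized file from this repo onto disk where local.ai can see it. The sketch below is not part of the README change: it assumes the file is fetched with the `huggingface_hub` Python package, and `./models` is only a placeholder for whatever models directory your local.ai installation is pointed at.

```python
# Minimal sketch: download the quantized GGML file referenced in the example
# above into a local "models" directory so a GUI such as local.ai can load it.
# Assumption: "./models" is a stand-in for your actual local.ai models folder.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="rustformers/redpajama-3b-ggml",
    filename="RedPajama-INCITE-Base-3B-v1-q4_0-ggjt.bin",
    local_dir="./models",
)
print(f"Model file available at: {local_path}")
```

Once the file is in place, it should appear in local.ai's model list and you can start a chat session with it, as described in the added section.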