prithivMLmods
committed on
Update README.md
README.md CHANGED
@@ -198,6 +198,7 @@ To exit the program, simply type:
 ```plaintext
 /exit
 ```
+
 ## Example 2: Running Multi-Modal Models (Future Use)
 
 Ollama supports running multi-modal models where you can send images and ask questions based on them. This section will be updated as more models become available.
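For orientation, here is a hedged sketch of what such a multi-modal prompt could look like once a suitable model is available; the `llava` model name and the image path are assumptions, not part of this commit:

```plaintext
# Illustrative only: run a multi-modal model and include a local image path in the prompt
ollama run llava "What is in this image? ./sample.png"
```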
@@ -213,5 +214,4 @@ Quantized models like **triangulum-10b-f16.gguf** are optimized for performance
 
 Running the **Triangulum-10B** model with Ollama provides a robust way to leverage open-source LLMs locally for diverse use cases. By following these steps, you can explore the capabilities of other open-source models in the future.
 
-Happy experimenting!
-```
+Happy experimenting!
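Since the README's closing section refers to running the quantized **triangulum-10b-f16.gguf** locally, here is a minimal sketch of how a local GGUF file is typically registered with Ollama; the Modelfile contents and the `triangulum-10b` tag are illustrative assumptions:

```plaintext
# Modelfile (illustrative): point Ollama at the local GGUF weights
FROM ./triangulum-10b-f16.gguf
```

The model can then be created and started with `ollama create triangulum-10b -f Modelfile` followed by `ollama run triangulum-10b`.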