prithivMLmods committed a4d1dd5 (verified · parent c8c0560)

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -210,6 +210,6 @@ Quantized models like **triangulum-10b-f16.gguf** are optimized for performance
 1. Ensure your system has sufficient VRAM or CPU resources.
 2. Use the `.gguf` model format for compatibility with Ollama.
 
-## Conclusion
+# **Conclusion**
 
 Running the **Triangulum-10B** model with Ollama provides a robust way to leverage open-source LLMs locally for diverse use cases. By following these steps, you can explore the capabilities of other open-source models in the future.
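The README section in this diff describes loading the `.gguf` file with Ollama. As a minimal sketch of that workflow (the Modelfile path and the `triangulum-10b` model name below are assumptions for illustration, not taken from the commit):

```shell
# Modelfile — points Ollama at the local quantized weights.
# The relative path is hypothetical; adjust it to wherever
# triangulum-10b-f16.gguf was downloaded.
FROM ./triangulum-10b-f16.gguf

# Register and run the model with Ollama:
#   ollama create triangulum-10b -f Modelfile
#   ollama run triangulum-10b
```

The `FROM` directive accepting a local GGUF path is standard Ollama Modelfile syntax, which is why the README recommends the `.gguf` format for compatibility.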