---
tags:
- llamafile
- GGUF
base_model: QuantFactory/Phi-3-mini-4k-instruct-GGUF
---
## phi-3-mini-llamafile-nonAVX
llamafile lets you distribute and run LLMs with a single file ([announcement blog post](https://hacks.mozilla.org/2023/11/introducing-llamafile/)).
#### Downloads
- [Phi-3-mini-4k-instruct.Q4_0.llamafile](https://huggingface.co/blueprintninja/phi-3-mini-llamafile-nonAVX/resolve/main/Phi-3-mini-4k-instruct.Q4_0.llamafile)
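
A minimal quick-start sketch, assuming a Linux or macOS shell (on Windows, rename the file with a `.exe` extension). The command-line flags follow upstream llamafile/llama.cpp conventions and may vary by llamafile version:

```sh
# Download the llamafile (a single self-contained executable)
wget https://huggingface.co/blueprintninja/phi-3-mini-llamafile-nonAVX/resolve/main/Phi-3-mini-4k-instruct.Q4_0.llamafile

# Make it executable
chmod +x Phi-3-mini-4k-instruct.Q4_0.llamafile

# Run it; by default llamafile launches a local web UI for chatting with the model
./Phi-3-mini-4k-instruct.Q4_0.llamafile

# Or pass llama.cpp-style flags for a one-shot completion from the terminal
./Phi-3-mini-4k-instruct.Q4_0.llamafile -p "Write a haiku about single-file LLMs"
```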
This repository was created using the [llamafile-builder](https://github.com/rabilrbl/llamafile-builder).