Use --prompt
#6 opened by pcuenq (HF staff)

README.md CHANGED
````diff
@@ -28,7 +28,7 @@ export HF_HUB_ENABLE_HF_TRANSFER=1
 huggingface-cli download --local-dir Llama-2-7b-chat-mlx mlx-llama/Llama-2-7b-chat-mlx
 
 # Run example
-python mlx-examples/llama/llama.py Llama-2-7b-chat-mlx/ Llama-2-7b-chat-mlx/tokenizer.model
+python mlx-examples/llama/llama.py --prompt "My name is " Llama-2-7b-chat-mlx/ Llama-2-7b-chat-mlx/tokenizer.model
 ```
 
 Please, refer to the [original model card](https://huggingface.co/meta-llama/Llama-2-7b-chat) for details on Llama 2.
````