myContainers / serve-from-url.sh
1) wget the model (download sketch below)
2) save hostip: ip route | grep 'default' | awk '{print $9}' >hostip   (field layout noted below)
3a) calls llama-server in container (command below)
3b) calls sed + llama-server in container (sketch below)
podman run -d --net=host -v ~/funstreams:/models localhost/bookworm:server ./models/llama-server-static -m /models/qwen2-500.gguf
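
Step 1, as a minimal sketch: the URL below is only a placeholder (substitute the real GGUF download link); the target directory and filename are chosen to match the ~/funstreams mount and the -m path used in the podman command above.

# step 1 sketch -- placeholder URL, swap in the real GGUF download link
mkdir -p ~/funstreams
wget -O ~/funstreams/qwen2-500.gguf https://example.com/qwen2-0_5b-instruct-q8_0.gguf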
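
Step 2 assumes the common default-route layout where a src hint is printed; field 9 is then the host's own address. On routes without src, or with fields in a different order, $9 will be something else, so check the raw output first. Addresses below are illustrative only.

# typical line returned by: ip route | grep 'default'
#   default via 192.168.1.1 dev eth0 proto dhcp src 192.168.1.42 metric 100
#     $1    $2      $3      $4   $5    $6    $7   $8      $9       $10  $11
ip route | grep 'default' | awk '{print $9}' >hostip
cat hostip   # expected: the host's LAN address, e.g. 192.168.1.42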
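
For 3b, a rough sketch of what "sed + llama-server" could look like, assuming the saved hostip is copied into ~/funstreams and patched over a HOSTIP placeholder in some template file before the server starts; the HOSTIP token, the /models/index.html path, and the copy step are assumptions for illustration, not taken from this repo.

# 3b sketch -- HOSTIP token and /models/index.html are assumed, adjust to the real files
cp hostip ~/funstreams/
podman run -d --net=host -v ~/funstreams:/models localhost/bookworm:server \
  sh -c 'sed -i "s/HOSTIP/$(cat /models/hostip)/g" /models/index.html && exec /models/llama-server-static -m /models/qwen2-500.gguf'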