How to run the model?
#2 by d3m0t3p - opened
Hi,
thanks for your work, it's greatly appreciated.
I'm wondering how to use the model:
- directly from Python: I tried the "Use in Transformers" snippet at the top of the model page, but it didn't work (roughly what I ran is sketched just below this list).
- using llama.cpp, but it crashed my computer (I had to reboot).
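For reference, what I ran from Python was roughly the standard "Use in Transformers" snippet, something along these lines (the model id below is just a placeholder for this repo's actual id):

```python
# Rough sketch of the standard transformers loading path (model id is a placeholder).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "org/model-name"  # placeholder: replace with this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # needs `accelerate`; spreads the weights across GPU/CPU
    torch_dtype="auto",  # load in the checkpoint's native precision
)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```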
Any help would be appreciated,
Thx!
Use Koboldcpp; this app allows you to split the workload effectively between GPU and CPU.
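In case it helps, "splitting the workload" here means offloading part of the model's layers to the GPU and running the rest on the CPU. Below is a minimal sketch of launching KoboldCpp that way from Python; the model path, layer count, and thread count are placeholders, so check `python koboldcpp.py --help` for the options your build actually supports:

```python
# Hedged sketch: launch KoboldCpp with part of the model offloaded to the GPU.
# The path and flag values below are placeholders, not recommendations.
import subprocess

subprocess.run([
    "python", "koboldcpp.py",
    "path/to/model.gguf",  # placeholder: the quantized model file you downloaded
    "--gpulayers", "20",   # offload ~20 layers to the GPU...
    "--threads", "8",      # ...and run the remaining layers on 8 CPU threads
], check=True)
```

A common approach is to start with a low `--gpulayers` value and raise it until you hit your VRAM limit; once it's running, KoboldCpp serves a local web UI you can open in a browser to chat with the model.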