GPU Requirements for Inference
#42
by
Aillian
- opened
What are the GPU requirements for inference on this model?
Any answer?
I am currently running the exl2 5.0bpw quantization with a 64k context, and it uses 90.72 GB of VRAM.
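For anyone trying to size their own setup: a rough rule of thumb is that quantized weights take roughly (parameter count × bits-per-weight / 8) bytes, and the FP16 KV cache scales with layers × KV heads × head dim × context length. The sketch below is a back-of-the-envelope estimator, not an exact measurement; all parameter values in the usage comments are hypothetical, and real usage adds framework overhead and activation buffers on top.

```python
def weights_gb(n_params_billions: float, bpw: float) -> float:
    """Approximate quantized weight footprint in GB:
    params * bits-per-weight / 8 bytes."""
    return n_params_billions * 1e9 * bpw / 8 / 1e9

def kv_cache_gb(n_layers: int, n_kv_heads: int, head_dim: int,
                context_len: int, bytes_per_elem: int = 2) -> float:
    """Approximate FP16 KV-cache footprint in GB:
    2 (K and V) * layers * kv_heads * head_dim * context * bytes."""
    return 2 * n_layers * n_kv_heads * head_dim * context_len * bytes_per_elem / 1e9

# Hypothetical example: a 70B-parameter model at 5.0 bpw
# with 80 layers, 8 KV heads, head dim 128, and a 64k context.
w = weights_gb(70, 5.0)                 # ~43.8 GB of weights
kv = kv_cache_gb(80, 8, 128, 65536)     # ~21.5 GB of KV cache
print(f"weights ~{w:.1f} GB, kv cache ~{kv:.1f} GB, total ~{w + kv:.1f} GB")
```

Actual numbers depend on the specific architecture (GQA ratio, cache quantization, etc.), so treat this as a lower bound and leave a few GB of headroom.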
Hi, this issue looks resolved, so I'm closing it, but feel free to reopen if you're still facing any related issues.
shivi
changed discussion status to
closed