"OutOfMemoryError: CUDA out of memory"

#21
by Anuraag-pal - opened

(Screenshot attached: 2023-11-22 at 9.51.44 PM)

Llama 2 7B runs fine with llama_index, but when I query the Mistral 7B variant it throws this error.

I'm facing the same error. Have you found a solution yet?
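For context, a back-of-the-envelope estimate of the GPU memory needed just to hold a 7B-parameter model's weights (ignoring activations, the KV cache, and framework overhead, so the real footprint is higher) shows why a card that fits one 7B model in a quantized format can still run out of memory loading another at full precision:

```python
# Rough weight-only memory estimate for a 7B-parameter model.
# Illustrative sketch: ignores activations, KV cache, and runtime overhead.
PARAMS = 7_000_000_000

def weight_gib(bytes_per_param: float) -> float:
    """GiB required to store the weights at a given precision."""
    return PARAMS * bytes_per_param / 1024**3

for name, bpp in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("4-bit", 0.5)]:
    print(f"{name:>9}: ~{weight_gib(bpp):.1f} GiB")
# fp32 ~26.1 GiB, fp16 ~13.0 GiB, int8 ~6.5 GiB, 4-bit ~3.3 GiB
```

If the two models are being loaded at different precisions (or both kept resident at once), that alone can explain the OOM; loading the model quantized (e.g. 8-bit or 4-bit) or freeing the first model before loading the second are common workarounds.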
