Context size VRAM requirements
#18
by LapinMalin - opened
Does anyone know why this calculator reports such a large VRAM requirement for the context size compared to all the other models I've checked?
Even DeepSeek 1.3B or 33B show lower VRAM requirements for the same context size. Any explanation?
https://huggingface.co/spaces/NyxKrage/LLM-Model-VRAM-Calculator
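For context, a rough sketch of how context-size VRAM is typically estimated: the KV cache stores one key and one value vector per layer, per KV head, per token, so it scales linearly with context length and with the number of layers and KV heads. The figures below are illustrative placeholder values, not any specific model's actual config, and assume an fp16 cache with standard multi-head attention (no quantized or compressed cache):

```python
def kv_cache_bytes(num_layers: int, num_kv_heads: int, head_dim: int,
                   context_len: int, bytes_per_elem: int = 2) -> int:
    """Estimate KV-cache size: K and V each hold one vector of size
    head_dim per layer, per KV head, per cached token."""
    return 2 * num_layers * num_kv_heads * head_dim * context_len * bytes_per_elem

# Hypothetical 30B-class config (illustrative values only)
size = kv_cache_bytes(num_layers=62, num_kv_heads=56, head_dim=128,
                      context_len=4096)
print(f"{size / 2**30:.1f} GiB")
```

A model that uses grouped-query attention (fewer KV heads than query heads) needs proportionally less cache, which is one common reason two models of similar parameter count can show very different context-size VRAM in a calculator like this.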