CUDA usage is low

#28
by Max545 - opened

When I train a Gemma 2 model, GPU utilization is low (0% most of the time). But when I use the same method (LoRA, via the peft library) to train a Llama model, GPU utilization stays at roughly 100%. What could be the reason?