Do I need an 8xA100 machine to run the model?
#12 opened by seven-dev
I'm new to this, and I only have a GeForce RTX 2060. There's no way I can run this model, right?
Is an 8xA100 machine needed to train the model, or to run it?
You can run this model as long as you don't run out of VRAM.
This model runs on an NVIDIA T4 in fp16 mode, so no, an 8xA100 machine is not needed for inference.
You definitely do not need an A100 to run the example code. Try running the code example and see whether the model OOMs.
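The fp16 claim above can be sanity-checked with a back-of-envelope VRAM estimate. The parameter counts below are rough assumptions for illustration, not figures from the model card:

```python
# Rough fp16 VRAM estimate for the model weights only (activations,
# VAE, and text encoders add a few more GB on top of this).
BYTES_PER_PARAM_FP16 = 2

unet_params = 2.6e9         # assumed: approximate SDXL UNet size
controlnet_params = 1.25e9  # assumed: approximate ControlNet size

weights_gb = (unet_params + controlnet_params) * BYTES_PER_PARAM_FP16 / 1e9
print(f"fp16 weights: ~{weights_gb:.1f} GB")  # ~7.7 GB
```

Under these assumptions the weights alone fit comfortably in a T4's 16 GB, while a 6 GB RTX 2060 would be tight even before activations, which is why memory-saving options such as `pipe.enable_model_cpu_offload()` in diffusers are worth trying on smaller cards.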
Also see: https://huggingface.co/diffusers/controlnet-canny-sdxl-1.0/discussions/16#64db818d69d21c567bf02bdd
williamberman changed discussion status to closed