No one can run this if it's 64GB and not less than 16GB... why would anyone do this here?

#1
by akos2 - opened


Alibaba-PAI org

The 64GB refers to storage space requirements. The model architecture size (parameter count) remains identical to Wan2.2. If your system can successfully run Wan2.2, it will be fully compatible with this model's compute and memory demands.

Alibaba-PAI org

In fact, when the model is run with sequential CPU offloading, 16GB is also fully sufficient for generating 480p video at 81 frames, because each layer of the model is only loaded onto the GPU while that specific layer is being used.
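A toy sketch (not the actual Wan/VideoX-Fun code; the layer count and sizes are made-up illustrative numbers) of why sequential offloading caps peak GPU memory at roughly one layer rather than the whole checkpoint:

```python
# Illustrative only: simulates GPU memory use with and without
# sequential CPU offloading. Sizes in MB are hypothetical.
LAYER_SIZES_MB = [700] * 40  # pretend: 40 transformer blocks, ~28 GB total

def peak_gpu_mb(sequential_offload: bool) -> int:
    resident = 0  # MB currently on the GPU
    peak = 0
    for size in LAYER_SIZES_MB:
        resident += size              # load this layer onto the GPU
        peak = max(peak, resident)
        if sequential_offload:
            resident -= size          # offload it back to CPU after use
    return peak

print(peak_gpu_mb(False))  # 28000 -> the whole 28 GB model resident at once
print(peak_gpu_mb(True))   # 700   -> only one layer resident at a time
```

The trade-off is speed: every layer crosses the PCIe bus on every forward pass, which is why offloaded generation is much slower than keeping everything in VRAM.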

Alibaba-PAI org

If you still feel that the computational power is insufficient, you can wait for our 5B version, which will be released very soon.

The high-noise and low-noise models are 28GB each... so nope, it won't run on 32GB RAM Macs or 16GB RAM PCs... not even an RTX 5090 will be able to run a 28GB model...

akos2 changed discussion status to closed
akos2 changed discussion status to open
Alibaba-PAI org

Macs are indeed not very suitable for this, but the 5090 should work fine. You can check out KJ's Wan nodes — he provides a solution that saves GPU memory.
https://github.com/kijai/ComfyUI-WanVideoWrapper

If you're going to run the code directly, you can also use the official VideoX-Fun code, which can be run with sequential_cpu_offload.
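For the diffusers-style route, enabling the offload typically looks like the sketch below. The model id and the generic `DiffusionPipeline` class here are assumptions for illustration; check the VideoX-Fun repo for the exact entry point and checkpoint name.

```python
def load_pipeline_low_vram(model_id: str = "Wan-AI/Wan2.2-T2V-A14B-Diffusers"):
    """Hypothetical sketch: build a pipeline with sequential CPU offloading.

    The default model_id is an assumption; substitute the checkpoint
    you actually downloaded.
    """
    import torch
    from diffusers import DiffusionPipeline

    pipe = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
    # Each submodule is moved to the GPU only while it executes, then back
    # to CPU, so peak VRAM stays far below the full checkpoint size.
    pipe.enable_sequential_cpu_offload()
    return pipe
```

`enable_sequential_cpu_offload()` trades speed for memory; diffusers also offers the faster, less aggressive `enable_model_cpu_offload()` if a whole sub-model fits in VRAM at once.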

Alibaba-PAI org

No matter what, please don't be too pessimistic — there are many solutions available on the internet; it just takes some time to explore and understand them.

No solutions around... what solutions for Mac do you offer? I use Pinokio and mostly the Draw Things app from liuliu on a Mac M2. If your HN and LN models were around 12GB or 16GB, then a 32GB RAM Mac would run them no problem. But at 28GB each, there is no Mac or PC that can run this, not even an RTX 4090 or 5090, which have merely 24GB of VRAM... 28GB times two will never fit into VRAM, and using the CPU is useless, since you have to wait for hours on CPU versus minutes when everything is in VRAM, and even then it only renders a few seconds, or a few minutes, not hours.

Out of 1000 PC users, you would be lucky to find one who has more than 32GB of RAM; same with Mac users. This project has no real-life application unless you work at an IT company with powerful graphics cards. In addition, 5B models are a farce and have no use unless you are a nine-year-old girl seeing this for the first time.
