
45GB VRAM? Why over 48GB

#8 · opened by DrNicefellow

CUDA_VISIBLE_DEVICES=1 python sample_video.py --video-size 544 960 --video-length 129 --infer-steps 30 --prompt "a cat is playing in the backyard" --flow-reverse --seed 0 --save-path /src/results

+-----------------------------------------+------------------------+----------------------+
|   1  NVIDIA RTX A6000               On  |   00000000:2B:00.0 Off |                  Off |
| 60%   85C    P2            145W / 300W  |  48298MiB / 49140MiB   |    100%      Default |
|                                         |                        |                  N/A |
+-----------------------------------------+------------------------+----------------------+

It's probably exceeding the A6000's total VRAM. The model card says 45GB of VRAM....
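
For what it's worth, the nvidia-smi snapshot above only shows usage at a single instant. Below is a minimal sketch, not part of the repo, that polls the device while sample_video.py runs and reports the highest value seen; the pynvml dependency and the hard-coded GPU index 1 are my own assumptions.

```python
# Hypothetical helper, not from the repo: poll physical GPU 1 via NVML while
# the sampling script runs, and print the peak memory usage observed.
import os
import subprocess
import threading
import time

import pynvml  # pip install pynvml


def watch_gpu(index: int, stop: threading.Event, interval: float = 0.5) -> None:
    """Record the highest device-memory usage seen until `stop` is set."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(index)  # NVML uses physical indices
    peak = 0
    while not stop.is_set():
        peak = max(peak, pynvml.nvmlDeviceGetMemoryInfo(handle).used)
        time.sleep(interval)
    print(f"peak device memory: {peak / 1024**2:.0f} MiB")
    pynvml.nvmlShutdown()


stop = threading.Event()
watcher = threading.Thread(target=watch_gpu, args=(1, stop))
watcher.start()

# Same command as above, launched as a child process pinned to GPU 1.
subprocess.run(
    [
        "python", "sample_video.py",
        "--video-size", "544", "960",
        "--video-length", "129",
        "--infer-steps", "30",
        "--prompt", "a cat is playing in the backyard",
        "--flow-reverse",
        "--seed", "0",
        "--save-path", "/src/results",
    ],
    env={**os.environ, "CUDA_VISIBLE_DEVICES": "1"},
    check=False,
)

stop.set()
watcher.join()
```

Note that NVML reports device-wide usage, so the number includes the CUDA context and any other processes running on that GPU.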

That 45GB is measured after the model has started, so you need 80GB. They are only talking about the memory used while it is generating, after startup.

When you state how much VRAM it requires, you should report the peak VRAM consumption, not the usage before the model has started.
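
Agreed. As a reference point, here is a minimal sketch, hypothetical and not taken from sample_video.py, of how a peak figure could be logged from inside the script using PyTorch's allocator statistics, separating the load-time number from the generation-time number:

```python
# Hypothetical logging sketch, not part of the repo: report the PyTorch
# allocator's current and peak usage at two points, so the published number
# is the true peak during generation rather than a pre-start snapshot.
import torch


def report(stage: str, device: int = 0) -> None:
    """Print current and peak allocator usage on `device`, in GiB."""
    current = torch.cuda.memory_allocated(device) / 1024**3
    peak = torch.cuda.max_memory_allocated(device) / 1024**3
    print(f"{stage}: current {current:.1f} GiB, peak {peak:.1f} GiB")


torch.cuda.reset_peak_memory_stats()

# ... load the pipeline / weights here ...
report("after model load")

# ... run the denoising / sampling loop here ...
report("after sampling")  # this peak is the number worth putting in the model card
```

Keep in mind that nvidia-smi will usually read a few GB higher than these figures, since it also counts the CUDA context and memory the allocator has reserved but not handed out.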

I'm not part of Tencent, but it does need around 70GB of VRAM to start.
