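# Python dependencies. torch/torchvision are pinned to 2.4.0/0.19.0 so they
# match the prebuilt flash-attn wheel at the bottom of this file.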
gradio
numpy
torch==2.4.0
torchvision==0.19.0
Pillow
transformers
matplotlib
decord
timm
einops
accelerate
bitsandbytes
peft
tensorboardX
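# flash-attn is installed from a prebuilt wheel rather than built from source.
# Per the wheel filename, it targets CUDA 12.3, PyTorch 2.4, and CPython 3.10
# on Linux x86_64; on a different CUDA/Python/OS combination, a matching wheel
# from the flash-attention releases page (or a source build) would presumably
# be needed instead.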
flash_attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl