auto_gptq==0.4.2+cu118
datasets==2.14.5
gradio==3.44.4
torch==2.0.1
transformers==4.32.0
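
A minimal usage sketch (not part of the original file) showing how these pinned packages typically fit together: auto_gptq loads a GPTQ-quantized model, transformers supplies the tokenizer, and gradio exposes a simple text interface. The model ID "TheBloke/Llama-2-7B-Chat-GPTQ" is an illustrative placeholder, not something specified by this requirements file.

# sketch.py -- assumes a CUDA device and a GPTQ-quantized model repo (placeholder ID below)
import gradio as gr
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer

model_id = "TheBloke/Llama-2-7B-Chat-GPTQ"  # placeholder, replace with your model
tokenizer = AutoTokenizer.from_pretrained(model_id)
# Load the already-quantized weights onto the GPU
model = AutoGPTQForCausalLM.from_quantized(model_id, device="cuda:0", use_safetensors=True)

def generate(prompt: str) -> str:
    # Tokenize the prompt, generate a short continuation, and decode it back to text
    inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
    output_ids = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Serve the model behind a simple text-in/text-out Gradio UI
gr.Interface(fn=generate, inputs="text", outputs="text").launch()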