transformers==4.47.1
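# CadQuery installed from a pinned upstream commit rather than a PyPI release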
git+https://github.com/CadQuery/cadquery.git@e99a15df3cf6a88b69101c405326305b5db8ed94
torch==2.4.0
trimesh==4.5.3
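# Prebuilt flash-attn 2.7.2.post1 wheel: CUDA 12, torch 2.4, CPython 3.10, Linux x86_64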
https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.2.post1/flash_attn-2.7.2.post1+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl