Commit 5c7bfed
Parent(s): 2bec297

Update requirements

requirements.txt CHANGED (+4 −3)

```diff
@@ -1,4 +1,5 @@
 huggingface_hub==0.25.2
-
-
-
+timm
+einops
+--index-url https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+--no-dependencies --upgrade flash_attn-2.6.3+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```
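As an aside on the change above: pip's requirements-file format also accepts a PEP 508 direct-URL requirement, which is the more common way to pin a prebuilt wheel from a release URL. A hedged sketch of an equivalent file, reusing the same wheel URL from this commit (this is an alternative form, not what the commit actually contains):

```text
huggingface_hub==0.25.2
timm
einops
flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```

With this form, `pip install -r requirements.txt` resolves the wheel directly, without relying on per-line options such as `--index-url` inside the requirements file.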