与与 committed on
Commit f3fa7b4 · 1 parent: e6df3f2

Update requirements

Files changed (1)
  1. requirements.txt +1 -1
requirements.txt CHANGED
@@ -1,4 +1,4 @@
 numpy==1.24.3
 torch==2.2.0
 transformers==4.44.2
-flash-attn==2.6.3
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
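pip requirements files accept direct URLs to wheel files, so this commit swaps the PyPI name `flash-attn==2.6.3` for a prebuilt release wheel, avoiding a lengthy source build of the CUDA extension. The trade-off is that the wheel's filename tags now pin the environment. As a minimal sketch (the parsing helper below is an illustration, not part of the commit), the standard wheel filename convention `name-version-python-abi-platform.whl` can be unpacked to see what this wheel requires:

```python
# Parse the wheel filename from the new requirements line.
# Wheel filenames follow the convention: name-version-pythontag-abitag-platformtag.whl
wheel = "flash_attn-2.6.3+cu123torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"

name, version, python_tag, abi_tag, platform_tag = wheel[:-len(".whl")].split("-")

print(name)          # flash_attn
print(version)       # 2.6.3+cu123torch2.2cxx11abiFALSE  (local version: CUDA 12.3, torch 2.2)
print(python_tag)    # cp310  -> CPython 3.10 only
print(platform_tag)  # linux_x86_64
```

In other words, `pip install -r requirements.txt` with this line only resolves on Linux x86_64 under CPython 3.10 with a matching torch 2.2 / CUDA build; other environments would need a different wheel from the same release page or a fallback to the source distribution.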