varunneal committed
Commit 252fa2f · verified · 1 Parent(s): 8880718

update readme

Files changed (1):
  1. README.md +2 -2
README.md CHANGED
@@ -1,6 +1,6 @@
 Flash Attention 3 compatible with `torch.compile`. See [this PR](https://github.com/Dao-AILab/flash-attention/pull/1769) by guilhermeleobas for more details.
 
-There is a build here for Torch 2.8.0 and a build for Torch Nightlies from 09/10/25 (Torch 2.10) Nightlies onward.
+There is a build here for Torch 2.8.0 and a build for Torch Nightlies from 08/30 onward.
 
 Reproduce:
 
@@ -31,7 +31,7 @@ export FLASH_ATTENTION_DISABLE_HDIM256=TRUE
 python setup.py bdist_wheel
 ```
 
-## Torch 2.10 Nightlies build
+## Torch Nightlies build
 
 Compiled from `https://github.com/varunneal/flash-attention` on branch `stable`.
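For context on the change above: the README's claim is that this FA3 build works under `torch.compile`. Below is a minimal usage sketch, assuming the built wheel exposes `flash_attn_interface.flash_attn_func` (the FA3 Python entry point in the upstream hopper tree) and that a Hopper-class CUDA GPU is available; both are assumptions, not stated in this commit.

```python
# Minimal sketch of calling the FA3 kernel under torch.compile.
# Assumptions (not stated in this commit): the wheel installs the
# `flash_attn_interface` module with `flash_attn_func`, and a
# Hopper-class CUDA GPU is present.
import torch
from flash_attn_interface import flash_attn_func


def attend(q, k, v):
    # q, k, v are (batch, seqlen, nheads, headdim) tensors.
    result = flash_attn_func(q, k, v, causal=True)
    # Some FA3 interface versions return (out, softmax_lse); normalize.
    return result[0] if isinstance(result, tuple) else result


# Per the PR linked in the README, the FA3 op is registered so that
# torch.compile can trace it rather than falling back to eager.
compiled_attend = torch.compile(attend)

q, k, v = (
    torch.randn(2, 1024, 8, 128, device="cuda", dtype=torch.bfloat16)
    for _ in range(3)
)
print(compiled_attend(q, k, v).shape)  # torch.Size([2, 1024, 8, 128])
```

Head dim 128 is used here because the build recipe above disables the 256 head-dim kernels via `FLASH_ATTENTION_DISABLE_HDIM256=TRUE`.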