update readme
README.md CHANGED
@@ -1,6 +1,6 @@
 Flash Attention 3 compatible with `torch.compile`. See [this PR](https://github.com/Dao-AILab/flash-attention/pull/1769) by guilhermeleobas for more details.
 
-There is a build here for Torch 2.8.0 and a build for Torch Nightlies from
+There is a build here for Torch 2.8.0 and a build for Torch Nightlies from 08/30 onward.
 
 Reproduce:
 
@@ -31,7 +31,7 @@ export FLASH_ATTENTION_DISABLE_HDIM256=TRUE
 python setup.py bdist_wheel
 ```
 
-## Torch
+## Torch Nightlies build
 
 Compiled from `https://github.com/varunneal/flash-attention` on branch `stable`.
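For context, a minimal sketch of what the `torch.compile` compatibility from PR #1769 enables. The import path `flash_attn_interface` and the `flash_attn_func` signature are assumptions based on the FA3 ("hopper") package layout and may differ between builds; this is not taken from the README itself.

```python
import torch
# Assumed FA3 module name; some builds may expose a different import path.
from flash_attn_interface import flash_attn_func

def attn(q, k, v):
    out = flash_attn_func(q, k, v, causal=True)
    # Some FA3 versions return (out, softmax_lse); keep only the output.
    return out[0] if isinstance(out, tuple) else out

# With the patched build, this should compile without graph breaks.
compiled_attn = torch.compile(attn, fullgraph=True)

# Shapes are (batch, seqlen, nheads, headdim). This build exports
# FLASH_ATTENTION_DISABLE_HDIM256=TRUE, so stick to head dims like 64/128.
q = torch.randn(2, 1024, 8, 128, device="cuda", dtype=torch.bfloat16)
k = torch.randn_like(q)
v = torch.randn_like(q)

out = compiled_attn(q, k, v)
print(out.shape)
```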