Update README.md
README.md CHANGED

@@ -49,10 +49,10 @@ This is RWKV-7 model under flash-linear attention format.
 ## Uses
 
 <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-Install `flash-linear-attention` and the latest version of `transformers` before using this model:
+Install `flash-linear-attention` <= 0.1.2 and the latest version of `transformers` before using this model:
 
 ```bash
-pip install
+pip install --no-use-pep517 flash-linear-attention==0.1.2
 pip install 'transformers>=4.48.0'
 ```
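Once the pinned packages are installed, the checkpoint should load through the standard `transformers` auto classes. A minimal usage sketch follows; the repo id `fla-hub/rwkv7-1.5B-world` is a placeholder assumption, not taken from this diff, and `trust_remote_code=True` is assumed to be needed for the flash-linear-attention modeling code:

```python
# Minimal sketch, assuming an RWKV-7 checkpoint in flash-linear-attention format.
# The repo id is a placeholder (assumption); substitute the actual model id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "fla-hub/rwkv7-1.5B-world"  # placeholder, not from this diff
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation to confirm the install works end to end.
inputs = tokenizer("The RWKV-7 architecture", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```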