upgrade llama-cpp-python to 0.3.16
requirements.txt CHANGED (+1 -1)
@@ -2,7 +2,7 @@ fastapi==0.109.0
 uvicorn[standard]==0.31.1
 websockets==12.0
 python-multipart==0.0.9
-llama-cpp-python==0.
+llama-cpp-python==0.3.16
 opencc-python-reimplemented==0.1.7
 pydantic==2.11.0
 aiofiles==23.2.1
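For context, a minimal smoke-test sketch for the pinned llama-cpp-python==0.3.16 build. The model path, context size, thread count, and prompt below are placeholders for illustration, not values taken from this Space's code:

# Quick check that the upgraded llama-cpp-python wheel imports, reports
# the expected version, and can run a small completion.
# The GGUF path and settings are hypothetical, not from this repo.
from llama_cpp import Llama, __version__

print(f"llama-cpp-python version: {__version__}")  # expect 0.3.16

llm = Llama(
    model_path="./models/example-model.Q4_K_M.gguf",  # placeholder local GGUF file
    n_ctx=2048,      # context window
    n_threads=4,     # CPU threads for inference
    verbose=False,
)

# Simple completion call to confirm the upgraded build loads and runs.
out = llm("Q: What is the capital of France? A:", max_tokens=16, stop=["\n"])
print(out["choices"][0]["text"].strip())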