# excom-ai-demo/requirements.txt
fastapi>=0.111
uvicorn[standard]>=0.30
gradio>=4.38
requests>=2.31
# vLLM ↔ Torch version pair (CUDA 12.1 wheels)
vllm==0.5.5
--extra-index-url https://download.pytorch.org/whl/cu121
torch==2.4.0
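# Note (assumes the Space/host exposes a CUDA 12.1-compatible driver): the
# --extra-index-url option applies to this whole file; it lets pip resolve the
# "+cu121" build of torch==2.4.0 from the PyTorch wheel index alongside PyPI.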
transformers>=4.44
accelerate>=0.30
# Fix Outlines/Numba cache error in vLLM guided decoding stack
outlines==0.0.46
numba==0.59.1
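# If Numba cache errors still show up at runtime (assumption: the app runs with a
# non-writable home directory, as on some Spaces), a common workaround is to point
# Numba's cache at a writable path before vLLM is imported, e.g.:
#   NUMBA_CACHE_DIR=/tmp/numba_cache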