# excom-ai-demo / requirements.txt
fastapi>=0.111
uvicorn[standard]>=0.30
gradio>=4.38
requests>=2.31
# vLLM + CUDA 12.1 (torch wheels come from the cu121 extra index below)
vllm==0.6.3.post1
--extra-index-url https://download.pytorch.org/whl/cu121
torch==2.4.0  # pinned: vLLM 0.6.3.post1 requires torch 2.4.0
transformers>=4.44
accelerate>=0.30
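
# Install sketch (an assumption, not part of the original file): this presumes
# a Linux host with a CUDA 12.1-capable GPU and a matching NVIDIA driver.
#   python -m venv .venv && source .venv/bin/activate
#   pip install -r requirements.txt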