Unable to run with vLLM
#6 opened about 13 hours ago by rak-r
Update config.json
#5 opened about 18 hours ago by prince-canuma
Run with llamacpp
#4 opened about 23 hours ago by AbacusGauge
Congrats!
#3 opened 1 day ago by CyborgPaloma
Adding `transformers` as the library name
#2 opened 4 days ago by ariG23498
How to run inference without vLLM? (e.g., with standard transformers)
#1 opened 4 days ago by Fezz04