---
title: Sheikh LLM Studio
emoji: 🚀
colorFrom: blue
colorTo: purple
sdk: docker
pinned: false
---

# Sheikh LLM Studio

A full-featured FastAPI application deployed on Hugging Face Spaces, providing multi-model chat, tool integration, and model-creation workflows.

## Features

- Web UI with real-time chat experience and adjustable generation settings
- Multi-model support backed by Hugging Face gated models via `InferenceClient` (see the sketch after this list)
- Tool integration endpoints for search and code-execution prototypes
- Model Studio workflow to queue fine-tuning jobs and monitor their status
- WebSocket endpoint for streaming-style interactions
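The multi-model chat is backed by `huggingface_hub`'s `InferenceClient`. A minimal sketch of a single chat turn, assuming an example gated model id and generation parameters (the real values live in `Config.AVAILABLE_MODELS` in `app.py`):

```python
import os

from huggingface_hub import InferenceClient

# Sketch only: the model id and generation parameters are assumptions;
# the actual options are defined in Config.AVAILABLE_MODELS in app.py.
client = InferenceClient(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # example gated model
    token=os.environ["HF_TOKEN"],                 # the Space secret described below
)

response = client.chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=256,
    temperature=0.7,
)
print(response.choices[0].message.content)
```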

## Configuration

1. Add an `HF_TOKEN` repository secret in your Space with access to the desired gated models.
2. Optional: adjust the available models in `app.py` under `Config.AVAILABLE_MODELS` (see the sketch after this list).
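The exact contents of `Config` are not shown here; a hedged sketch of what the model registry might look like (the names and structure are assumptions, so check `app.py` for the real definition):

```python
import os

class Config:
    # Assumed structure: UI display name -> Hugging Face model id.
    # Edit this mapping in app.py to change which models the UI offers.
    AVAILABLE_MODELS = {
        "Llama 3 8B Instruct": "meta-llama/Meta-Llama-3-8B-Instruct",
        "Mistral 7B Instruct": "mistralai/Mistral-7B-Instruct-v0.3",
    }
    # Read from the Space secret added in step 1.
    HF_TOKEN = os.environ.get("HF_TOKEN")
```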

## Development

```bash
git clone git@hf.co:spaces/RecentCoders/sheikh-llm
cd sheikh-llm
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
uvicorn app:app --reload --port 7860
```
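With the dev server running, a quick smoke test from Python (assuming `requests` is installed and that `/health` returns JSON, which may differ in the actual app):

```python
import requests

# Hit the local dev server started with uvicorn above.
resp = requests.get("http://localhost:7860/health", timeout=10)
resp.raise_for_status()
print(resp.json())  # assumed to be a small JSON status payload
```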

## Deployment

```bash
./deploy.sh
```

After pushing, monitor the build logs on your Space and test the endpoints:

- https://recentcoders-sheikh-llm.hf.space/
- https://recentcoders-sheikh-llm.hf.space/chat
- https://recentcoders-sheikh-llm.hf.space/docs
- https://recentcoders-sheikh-llm.hf.space/health
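To exercise the streaming WebSocket endpoint mentioned under Features, a small client sketch using the `websockets` package; the `/ws` path and the message payload are assumptions here, so confirm the actual route and schema in `app.py` or at `/docs`:

```python
import asyncio
import json

import websockets  # pip install websockets

async def main():
    # "/ws" is a hypothetical route name; check app.py for the real path.
    url = "wss://recentcoders-sheikh-llm.hf.space/ws"
    async with websockets.connect(url) as ws:
        # The payload shape is an assumption, for illustration only.
        await ws.send(json.dumps({"message": "Hello from the README sketch"}))
        async for chunk in ws:
            print(chunk)

asyncio.run(main())
```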