Tool Calling Guide
Your ExCom AI deployment supports tool calling! However, there's a quirk with vLLM that requires a workaround.
The Issue
vLLM requires the --enable-auto-tool-choice and --tool-call-parser flags in order to accept the tool_choice: "auto" parameter. Since Qwen 2.5 has tool calling built into the model itself, this deployment doesn't use those flags.
Result: LangChain's default agent framework sends tool_choice: "auto", which vLLM rejects with a 400 error.
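To make the failure concrete, here is a minimal sketch of the request shape that triggers it. The base URL and model name are the ones used in the working example later in this guide; the calculator schema is just an illustrative stand-in for whatever tools your agent registers.

```python
import openai
from openai import OpenAI

client = OpenAI(
    base_url="https://plarnholt-excom-ai-demo.hf.space/v1",
    api_key="not-needed",
)

# Any tool schema will do for reproducing the error; this one is illustrative.
tools = [{
    "type": "function",
    "function": {
        "name": "calculator",
        "description": "Evaluate a math expression",
        "parameters": {
            "type": "object",
            "properties": {"expression": {"type": "string"}},
        },
    },
}]

# This mirrors what LangChain's agent framework sends. Because the server
# was not launched with --enable-auto-tool-choice, vLLM answers with a 400
# error, which the SDK surfaces as a BadRequestError.
try:
    client.chat.completions.create(
        model="excom-ai",
        messages=[{"role": "user", "content": "What is 15 * 23 + 100?"}],
        tools=tools,
        tool_choice="auto",  # <-- the parameter vLLM rejects
    )
except openai.BadRequestError as exc:
    print("Rejected:", exc)
```

Drop the tool_choice line and the same request goes through, which is exactly what the script below does.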
Solution: Use OpenAI SDK Directly
I've created simple_tool_chat.py, which uses the OpenAI SDK directly and doesn't send tool_choice.
Installation
```bash
pip install openai
```
Usage
```bash
python simple_tool_chat.py
```
Example Session
```
You: What is 15 * 23 + 100?
🔧 Calling tool: calculator({'expression': '15 * 23 + 100'})
Assistant: The result is 445.

You: What's the weather in Paris and what time is it?
🔧 Calling tool: get_weather({'city': 'Paris'})
🔧 Calling tool: get_current_time({})
Assistant: The weather in Paris is 18°C and sunny. The current time is 2025-10-09 18:30:45.
```
How It Works
- No tool_choice parameter - We don't send tool_choice at all
- Qwen decides naturally - The model's training handles when to use tools
- OpenAI SDK - Direct HTTP calls to your vLLM endpoint
- Multi-turn - Maintains conversation history for context (see the loop sketch below)
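A minimal sketch of that flow, using the calculator tool from the example session above. The real simple_tool_chat.py may differ in details, but the idea is the same: call the model without tool_choice, run any tools it asks for, append the results as "tool" messages, and call again.

```python
import json
from openai import OpenAI

client = OpenAI(
    base_url="https://plarnholt-excom-ai-demo.hf.space/v1",
    api_key="not-needed",
)

tools = [{
    "type": "function",
    "function": {
        "name": "calculator",
        "description": "Evaluate a math expression",
        "parameters": {
            "type": "object",
            "properties": {"expression": {"type": "string"}},
            "required": ["expression"],
        },
    },
}]

def calculator(expression: str) -> str:
    # Toy implementation for illustration only.
    return str(eval(expression))

messages = [{"role": "user", "content": "What is 15 * 23 + 100?"}]

while True:
    response = client.chat.completions.create(
        model="excom-ai",
        messages=messages,
        tools=tools,        # note: no tool_choice anywhere
        temperature=0.4,
    )
    msg = response.choices[0].message

    if not msg.tool_calls:
        print("Assistant:", msg.content)
        break

    # Keep the assistant's tool-call turn in the history...
    messages.append(msg)
    # ...then answer each call with a "tool" role message so the model
    # can use the result on the next turn.
    for tool_call in msg.tool_calls:
        args = json.loads(tool_call.function.arguments)
        messages.append({
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": calculator(**args),
        })
```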
Using with Your Own Code
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://plarnholt-excom-ai-demo.hf.space/v1",
    api_key="not-needed"
)

# Define your tools
tools = [{
    "type": "function",
    "function": {
        "name": "my_tool",
        "description": "What it does",
        "parameters": {
            "type": "object",
            "properties": {
                "param": {"type": "string"}
            }
        }
    }
}]

# Call without tool_choice parameter
response = client.chat.completions.create(
    model="excom-ai",
    messages=[{"role": "user", "content": "Use my tool"}],
    tools=tools,
    temperature=0.4
    # NOTE: No tool_choice parameter!
)

# Check for tool calls
if response.choices[0].message.tool_calls:
    for tool_call in response.choices[0].message.tool_calls:
        print(f"Tool: {tool_call.function.name}")
        print(f"Args: {tool_call.function.arguments}")
```
Adding Custom Tools
Edit simple_tool_chat.py:
```python
# 1. Add tool definition to the 'tools' list
{
    "type": "function",
    "function": {
        "name": "my_custom_tool",
        "description": "What it does",
        "parameters": {
            "type": "object",
            "properties": {
                "param": {"type": "string", "description": "Param description"}
            },
            "required": ["param"]
        }
    }
}

# 2. Add the implementation
def my_custom_tool(param: str) -> str:
    # Your logic here
    return "result"

# 3. Add to the dispatcher
def execute_tool(tool_name: str, arguments: dict) -> str:
    # ... existing tools ...
    elif tool_name == "my_custom_tool":
        return my_custom_tool(arguments["param"])
```
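A quick way to sanity-check the new tool before a full chat session; the tool name and argument below are the hypothetical ones from the steps above.

```python
# Should print whatever my_custom_tool returns for this input.
print(execute_tool("my_custom_tool", {"param": "hello"}))
```

Then restart python simple_tool_chat.py and ask a question that should trigger the new tool.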
Troubleshooting
Error: "auto" tool choice requires --enable-auto-tool-choice
- You're using LangChain's agent framework
- Solution: Use simple_tool_chat.py instead
Tool calls not working
- Make sure your Space is running: https://huggingface.co/spaces/plarnholt/excom-ai-demo
- Check that you're not sending the tool_choice parameter
- Verify tools are properly formatted (see OpenAI docs)
500 Internal Server Error
- Space might be sleeping - make a request to wake it up
- Check Space logs for errors
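If the Space is sleeping, one lightweight way to wake it and confirm the API is reachable is to list the served models; this assumes vLLM's standard OpenAI-compatible /v1/models endpoint is exposed alongside the chat endpoint used above.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://plarnholt-excom-ai-demo.hf.space/v1",
    api_key="not-needed",
)

# A cheap request: wakes a sleeping Space and prints the model IDs it serves.
print([model.id for model in client.models.list().data])
```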