NiWaRe committed
Commit 1ec8d33 · 1 Parent(s): f647629

adapt readme to fit HF

Files changed (1):
  1. README.md +69 -275
README.md CHANGED
@@ -1,314 +1,108 @@
- <p align="center">
-   <picture>
-     <source media="(prefers-color-scheme: dark)" srcset="https://raw.githubusercontent.com/wandb/wandb/main/assets/logo-dark.svg">
-     <source media="(prefers-color-scheme: light)" srcset="https://raw.githubusercontent.com/wandb/wandb/main/assets/logo-light.svg">
-     <img src="https://raw.githubusercontent.com/wandb/wandb/main/assets/logo-light.svg" width="600" alt="Weights & Biases">
-   </picture>
- </p>
-

  # Weights & Biases MCP Server

- A Model Context Protocol (MCP) server for querying [Weights & Biases](https://www.wandb.ai/) data. This server allows an MCP client to:
-
- - query W&B Models runs and sweeps
- - query W&B Weave traces, evaluations and datasets
- - query [wandbot](https://github.com/wandb/wandbot), the W&B support agent, for general W&B feature questions
- - write text and charts to W&B Reports
-
- ## Installation
-
- ### 1. Install `uv`
-
- Please first install [`uv`](https://docs.astral.sh/uv/getting-started/installation/) with either:
-
- ```bash
- curl -LsSf https://astral.sh/uv/install.sh | sh
- ```
-
- or
-
- ```bash
- brew install uv
- ```
-
- ### 2. Install on your MCP client of choice
-
- ### Cursor, project-only
- Enable the server for a specific project. Run the following in the root of your project dir:
-
- ```bash
- uvx --from git+https://github.com/wandb/wandb-mcp-server -- add_to_client --config_path .cursor/mcp.json && uvx wandb login
- ```
-
- ### Cursor, global
- Enable the server for all Cursor projects; this can be run from any directory:
-
- ```bash
- uvx --from git+https://github.com/wandb/wandb-mcp-server -- add_to_client --config_path ~/.cursor/mcp.json && uvx wandb login
- ```
-
- ### Windsurf
-
- ```bash
- uvx --from git+https://github.com/wandb/wandb-mcp-server -- add_to_client --config_path ~/.codeium/windsurf/mcp_config.json && uvx wandb login
- ```
-
- ### Claude Code
-
- ```bash
- claude mcp add wandb -- uvx --from git+https://github.com/wandb/wandb-mcp-server wandb_mcp_server && uvx wandb login
- ```
-
- Passing an environment variable to Claude Code, e.g. an API key:
-
- ```bash
- claude mcp add wandb -e WANDB_API_KEY=your-api-key -- uvx --from git+https://github.com/wandb/wandb-mcp-server wandb_mcp_server
- ```
-
- ### Claude Desktop
- First ensure `uv` is installed; you might have to install it with `homebrew` even if `uv` is already available in your terminal. Then run the below:
-
- ```bash
- uvx --from git+https://github.com/wandb/wandb-mcp-server -- add_to_client --config_path "~/Library/Application Support/Claude/claude_desktop_config.json" && uvx wandb login
- ```
-
- ### Manual Installation
- 1. Ensure you have `uv` installed; see the installation instructions above.
- 2. Get your W&B API key [here](https://www.wandb.ai/authorize).
- 3. Add the following to your MCP client config manually:
-
- ```json
- {
-   "mcpServers": {
-     "wandb": {
-       "command": "uvx",
-       "args": [
-         "--from",
-         "git+https://github.com/wandb/wandb-mcp-server",
-         "wandb_mcp_server"
-       ],
-       "env": {
-         "WANDB_API_KEY": "<insert your wandb key>"
-       }
-     }
-   }
- }
- ```
-
- These helper utilities are inspired by the OpenMCP Server Registry [add-to-client pattern](https://www.open-mcp.org/servers).
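What `add_to_client` effectively does can be sketched in a few lines of Python: merge a `wandb` entry, shaped like the JSON snippet above, into the client's config file. This is an illustrative helper, not the server's actual implementation; the path and key below are placeholders.

```python
import json
from pathlib import Path


def add_wandb_server(config_path: str, api_key: str) -> dict:
    """Merge the wandb MCP server entry into an existing MCP client config."""
    path = Path(config_path).expanduser()
    # Preserve any servers already configured in the file.
    config = json.loads(path.read_text()) if path.exists() else {}
    config.setdefault("mcpServers", {})["wandb"] = {
        "command": "uvx",
        "args": [
            "--from",
            "git+https://github.com/wandb/wandb-mcp-server",
            "wandb_mcp_server",
        ],
        "env": {"WANDB_API_KEY": api_key},
    }
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(config, indent=2))
    return config
```

For example, `add_wandb_server(".cursor/mcp.json", "your-api-key")` would produce the same config as the manual snippet above.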
- ## Available MCP tools
-
- ### 1. wandb
- - **`query_wandb_tool`** Execute queries against wandb experiment tracking data, including Runs & Sweeps.
-
- ### 2. weave
- - **`query_weave_traces_tool`** Queries Weave evaluations and traces with powerful filtering, sorting, and pagination options. Returns either complete trace data or just metadata to avoid overwhelming the LLM context window.
-
- - **`count_weave_traces_tool`** Efficiently counts Weave traces matching given filters without returning the trace data. Returns both the total trace count and the root trace count, to help understand project scope before querying.
-
- ### 3. W&B Support agent
- - **`query_wandb_support_bot`** Connect your client to [wandbot](https://github.com/wandb/wandbot), our RAG-powered support agent, for general help on how to use Weights & Biases products and features.
-
- ### 4. Saving Analysis
- - **`create_wandb_report_tool`** Creates a new W&B Report with markdown text and HTML-rendered visualizations. Provides a permanent, shareable document for saving analysis findings and generated charts.
-
- ### 5. General W&B helpers
- - **`query_wandb_entity_projects`** List the available W&B entities and projects, giving the LLM more context for writing correct queries with the tools above.
-
- ## Usage tips
-
- #### Provide your W&B project and entity name
-
- LLMs are not mind readers; make sure you specify the W&B entity and W&B project to the LLM. Example query for Claude Desktop:
-
- ```markdown
- how many openai.chat traces in the wandb-applied-ai-team/mcp-tests weave project? plot the most recent 5 traces over time and save to a report
- ```
-
- #### Avoid asking overly broad questions
-
- Questions such as "what is my best evaluation?" are probably too broad; you'll get to an answer faster by refining your question to be more specific, such as: "which eval had the highest f1 score?"
-
- #### Ensure all data was retrieved
-
- When asking broad, general questions such as "what are my best performing runs/evaluations?" it's always a good idea to ask the LLM to check that it retrieved all the available runs. The MCP tools are designed to fetch the correct amount of data, but LLMs can sometimes tend to retrieve only the latest runs or the last N runs.
-
- ## Advanced
-
- ### Writing environment variables to the config file
-
- The `add_to_client` function accepts a number of flags for writing optional environment variables to the server's config file. Below is an example setting env variables that don't have dedicated flags:
-
- ```bash
- # Write the server config file with additional env vars
- uvx --from git+https://github.com/wandb/wandb-mcp-server -- add_to_client \
-   --config_path ~/.codeium/windsurf/mcp_config.json \
-   --write_env_vars MCP_LOGS_WANDB_ENTITY=my_wandb_entity
-
- # Then login to W&B
- uvx wandb login
- ```
-
- Arguments passed to `--write_env_vars` must be space-separated, and the key and value of each env variable must be separated only by a `=`.
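The expected `KEY=VALUE` format can be illustrated with a small parsing sketch (a hypothetical parser for illustration, not the server's actual code):

```python
def parse_env_var_args(pairs: list[str]) -> dict[str, str]:
    """Parse space-separated KEY=VALUE arguments into a dict of env vars."""
    env = {}
    for pair in pairs:
        # Split on the first '=' only, so values may themselves contain '='.
        key, sep, value = pair.partition("=")
        if not sep or not key:
            raise ValueError(f"expected KEY=VALUE, got: {pair!r}")
        env[key] = value
    return env
```

So `parse_env_var_args(["MCP_LOGS_WANDB_ENTITY=my_wandb_entity"])` yields `{"MCP_LOGS_WANDB_ENTITY": "my_wandb_entity"}`.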
-
- ### Running from Source
-
- Run the server from source by running the following in the root dir:
-
- ```bash
- wandb login && uv run src/wandb_mcp_server/server.py
- ```
-
- ### Transport Options
-
- The server supports two transport modes. You can specify these options when running from source:
-
- #### Local MCP Client Communication (default)
- For standard MCP client integration (Cursor, Claude Desktop, etc.), use the default stdio transport:
-
- ```bash
- # Default - uses stdio transport (same as the Running from Source section)
- uv run src/wandb_mcp_server/server.py
-
- # Explicit stdio transport
- uv run src/wandb_mcp_server/server.py --transport stdio
- ```
-
- #### HTTP Server Transport (SSE)
- For remote access or web-based applications that need HTTP connectivity via Server-Sent Events:
-
- ```bash
- # HTTP server on default port 8080
- uv run src/wandb_mcp_server/server.py --transport http
-
- # HTTP server on custom port
- uv run src/wandb_mcp_server/server.py --transport http --port 9090
-
- # HTTP server accessible from any IP
- uv run src/wandb_mcp_server/server.py --transport http --host 0.0.0.0 --port 8080
- ```
-
- **Available Options:**
- - `--transport`: Choose `stdio` (default) for local MCP clients or `http` for an HTTP server with SSE
- - `--port`: Port number for the HTTP server (defaults to 8080 when using HTTP transport)
- - `--host`: Host to bind the HTTP server to (defaults to `localhost`)
-
- **Note:** The HTTP transport uses the streamable HTTP protocol for bidirectional communication. No additional dependencies are required.
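For reference, the documented flags and defaults behave as if parsed like this (a minimal `argparse` sketch mirroring the option table above, not the server's actual CLI code):

```python
import argparse


def build_arg_parser() -> argparse.ArgumentParser:
    """CLI flags mirroring the documented transport options."""
    parser = argparse.ArgumentParser(description="W&B MCP server (sketch)")
    parser.add_argument(
        "--transport",
        choices=["stdio", "http"],
        default="stdio",
        help="stdio for local MCP clients, http for an HTTP server with SSE",
    )
    parser.add_argument(
        "--port", type=int, default=8080, help="port for the HTTP server"
    )
    parser.add_argument(
        "--host", default="localhost", help="host to bind the HTTP server to"
    )
    return parser
```

With no arguments this resolves to `stdio` transport; `--transport http --port 9090` switches to the HTTP server on port 9090, matching the examples above.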
-
- #### Using with Chat Applications via ngrok
-
- To use the HTTP server with external chat applications like Mistral's le Chat, you can expose it publicly using ngrok:
-
- 1. **Install ngrok** (if not already installed):
-    ```bash
-    # macOS
-    brew install ngrok
-
-    # Or download from https://ngrok.com/download
-    ```
-
- 2. **Start the MCP server** on HTTP transport:
-    ```bash
-    uv run src/wandb_mcp_server/server.py --transport http --port 8080 --wandb_api_key your_wandb_key
-    ```
-
- 3. **In a new terminal, expose the server** with ngrok:
-    ```bash
-    ngrok http 8080
-    ```
-
- 4. **Copy the public URL** from the ngrok output (e.g., `https://abc123.ngrok.io`)
-
- 5. **Configure your chat application** to use the MCP server:
-    - **Mistral le Chat**: Add the ngrok URL + `/mcp` as the MCP server endpoint
-    - **Other chat apps**: Use the ngrok URL + `/mcp` for MCP connections
-    - **Example**: `https://abc123.ngrok.io/mcp`
-
- **Example ngrok output:**
- ```
- Session Status                online
- Account                       your-account (Plan: Free)
- Version                       3.0.0
- Region                        United States (us)
- Forwarding                    https://abc123.ngrok.io -> http://localhost:8080
- ```
-
- Use `https://abc123.ngrok.io/mcp` as your MCP server endpoint in chat applications.
-
- ### Environment Variables
-
- The full list of environment variables used to control the server's settings can be found in the `.env.example` file.
-
- ## Troubleshooting
-
- ### Authentication
-
- Ensure the machine running the MCP server is authenticated to Weights & Biases, either by setting the `WANDB_API_KEY` or by running the below to add the key to the `.netrc` file:
-
- ```bash
- uvx wandb login
- ```
-
- ### Error: spawn uv ENOENT
-
- If you encounter an error like this when starting the MCP server:
- ```
- Error: spawn uv ENOENT
- ```
-
- it indicates that the `uv` package manager cannot be found. Fix this with these steps:
-
- 1. Install `uv` using the official installation script:
-    ```bash
-    curl -LsSf https://astral.sh/uv/install.sh | sh
-    ```
-
-    or, if using a Mac:
-
-    ```bash
-    brew install uv
-    ```
-
- 2. If the error persists after installation, create a symlink to make `uv` available system-wide:
-    ```bash
-    sudo ln -s ~/.local/bin/uv /usr/local/bin/uv
-    ```
-
- 3. Restart your application or IDE after making these changes.
-
- This ensures that the `uv` executable is accessible from the standard system paths that are typically included in the PATH for all processes.
-
- ## Testing
-
- The tests include a mix of unit tests and integration tests that test the tool-calling reliability of an LLM. For now, the integration tests only use claude-sonnet-3.7.
-
- #### Set LLM provider API key
-
- Set the appropriate API key in the `.env` file, e.g.
-
- ```
- ANTHROPIC_API_KEY=<my_key>
- ```
-
- #### Run 1 test file
-
- Run a single test file using pytest with 10 workers:
- ```
- uv run pytest -s -n 10 tests/test_query_wandb_gql.py
- ```
-
- #### Test debugging
-
- Turn on debug logging for a single sample in one test file:
-
- ```
- pytest -s -n 1 "tests/test_query_weave_traces.py::test_query_weave_trace[longest_eval_most_expensive_child]" -v --log-cli-level=DEBUG
- ```
+ ---
+ title: Weights & Biases MCP Server
+ emoji: 🏋️‍♂️
+ colorFrom: yellow
+ colorTo: gray
+ sdk: docker
+ app_file: app.py
+ pinned: false
+ ---

  # Weights & Biases MCP Server

+ A [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) server for querying [Weights & Biases](https://www.wandb.ai/) data, hosted on HuggingFace Spaces.
+
+ This server allows MCP clients to:
+ - 📊 Query W&B Models runs and sweeps
+ - 🔍 Query W&B Weave traces, evaluations and datasets
+ - 🤖 Query [wandbot](https://github.com/wandb/wandbot), the W&B support agent
+ - 📝 Write text and charts to W&B Reports
+
+ ## 🚀 Quick Start
+
+ ### 1. Get Your W&B API Key
+ Get your API key from [wandb.ai/authorize](https://wandb.ai/authorize).
+
+ ### 2. Configure Environment Variables
+ ⚠️ **Important**: You must set your `WANDB_API_KEY` in the Space settings under "Variables and secrets" for this to work.
+
+ ### 3. Use the MCP Server
+ The server runs on HTTP transport with Server-Sent Events (SSE) at:
+ ```
+ https://huggingface.co/spaces/[your-username]/[space-name]/mcp
+ ```
+
+ ## 🔧 Available MCP Tools
+
+ ### W&B Models
+ - **`query_wandb_tool`**: Execute GraphQL queries against W&B experiment tracking data
+
+ ### W&B Weave
+ - **`query_weave_traces_tool`**: Query Weave evaluations and traces with filtering and pagination
+ - **`count_weave_traces_tool`**: Count traces matching filters without returning data
+
+ ### Support & Reports
+ - **`query_wandb_support_bot`**: Get help from wandbot, the W&B support agent
+ - **`create_wandb_report_tool`**: Create W&B Reports with markdown and visualizations
+ - **`query_wandb_entity_projects`**: List available W&B entities and projects
+
+ ## 🖥️ Using with MCP Clients
+
+ ### Mistral le Chat
+ 1. Go to your chat interface
+ 2. Add an MCP server with the URL: `https://huggingface.co/spaces/[your-username]/[space-name]/mcp`
+ 3. Start querying your W&B data!
+
+ ### Other MCP Clients
+ Use the endpoint with any MCP-compatible client that supports HTTP transport.
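As a sketch, a Python client might connect like this. The connection code assumes the official `mcp` Python SDK and its SSE client helper, and the username/space name are placeholders; treat it as an illustration rather than a verified recipe.

```python
import asyncio


def space_mcp_endpoint(username: str, space_name: str) -> str:
    """Build the MCP endpoint URL for a HuggingFace Space."""
    return f"https://huggingface.co/spaces/{username}/{space_name}/mcp"


async def list_wandb_tools(endpoint: str) -> list[str]:
    """Connect over SSE and list the tools the server exposes."""
    # Imported lazily so the URL helper above works without the SDK installed.
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async with sse_client(endpoint) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            return [tool.name for tool in tools.tools]


# Example (requires the Space to be running):
# asyncio.run(list_wandb_tools(space_mcp_endpoint("my-username", "wandb-mcp-server")))
```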
 
58
 
59
+ ## πŸ“ Example Queries
60
 
 
 
61
  ```
62
+ How many openai.chat traces are in my wandb-team/my-project weave project?
 
 
 
 
63
  ```
64
 
 
 
 
 
 
65
  ```
66
+ Show me the latest 10 runs from my experiment tracking project and create a report with the results.
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
67
  ```
68
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
69
  ```
70
+ What's the best performing model in my latest sweep? Plot the results.
 
 
 
 
 
 
 
 
71
  ```
72
 
73
+ ## βš™οΈ Configuration
74
 
75
+ ### Required Environment Variables
76
+ - `WANDB_API_KEY`: Your Weights & Biases API key (set as Secret in Space settings)
77
 
78
+ ### Optional Environment Variables
79
+ - `MCP_SERVER_LOG_LEVEL`: Set to `DEBUG` for verbose logging (default: `WARNING`)
80
+ - `PORT`: Server port (automatically set by HuggingFace Spaces)
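How these variables resolve can be sketched as follows (an illustrative helper, not the server's actual startup code; the fallback port of 8080 is an assumption based on the HTTP transport default):

```python
import os


def server_settings(environ=os.environ) -> dict:
    """Resolve server settings from env vars with the documented defaults."""
    api_key = environ.get("WANDB_API_KEY")
    if not api_key:
        # WANDB_API_KEY is required; on Spaces it is set as a secret.
        raise RuntimeError("WANDB_API_KEY must be set (add it as a Space secret)")
    return {
        "api_key": api_key,
        "log_level": environ.get("MCP_SERVER_LOG_LEVEL", "WARNING"),
        "port": int(environ.get("PORT", "8080")),
    }
```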
81
 
82
+ ## πŸ” Troubleshooting
 
 
83
 
84
+ ### Authentication Issues
85
+ - Ensure your `WANDB_API_KEY` is set correctly in the Space environment variables
86
+ - Verify your API key is valid at [wandb.ai/authorize](https://wandb.ai/authorize)
 
 
 
 
 
 
 
87
 
88
+ ### Connection Issues
89
+ - Make sure you're using the correct endpoint with `/mcp` suffix
90
+ - Check that the Space is running and not in a crashed state
 
 
 
91
 
92
+ ### Query Issues
93
+ - Always specify your W&B entity and project name in queries
94
+ - Be specific rather than overly broad in your questions
95
+ - Verify you have access to the projects you're querying
96
 
97
+ ## πŸ“š Resources
98
 
99
+ - [Model Context Protocol Documentation](https://modelcontextprotocol.io/)
100
+ - [Weights & Biases Documentation](https://docs.wandb.ai/)
101
+ - [W&B Weave Documentation](https://weave-docs.wandb.ai/)
102
+ - [Source Code](https://github.com/wandb/wandb-mcp-server)
103
 
104
+ ## πŸ“„ License
105
 
106
+ This project is licensed under the Apache License 2.0.
 
 
 
 
 
 
107
 
108
+ Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference