# AI Assistants (MCP)
Install the UploadKit MCP server so Claude Code, Cursor, Windsurf, and Zed know every component, scaffold boilerplate, and set up BYOS for you.
UploadKit ships an official Model Context Protocol server. Once you add it to your AI coding assistant, the assistant gains first-class knowledge of:
- Every one of UploadKit's 40+ React components — names, categories, inspirations, ready-to-paste usage
- How to scaffold the Next.js route handler at `app/api/uploadkit/[...uploadkit]/route.ts`
- How to wire `<UploadKitProvider>` into your root layout
- How to configure Bring Your Own Storage (S3, R2, GCS, Backblaze B2)
- The right install command for your package manager
No API key. No config. Runs locally via npx.
The MCP server ships the component catalog bundled — it never calls home, never sees your code, and works offline.
## Install
### Claude Code

Add the server with one command:
```bash
claude mcp add uploadkit -- npx -y @uploadkitdev/mcp
```

Or edit `~/.claude.json` manually:
```json
{
  "mcpServers": {
    "uploadkit": {
      "command": "npx",
      "args": ["-y", "@uploadkitdev/mcp"]
    }
  }
}
```

### Cursor

Open Settings → Features → Model Context Protocol → Add new MCP server:
- Name: `uploadkit`
- Command: `npx -y @uploadkitdev/mcp`
Or edit `~/.cursor/mcp.json`:
```json
{
  "mcpServers": {
    "uploadkit": {
      "command": "npx",
      "args": ["-y", "@uploadkitdev/mcp"]
    }
  }
}
```

### Windsurf

Edit `~/.codeium/windsurf/mcp_config.json`:
```json
{
  "mcpServers": {
    "uploadkit": {
      "command": "npx",
      "args": ["-y", "@uploadkitdev/mcp"]
    }
  }
}
```

Restart Windsurf so it picks up the new server.
### Zed

In `settings.json`:
```json
{
  "context_servers": {
    "uploadkit": {
      "command": {
        "path": "npx",
        "args": ["-y", "@uploadkitdev/mcp"]
      }
    }
  }
}
```

### Continue

Add to your `config.json`:
```json
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "stdio",
          "command": "npx",
          "args": ["-y", "@uploadkitdev/mcp"]
        }
      }
    ]
  }
}
```

## Remote MCP
For clients that cannot run `npx` — ChatGPT custom connectors, Claude.ai's web UI, Smithery, and any other hosted MCP client — UploadKit also exposes a remote Streamable HTTP endpoint:
```
https://api.uploadkit.dev/api/v1/mcp
```

Streamable HTTP, stateless, no auth required for v1. Tools are read-only and share the same 11-tool surface as the stdio server.
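Because the endpoint is stateless, any HTTP client can talk to it with single JSON-RPC POSTs. A minimal TypeScript sketch, assuming only what the docs state; `jsonRpc` and `call` are illustrative helper names, not an UploadKit API, and the headers mirror the curl handshake shown further down this page:

```typescript
// Minimal JSON-RPC-over-HTTP sketch for the remote endpoint.
// Nothing here is UploadKit-specific except the URL; method names
// like "tools/list" come from the MCP specification.

type JsonRpcRequest = {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
};

const ENDPOINT = "https://api.uploadkit.dev/api/v1/mcp";

function jsonRpc(
  id: number,
  method: string,
  params?: Record<string, unknown>,
): JsonRpcRequest {
  // Omit "params" entirely when there are none, per JSON-RPC 2.0.
  return { jsonrpc: "2.0", id, method, ...(params ? { params } : {}) };
}

// The server is stateless, so each request can be a standalone POST.
async function call(req: JsonRpcRequest): Promise<unknown> {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Streamable HTTP servers may answer as plain JSON or an SSE stream.
      "Accept": "application/json, text/event-stream",
      "MCP-Protocol-Version": "2024-11-05",
    },
    body: JSON.stringify(req),
  });
  return res.json();
}

// Usage (not executed here): await call(jsonRpc(1, "tools/list"))
```

After the `initialize` handshake, `tools/list` should report the same 11 tools as the stdio server.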
### ChatGPT connector
1. Open ChatGPT → Settings → Connectors → Add custom connector.
2. Paste the URL above into the MCP Server URL field.
3. Save. The connector will handshake, discover the 11 tools, and become available in your chats.
### Claude.ai web
Paste https://api.uploadkit.dev/api/v1/mcp as a custom MCP connector wherever Claude.ai's web UI exposes the "Custom connector" input. Until the UI is GA, the stdio install below remains the recommended path for Claude Desktop and Claude Code.
### Smithery / mcp.so
Once listed on Smithery, users can install the remote endpoint from:

```
https://smithery.ai/server/io.github.drumst0ck/uploadkit
```

For IDE usage (Claude Code, Cursor, Windsurf, Zed, Continue), the stdio install (`npx -y @uploadkitdev/mcp`) documented above remains the recommended path — it runs locally, starts instantly, and requires no network.
### Test the remote endpoint
Health check (should return `{"status":"ok","version":"0.5.2","tools":11}`):

```bash
curl https://api.uploadkit.dev/api/v1/mcp/health
```

Full inspection with the official MCP Inspector (browse tools and call them interactively in a web UI):
```bash
npx @modelcontextprotocol/inspector --transport streamable-http https://api.uploadkit.dev/api/v1/mcp
```

Raw JSON-RPC handshake (no client needed):
```bash
curl -X POST https://api.uploadkit.dev/api/v1/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "MCP-Protocol-Version: 2024-11-05" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"curl","version":"0"}}}'
```

## Self-host with Docker
The repository ships a minimal Dockerfile at the root that runs the stdio server inside a container. Most users never need this (the npx install is simpler), but it's useful for:
- Containerized CI pipelines that invoke MCP tools
- Corporate networks that can't reach `registry.npmjs.org` at runtime
- Air-gapped developer environments
```bash
# Clone + build
git clone https://github.com/drumst0ck/uploadkit.git
cd uploadkit
docker build -t uploadkit-mcp .

# Run — the container speaks JSON-RPC on stdio; request.json holds any
# JSON-RPC message, e.g. the initialize payload from the curl example above
docker run -i --rm uploadkit-mcp < request.json
```

To wire it into Claude Code or Cursor as a containerized server:
```json
{
  "mcpServers": {
    "uploadkit": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "uploadkit-mcp"]
    }
  }
}
```

## Verify it's working
Ask your assistant:
"What UploadKit components are available?"
If the MCP server is connected, it will call the `list_components` tool and return the full catalog with categories (classic, dropzone, button, progress, motion, specialty, gallery).
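The same check works without an assistant by sending a raw MCP `tools/call` request to the server. A hedged sketch: the `tools/call` shape (`name` plus `arguments`) is defined by the MCP specification, but the `category` argument name is an assumption based on the tool's description, and `callTool` is an illustrative helper, not an SDK function:

```typescript
// Build an MCP tools/call request. The "tools/call" method and the
// { name, arguments } params shape come from the MCP specification;
// the "category" filter key is assumed from the tool description.

interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments?: Record<string, unknown> };
}

function callTool(
  id: number,
  name: string,
  args?: Record<string, unknown>,
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    // Omit "arguments" entirely when the tool takes none.
    params: { name, ...(args ? { arguments: args } : {}) },
  };
}

// Full catalog:
const listAll = callTool(2, "list_components");
// Filtered to one category (argument name assumed):
const dropzones = callTool(3, "list_components", { category: "dropzone" });

console.log(JSON.stringify(listAll));
console.log(JSON.stringify(dropzones));
```

Pipe the serialized request to the stdio server, or POST it to the remote endpoint as in the curl handshake above.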
## What the server exposes
### Tools
| Tool | Description |
|---|---|
| `list_components` | List all components, optionally filtered by category |
| `get_component` | Full metadata + usage example for a single component |
| `search_components` | Fuzzy-search by keyword, inspiration, or use case |
| `get_install_command` | Returns the install command for pnpm / npm / yarn / bun |
| `scaffold_route_handler` | Generates `app/api/uploadkit/[...uploadkit]/route.ts` |
| `scaffold_provider` | Snippet for wiring `<UploadKitProvider>` into `layout.tsx` |
| `get_byos_config` | Env + handler config for S3 / R2 / GCS / Backblaze B2 |
| `get_quickstart` | End-to-end setup walkthrough |
| `search_docs` | Full-text search across every docs page (88+ pages) |
| `get_doc` | Fetch the full content of a docs page by path |
| `list_docs` | Enumerate every documented page (title, description, URL) |
### Resources
| URI | Description |
|---|---|
| `uploadkit://catalog` | JSON catalog of every component |
| `uploadkit://quickstart` | Markdown quickstart guide |
| `uploadkit://docs` | Index of every docs page (title, description, URL) |
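Resources are fetched with the standard MCP `resources/read` method, passing a URI from the table above. A small sketch (`readResource` is an illustrative helper, not part of any SDK; the method name and `{ uri }` params shape come from the MCP specification):

```typescript
// Build an MCP resources/read request for one of the URIs above.

function readResource(id: number, uri: string) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "resources/read" as const,
    params: { uri },
  };
}

// e.g. request the full component catalog as JSON:
const catalogReq = readResource(4, "uploadkit://catalog");
console.log(JSON.stringify(catalogReq));
```

The response carries the resource contents (JSON for the catalog, Markdown for the quickstart) in the standard MCP `contents` envelope.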
## Example prompts
Once installed, you can speak to your assistant naturally:
- "Add an Apple-style animated dropzone to my profile page."
- "Set up UploadKit with Cloudflare R2 in BYOS mode."
- "What's the difference between UploadProgressRadial and UploadProgressOrbit?"
- "Create a route handler that accepts up to 4 images of 8 MB each."
- "Show me every progress indicator UploadKit ships."

## Why MCP
Most AI coding assistants can only see the files in your editor plus whatever they learned during pretraining. That means:
- They don't know about components shipped after their knowledge cutoff
- They invent props that don't exist (hallucinations)
- They write outdated setup code
- They can't scaffold correctly without guessing
The MCP server fixes all four. The assistant queries the authoritative source — UploadKit's own catalog and scaffolds — every time. New release? Update the server and the assistant knows the new components immediately.
## Security & privacy
- The server runs locally on your machine via stdio.
- It does not make network requests, read your codebase, or collect telemetry.
- The only thing it ships is the component catalog (names, descriptions, usage examples) bundled at publish time.
- Source: github.com/drumst0ck/uploadkit/tree/master/packages/mcp
## Troubleshooting
**"Command not found: npx"** — install Node.js 18+ from nodejs.org.

**Server fails to start** — run `npx -y @uploadkitdev/mcp` in a terminal manually. You should see the process hang (stdio waits for input). If it crashes instead, open an issue with the stderr output.

**Tools don't show up in Claude Code** — restart Claude Code after editing the config. Run `claude mcp list` to verify the server is registered.

**"Outdated catalog"** — the catalog is bundled into the server at publish time; `npx -y @uploadkitdev/mcp@latest` pulls the newest version every time.
## Resources
- Source: github.com/drumst0ck/uploadkit/tree/master/packages/mcp
- npm: @uploadkitdev/mcp
- Changelog: packages/mcp/CHANGELOG.md
- Glama listing: glama.ai/mcp/servers/drumst0ck/uploadkit (AAA)
- Official MCP Registry: `io.github.drumst0ck/uploadkit` on registry.modelcontextprotocol.io
- Issues: github.com/drumst0ck/uploadkit/issues