Configure, connect, and use Model Context Protocol tools with your favorite local LLM clients.
Model Context Protocol (MCP) standardizes how AI models interact with external tools, data, and APIs. You run an MCP Server locally or remotely, and your LLM client connects to it via stdio or HTTP/SSE.
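To make the two connection styles concrete, here is a hedged sketch of how a client config commonly distinguishes them. Exact keys vary by client; the server names, the package path, and the URL are placeholders:

```json
{
  "mcpServers": {
    "local-tools": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    },
    "remote-tools": {
      "url": "https://mcp.example.com/sse"
    }
  }
}
```

The first entry launches a local process and speaks JSON-RPC over its stdin/stdout; the second connects to an already-running server over HTTP/SSE.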
LM Studio: Open Settings (⚙️) and navigate to MCP Servers. ⚠️ LM Studio v0.3.5+ is recommended; paths and UI may shift slightly between releases.
Claude Desktop: Edit claude_desktop_config.json, located at ~/Library/Application Support/Claude/claude_desktop_config.json on macOS and ~/.config/Claude/claude_desktop_config.json on Linux (on Windows, look under %APPDATA%\Claude\). 📁 Ensure the JSON is valid: Claude strictly validates on startup and will ignore malformed entries.
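As an illustration, a Claude Desktop entry that launches a local stdio server and passes it a secret might look like this. The token value is a placeholder you must supply yourself, and the package name here is an example server, not a requirement:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>"
      }
    }
  }
}
```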
Open WebUI: Supply your mcp_config.json through environment variables in docker-compose.yml. 💡 Open WebUI auto-reloads MCP configs, but you may need to toggle "Use MCP Tools" in the chat settings.
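A rough docker-compose sketch of wiring in the config file follows; the container mount point is an assumption, so check the Open WebUI documentation for the exact path and variable names your version expects:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    volumes:
      # Hypothetical mount point; verify against your Open WebUI version's docs.
      - ./mcp_config.json:/app/backend/data/mcp_config.json
```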
Cursor: Create a .cursor folder in your workspace root (if it doesn't exist), add an mcp.json file inside it, and paste the generated JSON. Reload with CMD/CTRL + SHIFT + P → Cursor: Reload Window. 🔒 Workspace-scoped config means each project can have its own MCP tools, which is great for repo-specific automation.
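For illustration, a minimal .cursor/mcp.json might look like this; the server name and package are placeholders standing in for whatever the generator produces:

```json
{
  "mcpServers": {
    "project-tools": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```

Because the file lives inside the workspace, each repository can ship its own tool set.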
Continue: Open ⚙️ Settings and add an mcpServers array/object to your config.json (usually ~/.continue/config.json). 🛠️ Continue supports both stdio and streamable-http transports; use stdio for local Node/Python servers.
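Following the mcpServers array shape described above, an entry might look like the sketch below; the field names and the my_mcp_server module are illustrative, so check Continue's documentation for the exact schema your version expects:

```json
{
  "mcpServers": [
    {
      "name": "local-tools",
      "command": "python",
      "args": ["-m", "my_mcp_server"]
    }
  ]
}
```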
Other clients: Most MCP-capable clients accept a config file (mcp.json, config.json, or a settings UI) using the "mcpServers" key format shown in the generator below. Use the stdio transport for local servers and sse or http for remote ones.
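Since several clients silently ignore malformed config rather than erroring, a quick sanity check that your file parses and contains the expected top-level key can save a restart loop. This is a small sketch, not part of any client; the filename is whatever your client uses:

```python
import json
import sys

def check_mcp_config(path: str) -> dict:
    """Parse an MCP config file and confirm it has a non-empty 'mcpServers' key."""
    with open(path) as f:
        config = json.load(f)  # raises json.JSONDecodeError on malformed JSON
    servers = config.get("mcpServers")
    if not servers:
        sys.exit(f"{path}: no 'mcpServers' entry found")
    print(f"OK: {len(servers)} server(s) configured: {', '.join(servers)}")
    return config
```

Run it against the file before restarting your client (for example, `check_mcp_config("mcp.json")`); a clean parse plus a non-empty mcpServers key rules out the most common silent failure.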