## Summary
On Windows, the `databricks-ai-dev-kit` plugin installs successfully via Claude Code's marketplace, but the MCP server never starts. The plugin appears installed (skills are visible), yet `mcp__databricks__*` tools never become available. Three Windows portability bugs prevent the SessionStart hook from hydrating the bundled venv and launching the server.
## Environment
- OS: Windows 11
- Shell: PowerShell, with Git Bash available (used by the SessionStart hook)
- Python: 3.11 (bundled via `uv venv`)
- Plugin version: 1.1.9
- Claude Code: any current version
## Bug 1: `setup.sh` hardcodes Linux venv paths

In [.claude-plugin/setup.sh](https://github.com/databricks-solutions/ai-dev-kit/blob/main/.claude-plugin/setup.sh) (paths approximate):
```bash
# Idempotency check
if [ -f "${PLUGIN_ROOT}/.venv/bin/python" ] && \
   "${PLUGIN_ROOT}/.venv/bin/python" -c "import databricks_mcp_server" 2>/dev/null; then
  ...
fi

# Install commands
uv pip install --python .venv/bin/python -e "$TOOLS_CORE_DIR" --quiet >&2
uv pip install --python .venv/bin/python -e "$MCP_SERVER_DIR" --quiet >&2
```
On Windows, `python.exe` lives at `.venv/Scripts/python.exe`, not `.venv/bin/python`. The idempotency check therefore always reports "not yet set up," and then every install command fails because the target Python doesn't exist at that path.

**Suggested fix:** detect the platform and pick the right path:
```bash
if [[ "$OSTYPE" == "msys" || "$OSTYPE" == "cygwin" || "$OSTYPE" == "win32" ]]; then
  VENV_PYTHON="${PLUGIN_ROOT}/.venv/Scripts/python.exe"
else
  VENV_PYTHON="${PLUGIN_ROOT}/.venv/bin/python"
fi
```
…and use `$VENV_PYTHON` everywhere instead of the hardcoded `.venv/bin/python`.
## Bug 2: `setup.sh` requires `uv` on PATH with no fallback

The script uses `uv` exclusively to create the venv and install packages:
```bash
if ! command -v uv &> /dev/null; then
  echo "Error: 'uv' is required but not installed." >&2
  exit 1
fi

uv venv --python 3.11 >&2
uv pip install ...
```
On Windows, `uv` is rarely on PATH after a fresh install, so the plugin can't bootstrap itself.

**Suggested fix:** fall back to standard-library tooling when `uv` isn't available:
```bash
if command -v uv &> /dev/null; then
  uv venv --python 3.11 >&2
  UV_PIP="uv pip install --python $VENV_PYTHON"
else
  python3.11 -m venv "${PLUGIN_ROOT}/.venv" >&2
  "$VENV_PYTHON" -m ensurepip --default-pip
  "$VENV_PYTHON" -m pip install --upgrade pip --quiet
  UV_PIP="$VENV_PYTHON -m pip install"
fi

$UV_PIP -e "$TOOLS_CORE_DIR" --quiet
$UV_PIP -e "$MCP_SERVER_DIR" --quiet
```
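One wrinkle in the fallback branch above: `python3.11` is usually not a valid command name on Windows, where the Python launcher (`py -3.11`) is the conventional way to select a version. A hedged sketch of an interpreter resolver; this helper is my own invention, not part of `setup.sh`:

```bash
# Hypothetical helper (not in setup.sh): find an interpreter for the requested
# version, trying the Windows "py" launcher first, then conventional Unix names.
find_python() {
  local ver="$1" cmd
  for cmd in "py -$ver" "python$ver" python3 python; do
    # Keep the candidate only if it runs and reports the wanted version.
    if $cmd -c "import sys; sys.exit(sys.version_info[:2] != tuple(map(int, '$ver'.split('.'))))" 2>/dev/null; then
      echo "$cmd"
      return 0
    fi
  done
  return 1
}
```

The fallback branch could then create the venv with `$(find_python 3.11) -m venv "${PLUGIN_ROOT}/.venv"` (the expansion is deliberately unquoted so `py -3.11` splits into command and flag) instead of hardcoding `python3.11`.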
## Bug 3: [.mcp.json](https://github.com/databricks-solutions/ai-dev-kit/blob/main/.mcp.json) hardcodes the Linux venv path

In `.mcp.json`:
```json
{
  "mcpServers": {
    "databricks": {
      "command": "${CLAUDE_PLUGIN_ROOT}/.venv/bin/python",
      "args": ["${CLAUDE_PLUGIN_ROOT}/databricks-mcp-server/run_server.py"],
      "defer_loading": true
    }
  }
}
```
Even if Bugs 1 and 2 are fixed and the venv is hydrated, this path doesn't resolve on Windows, so the MCP server is never launched.

**Suggested fix:** use a wrapper script that the platform-specific `setup.sh` / `setup.ps1` can shim, or provide platform-specific MCP entries. The simplest pattern is a small launcher script the plugin owns:
```json
{
  "mcpServers": {
    "databricks": {
      "command": "bash",
      "args": ["${CLAUDE_PLUGIN_ROOT}/.claude-plugin/launch_server.sh"],
      "defer_loading": true
    }
  }
}
```
…where `launch_server.sh` does the OS detection and invokes the right Python. This keeps `.mcp.json` portable without per-OS variants.
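A sketch of what that launcher could look like, assuming it lives at `.claude-plugin/launch_server.sh` (nothing below exists in the repo today; the error message is illustrative):

```bash
#!/usr/bin/env bash
# Proposed .claude-plugin/launch_server.sh (does not exist in the repo yet).

# Pick the platform-correct interpreter inside the bundled venv.
resolve_venv_python() {
  local root="$1"
  case "${OSTYPE:-linux-gnu}" in
    msys*|cygwin*|win32*) echo "$root/.venv/Scripts/python.exe" ;;
    *)                    echo "$root/.venv/bin/python" ;;
  esac
}

PLUGIN_ROOT="${CLAUDE_PLUGIN_ROOT:-$(cd "$(dirname "${BASH_SOURCE[0]:-.}")/.." && pwd)}"
VENV_PYTHON="$(resolve_venv_python "$PLUGIN_ROOT")"

if [ -x "$VENV_PYTHON" ] && [ -f "$PLUGIN_ROOT/databricks-mcp-server/run_server.py" ]; then
  # Hand the process over to the server so signals propagate cleanly.
  exec "$VENV_PYTHON" "$PLUGIN_ROOT/databricks-mcp-server/run_server.py" "$@"
else
  echo "launch_server.sh: venv Python not found at $VENV_PYTHON (did setup.sh run?)" >&2
fi
```

Since Git Bash ships with Claude Code's supported toolchain on Windows, `"command": "bash"` should resolve there as well as on Linux/macOS.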
## Reproduction

- Fresh Windows 11 install with Claude Code and Python 3.11
- `/plugin install databricks-ai-dev-kit` from the marketplace
- New Claude Code session — the SessionStart hook fires, `setup.sh` runs and silently fails
- Ask Claude: "Which Databricks workspace am I connected to?"
- The `databricks-config` skill tries to call `mcp__databricks__manage_workspace`. ToolSearch returns `No matching deferred tools found`. The MCP server never started.
## Expected vs actual

- Expected: the plugin installs, the SessionStart hook hydrates `.venv`, `.mcp.json` launches `run_server.py` on the next session, and MCP tools become available.
- Actual: `.venv` exists but contains none of the dependencies (no `fastmcp`; `import databricks_mcp_server` fails), `.mcp.json` points at a Linux path that doesn't exist on Windows, and no MCP tools are available.
## Workaround

Documented at — manual `ensurepip` + editable installs against the bundled venv, plus a project-level `.mcp.json` override using Windows paths.
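For reference, the project-level `.mcp.json` override in that workaround looks roughly like this. The cache path below is a placeholder, not the plugin's real install location; it varies per machine and marketplace version:

```json
{
  "mcpServers": {
    "databricks": {
      "command": "C:/Users/<you>/<plugin-cache>/ai-dev-kit/.venv/Scripts/python.exe",
      "args": ["C:/Users/<you>/<plugin-cache>/ai-dev-kit/databricks-mcp-server/run_server.py"],
      "defer_loading": true
    }
  }
}
```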
## Impact

Affects every Windows user of the plugin. The skills work fine, but the MCP server is the heart of the plugin — without it, ~30 tools are unreachable, including `manage_workspace`, `databricks-jobs`, `databricks-dbsql`, and `databricks-unity-catalog`. Users on Linux/macOS don't see this issue.
## Notes

- The plugin's `server.py` already has thoughtful Windows-specific code (`_patch_subprocess_stdin`), so the runtime is Windows-aware. The install path just hasn't caught up.
- The standalone `install.ps1` in the repo root appears to be Windows-aware, but it installs to `~/.ai-dev-kit/` rather than the plugin marketplace cache, so users who installed via the marketplace bypass it entirely.