Give Claude Code the power of Gemini 3.1
An MCP server that connects Claude Code to Google's Gemini 3.1, unlocking capabilities that complement Claude's strengths.
| Gemini's Strengths | Use Case |
|---|---|
| 1M Token Context | Analyze entire codebases in one shot |
| Google Search Grounding | Get real-time documentation & latest info |
| Multimodal Vision | Understand screenshots, diagrams, designs |
Philosophy: Claude is the commander, Gemini is the specialist.
Add to your MCP config file:
- Mac: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
Then restart Claude Code.
Two authentication modes are supported. The server auto-detects which mode to use based on environment variables.
**Option 1: Gemini API Key (AI Studio).** Best for personal development and quick trials.
- Visit Google AI Studio and create an API key
- Add to your MCP config:
```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "@lkbaba/mcp-server-gemini"],
      "env": {
        "GEMINI_API_KEY": "your-api-key"
      }
    }
  }
}
```

**Option 2: Vertex AI (service account).** More secure; uses Google Cloud IAM authentication.
Prerequisites:
- A Google Cloud project with the Vertex AI API enabled
- A service account with the Vertex AI User role
Setup (2 minutes):
- Create a service account in the GCP Console → download its JSON key file
- Open the JSON key file and copy all of its key-value pairs
- Paste them into the `env` section of your MCP config:
```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "@lkbaba/mcp-server-gemini"],
      "env": {
        "type": "service_account",
        "project_id": "your-project-id",
        "private_key_id": "key-id-here",
        "private_key": "-----BEGIN PRIVATE KEY-----\nMIIEv...\n-----END PRIVATE KEY-----\n",
        "client_email": "your-sa@your-project.iam.gserviceaccount.com",
        "client_id": "123456789",
        "auth_uri": "https://accounts.google.com/o/oauth2/auth",
        "token_uri": "https://oauth2.googleapis.com/token",
        "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
        "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/your-sa%40your-project.iam.gserviceaccount.com",
        "universe_domain": "googleapis.com"
      }
    }
  }
}
```

The server auto-detects service account credentials from env vars; no `GOOGLE_GENAI_USE_VERTEXAI` or `GOOGLE_CLOUD_PROJECT` is needed. Just paste and go.
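The auto-detection can be sketched as below. The function name and exact checks are illustrative assumptions, not the server's actual implementation (which lives in `src/utils/gemini-factory.ts`); the ordering reflects the documented rule that Vertex AI takes priority when both modes are configured.

```typescript
// Hypothetical sketch of the dual-mode auth detection (not the actual source).
type AuthMode = "vertex-service-account" | "vertex-explicit" | "ai-studio";

function resolveAuthMode(env: Record<string, string | undefined>): AuthMode {
  // Pasted service-account JSON fields are recognized by their "type" marker.
  if (env.type === "service_account" && env.private_key && env.client_email) {
    return "vertex-service-account";
  }
  // Explicit Vertex AI mode via the documented flag.
  if (env.GOOGLE_GENAI_USE_VERTEXAI === "true") {
    return "vertex-explicit";
  }
  // Fall back to an AI Studio API key.
  if (env.GEMINI_API_KEY) {
    return "ai-studio";
  }
  throw new Error("No Gemini credentials found in environment");
}
```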
Tip: On Windows, the server automatically fixes slash corruption (`/` → `\`) in PEM private keys that some MCP clients introduce.
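A minimal sketch of that repair, assuming the corruption turns the `\n` escape sequences in the pasted key into `/n` (the helper name and exact heuristic are hypothetical, not the server's code):

```typescript
// Hypothetical helper: restore newlines in a PEM key whose "\n" escapes
// were mangled into "/n" by an MCP client on Windows.
function fixPemSlashes(key: string): string {
  // Only touch strings that look like a PEM private key.
  if (!key.includes("PRIVATE KEY")) return key;
  return key.split("/n").join("\n");
}
```

Non-PEM values pass through unchanged, so the fix is safe to apply to every env var.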
Advanced options: you can also use `GOOGLE_GENAI_USE_VERTEXAI=true` + `GOOGLE_CREDENTIALS_JSON`, `GOOGLE_APPLICATION_CREDENTIALS` (file path), or `gcloud auth application-default login`. See the environment variables reference below.
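For example, a config that points the server at a key file instead of pasting its contents might look like this (the project ID and file path are placeholders):

```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "@lkbaba/mcp-server-gemini"],
      "env": {
        "GOOGLE_GENAI_USE_VERTEXAI": "true",
        "GOOGLE_CLOUD_PROJECT": "your-project-id",
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/service-account-key.json"
      }
    }
  }
}
```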
Environment variables reference
Paste JSON approach (Option 2 above, simplest for Vertex AI): just paste the service account JSON fields directly into `env`. No extra variables are needed; the server auto-detects `"type": "service_account"`.
Explicit Vertex AI mode (advanced):
| Variable | Required | Description |
|---|---|---|
| `GOOGLE_GENAI_USE_VERTEXAI` | Yes | Set to `"true"` to enable |
| `GOOGLE_CLOUD_PROJECT` | Yes | GCP project ID |
| `GOOGLE_CLOUD_LOCATION` | No | Region (default: `global`) |
| `GOOGLE_CREDENTIALS_JSON` | No* | Entire service account JSON as a single string |
| `GOOGLE_APPLICATION_CREDENTIALS` | No* | File path to a service account JSON key |

\* At least one credential source is needed: `GOOGLE_CREDENTIALS_JSON`, `GOOGLE_APPLICATION_CREDENTIALS`, or gcloud ADC.
AI Studio mode:
| Variable | Required | Description |
|---|---|---|
| `GEMINI_API_KEY` | Yes | API key from Google AI Studio |
If both modes are configured, Vertex AI takes priority.
v2.0.0 (2026-04): The protocol layer has been rewritten on top of the official `@modelcontextprotocol/sdk`. This fixes a JSON-RPC `notifications/initialized` spec violation inherited from the upstream fork, which caused strict MCP clients (recent Claude CLI, some VS Code extensions) to drop the connection with `MCP error -32000: Connection closed` or to silently omit the Gemini tools from the tool list. No user-facing API changes; the upgrade is drop-in.
v1.3.0+:
- The default model is now `gemini-3.1-pro-preview`
- Old model names are automatically mapped (no config changes needed)
- See CHANGELOG.md for details
**Search**

| Tool | Description |
|---|---|
| `gemini_search` | Web search with Google Search grounding. Get real-time info, latest docs, current events. |

**Code analysis**

| Tool | Description |
|---|---|
| `gemini_analyze_codebase` | Analyze entire projects with the 1M token context. Supports a directory path, file paths, or direct content. |
| `gemini_analyze_content` | Analyze code, documents, or data. Supports file path or direct content input. |

**Multimodal**

| Tool | Description |
|---|---|
| `gemini_multimodal_query` | Analyze images with natural language. Understand designs, diagrams, screenshots. |

**Brainstorming**

| Tool | Description |
|---|---|
| `gemini_brainstorm` | Generate creative ideas with project context. Supports reading README and PRD files. |
All tools now support an optional `model` parameter:

| Model | Speed | Best For |
|---|---|---|
| `gemini-3.1-pro-preview` | Standard | Complex analysis, deep reasoning, agentic workflows (default) |
| `gemini-3-flash-preview` | Fast | Simple tasks, quick responses, search queries |

Note: `gemini-3-pro-preview` is deprecated (retired 2026-03-09) and is automatically mapped to `gemini-3.1-pro-preview`.
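That mapping behaves roughly like the sketch below; the alias table and helper name are illustrative assumptions (the real model configuration lives in `src/config/models.ts`):

```typescript
// Hypothetical sketch of the legacy model-name mapping (not the actual source).
const MODEL_ALIASES: Record<string, string> = {
  "gemini-3-pro-preview": "gemini-3.1-pro-preview", // retired 2026-03-09
};

const DEFAULT_MODEL = "gemini-3.1-pro-preview";

function resolveModel(requested?: string): string {
  // No model requested: fall back to the documented default.
  if (!requested) return DEFAULT_MODEL;
  // Deprecated names are silently upgraded; everything else passes through.
  return MODEL_ALIASES[requested] ?? requested;
}
```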
Example: use the new default model explicitly:

```json
{
  "name": "gemini_analyze_content",
  "arguments": {
    "filePath": "./src/index.ts",
    "task": "review",
    "model": "gemini-3.1-pro-preview"
  }
}
```

Example prompts:

- "Use Gemini to analyze the ./src directory for architectural patterns and potential issues"
- "Search for the latest Next.js 15 App Router documentation"
- "Analyze this architecture diagram and explain the data flow" (attach image)
- "Brainstorm feature ideas based on this project's README.md"
For users behind a proxy or VPN
Add the proxy environment variable to your config:

```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "@lkbaba/mcp-server-gemini"],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "HTTPS_PROXY": "http://127.0.0.1:7897"
      }
    }
  }
}
```

Build from source
```bash
git clone https://github.com/LKbaba/Gemini-mcp.git
cd Gemini-mcp
npm install
npm run build
export GEMINI_API_KEY="your_api_key_here"
npm start
```

Project structure:

```
src/
├── config/
│   ├── models.ts           # Model configurations
│   └── constants.ts        # Global constants
├── tools/
│   ├── definitions.ts      # MCP tool definitions
│   ├── multimodal-query.ts # Multimodal queries
│   ├── analyze-content.ts  # Content analysis
│   ├── analyze-codebase.ts # Codebase analysis
│   ├── brainstorm.ts       # Brainstorming
│   └── search.ts           # Web search
├── utils/
│   ├── gemini-factory.ts   # Dual-mode auth factory (API Key + Vertex AI)
│   ├── gemini-client.ts    # Gemini API client
│   ├── file-reader.ts      # File system access
│   ├── security.ts         # Path validation
│   ├── validators.ts       # Parameter validation
│   └── error-handler.ts    # Error handling
├── types.ts                # Type definitions
└── server.ts               # Main server
```
Based on aliargun/mcp-server-gemini.

License: MIT