
Commit 843ac58

refactor: abstract LLM client to support multiple providers (Anthropic & OpenAI)
Major Changes:
- Introduce LLMClient wrapper with a unified interface for multiple LLM providers
- Split monolithic llm.py into a modular architecture:
  - LLMClient: unified wrapper with automatic provider selection
  - AnthropicClient: Anthropic protocol implementation
  - OpenAIClient: OpenAI protocol implementation (with reasoning_split support)
  - LLMClientBase: abstract base class defining the interface
- Auto-append provider-specific API suffixes (/anthropic or /v1)
- Add LLMProvider enum for type-safe provider selection

Tool Schema Improvements:
- Remove AnthropicTool/ToolInputSchema from the public API
- Add to_openai_schema() to the Tool base class for format conversion
- Simplify Agent/Logger to pass Tool objects directly
- LLM clients handle both Tool objects and dict schemas internally

New Features:
- examples/05_provider_selection.py: demo of provider switching
- examples/06_tool_schema_demo.py: demo of Tool base class usage
- Complete test coverage for both Anthropic and OpenAI clients

Benefits:
- Flexible provider switching without code changes
- Cleaner architecture with separation of concerns
- Backward compatible with existing code
- Unified interface simplifies maintenance

Tests: 90 passed, 3 skipped ✅
1 parent 4084afa commit 843ac58
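The provider abstraction this commit describes can be sketched as follows. This is an illustrative sketch only, assuming the structure named in the commit message (LLMClient, LLMProvider, LLMClientBase, AnthropicClient, OpenAIClient); the method bodies and the `_backends` mapping are placeholders, not the repository's actual code.

```python
from enum import Enum


class LLMProvider(Enum):
    """Type-safe provider selection, as introduced by this commit."""
    ANTHROPIC = "anthropic"
    OPENAI = "openai"


class LLMClientBase:
    """Abstract base class defining the shared interface."""

    def generate(self, messages: list) -> str:
        raise NotImplementedError


class AnthropicClient(LLMClientBase):
    def generate(self, messages: list) -> str:
        # The real implementation would call the Anthropic-protocol endpoint.
        return f"anthropic:{len(messages)}"


class OpenAIClient(LLMClientBase):
    def generate(self, messages: list) -> str:
        # The real implementation would call the OpenAI-protocol endpoint.
        return f"openai:{len(messages)}"


class LLMClient:
    """Unified wrapper: selects a backend by provider, defaulting to Anthropic."""

    _backends = {
        LLMProvider.ANTHROPIC: AnthropicClient,
        LLMProvider.OPENAI: OpenAIClient,
    }

    def __init__(self, provider: LLMProvider = LLMProvider.ANTHROPIC):
        self.provider = provider
        self._backend = self._backends[provider]()

    def generate(self, messages: list) -> str:
        # Delegate to whichever protocol implementation was selected.
        return self._backend.generate(messages)


print(LLMClient(LLMProvider.OPENAI).generate(["hi"]))  # openai:1
```

Callers depend only on the wrapper's interface, which is what makes "provider switching without code changes" possible.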

36 files changed (+2965 −1043 lines)

.gitignore
Lines changed: 2 additions & 0 deletions

@@ -55,3 +55,5 @@ mcp.json.bak
 
 docs/assets/backup/
 docs/assets/preview.html
+
+claude.md

README.md
Lines changed: 11 additions & 8 deletions

@@ -30,6 +30,9 @@ This project comes packed with features designed for a robust and intelligent ag
 - [Testing](#testing)
 - [Quick Run](#quick-run)
 - [Test Coverage](#test-coverage)
+- [Troubleshooting](#troubleshooting)
+- [SSL Certificate Error](#ssl-certificate-error)
+- [Module Not Found Error](#module-not-found-error)
 - [Related Documentation](#related-documentation)
 - [Contributing](#contributing)
 - [License](#license)

@@ -41,10 +44,10 @@ This project comes packed with features designed for a robust and intelligent ag
 
 MiniMax provides both global and China platforms. Choose based on your network environment:
 
-| Version | Platform | API Base |
-| ---------- | -------------------------------------------------------------- | ------------------------------------ |
-| **Global** | [https://platform.minimax.io](https://platform.minimax.io) | `https://api.minimax.io/anthropic` |
-| **China** | [https://platform.minimaxi.com](https://platform.minimaxi.com) | `https://api.minimaxi.com/anthropic` |
+| Version | Platform | API Base |
+| ---------- | -------------------------------------------------------------- | -------------------------- |
+| **Global** | [https://platform.minimax.io](https://platform.minimax.io) | `https://api.minimax.io` |
+| **China** | [https://platform.minimaxi.com](https://platform.minimaxi.com) | `https://api.minimaxi.com` |
 
 **Steps to get API Key:**
 1. Visit the corresponding platform to register and login

@@ -109,8 +112,8 @@ Fill in your API Key and corresponding API Base:
 
 ```yaml
 api_key: "YOUR_API_KEY_HERE" # API Key from step 1
-api_base: "https://api.minimax.io/anthropic" # Global
-# api_base: "https://api.minimaxi.com/anthropic" # China
+api_base: "https://api.minimax.io" # Global
+# api_base: "https://api.minimaxi.com" # China
 model: "MiniMax-M2"
 ```

@@ -176,8 +179,8 @@ Fill in your API Key and corresponding API Base:
 
 ```yaml
 api_key: "YOUR_API_KEY_HERE" # API Key from step 1
-api_base: "https://api.minimax.io/anthropic" # Global
-# api_base: "https://api.minimaxi.com/anthropic" # China
+api_base: "https://api.minimax.io" # Global
+# api_base: "https://api.minimaxi.com" # China
 model: "MiniMax-M2"
 max_steps: 100
 workspace_dir: "./workspace"
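The shortened API Base values in the README diff above work because, per the commit message, the client now auto-appends the provider-specific suffix (/anthropic or /v1). A minimal sketch of that behavior, assuming a hypothetical helper name (`resolve_api_base` is illustrative, not the repository's actual function):

```python
def resolve_api_base(api_base: str, provider: str) -> str:
    """Append the provider-specific path suffix unless it is already present."""
    suffix = {"anthropic": "/anthropic", "openai": "/v1"}[provider]
    base = api_base.rstrip("/")
    # Idempotent: a config that still includes the suffix keeps working,
    # which is how the change stays backward compatible.
    return base if base.endswith(suffix) else base + suffix


print(resolve_api_base("https://api.minimax.io", "anthropic"))
# https://api.minimax.io/anthropic
```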

README_CN.md
Lines changed: 11 additions & 8 deletions

@@ -30,6 +30,9 @@
 - [测试](#测试)
 - [快速运行](#快速运行)
 - [测试覆盖范围](#测试覆盖范围)
+- [常见问题](#常见问题)
+- [SSL 证书错误](#ssl-证书错误)
+- [模块未找到错误](#模块未找到错误)
 - [相关文档](#相关文档)
 - [贡献](#贡献)
 - [许可证](#许可证)

@@ -41,10 +44,10 @@
 
 MiniMax 提供国内和海外两个平台,请根据您的网络环境选择:
 
-| 版本 | 平台地址 | API Base |
-| ---------- | -------------------------------------------------------------- | ------------------------------------ |
-| **国内版** | [https://platform.minimaxi.com](https://platform.minimaxi.com) | `https://api.minimaxi.com/anthropic` |
-| **海外版** | [https://platform.minimax.io](https://platform.minimax.io) | `https://api.minimax.io/anthropic` |
+| 版本 | 平台地址 | API Base |
+| ---------- | -------------------------------------------------------------- | -------------------------- |
+| **国内版** | [https://platform.minimaxi.com](https://platform.minimaxi.com) | `https://api.minimaxi.com` |
+| **海外版** | [https://platform.minimax.io](https://platform.minimax.io) | `https://api.minimax.io` |
 
 **获取步骤:**
 1. 访问相应平台注册并登录

@@ -109,8 +112,8 @@ nano ~/.mini-agent/config/config.yaml
 
 ```yaml
 api_key: "YOUR_API_KEY_HERE" # 填入第 1 步获取的 API Key
-api_base: "https://api.minimaxi.com/anthropic" # 国内版
-# api_base: "https://api.minimax.io/anthropic" # 海外版(如使用海外平台,请取消本行注释)
+api_base: "https://api.minimaxi.com" # 国内版
+# api_base: "https://api.minimax.io" # 海外版(如使用海外平台,请取消本行注释)
 model: "MiniMax-M2"
 ```

@@ -176,8 +179,8 @@ vim mini_agent/config/config.yaml # 或使用您偏好的编辑器
 
 ```yaml
 api_key: "YOUR_API_KEY_HERE" # 填入第 1 步获取的 API Key
-api_base: "https://api.minimaxi.com/anthropic" # 国内版
-# api_base: "https://api.minimax.io/anthropic" # 海外版(如使用海外平台,请修改此行)
+api_base: "https://api.minimaxi.com" # 国内版
+# api_base: "https://api.minimax.io" # 海外版(如使用海外平台,请修改此行)
 model: "MiniMax-M2"
 max_steps: 100
 workspace_dir: "./workspace"

examples/01_basic_tools.py
Lines changed: 1 addition & 1 deletion

@@ -13,7 +13,7 @@
 import tempfile
 from pathlib import Path
 
-from mini_agent.tools import ReadTool, WriteTool, EditTool, BashTool
+from mini_agent.tools import BashTool, EditTool, ReadTool, WriteTool
 
 
 async def demo_write_tool():

examples/02_simple_agent.py
Lines changed: 1 addition & 1 deletion

@@ -10,9 +10,9 @@
 import tempfile
 from pathlib import Path
 
+from mini_agent import LLMClient
 from mini_agent.agent import Agent
 from mini_agent.config import Config
-from mini_agent.llm import LLMClient
 from mini_agent.tools import BashTool, EditTool, ReadTool, WriteTool
 
 

examples/03_session_notes.py
Lines changed: 1 addition & 1 deletion

@@ -11,9 +11,9 @@
 import tempfile
 from pathlib import Path
 
+from mini_agent import LLMClient
 from mini_agent.agent import Agent
 from mini_agent.config import Config
-from mini_agent.llm import LLMClient
 from mini_agent.tools import BashTool, ReadTool, WriteTool
 from mini_agent.tools.note_tool import RecallNoteTool, SessionNoteTool
 

examples/04_full_agent.py
Lines changed: 1 addition & 1 deletion

@@ -13,9 +13,9 @@
 import tempfile
 from pathlib import Path
 
+from mini_agent import LLMClient
 from mini_agent.agent import Agent
 from mini_agent.config import Config
-from mini_agent.llm import LLMClient
 from mini_agent.tools import BashTool, EditTool, ReadTool, WriteTool
 from mini_agent.tools.mcp_loader import load_mcp_tools_async
 from mini_agent.tools.note_tool import RecallNoteTool, SessionNoteTool

examples/05_provider_selection.py
Lines changed: 190 additions & 0 deletions (new file)

"""Example: Using LLMClient with different providers.

This example demonstrates how to use the LLMClient wrapper with different
LLM providers (Anthropic or OpenAI) through the provider parameter.
"""

import asyncio
import os
from pathlib import Path

import yaml

from mini_agent import LLMClient, LLMProvider, Message


async def demo_anthropic_provider():
    """Demo using LLMClient with Anthropic provider."""
    print("\n" + "=" * 60)
    print("DEMO: LLMClient with Anthropic Provider")
    print("=" * 60)

    # Load config
    config_path = Path("mini_agent/config/config.yaml")
    with open(config_path, encoding="utf-8") as f:
        config = yaml.safe_load(f)

    # Initialize client with Anthropic provider
    client = LLMClient(
        api_key=config["api_key"],
        provider=LLMProvider.ANTHROPIC,  # Specify Anthropic provider
        model=config.get("model", "MiniMax-M2"),
    )

    print(f"Provider: {client.provider}")
    print(f"API Base: {client.api_base}")

    # Simple question
    messages = [Message(role="user", content="Say 'Hello from Anthropic!'")]
    print(f"\n👤 User: {messages[0].content}")

    try:
        response = await client.generate(messages)
        if response.thinking:
            print(f"💭 Thinking: {response.thinking}")
        print(f"💬 Model: {response.content}")
        print("✅ Anthropic provider demo completed")
    except Exception as e:
        print(f"❌ Error: {e}")


async def demo_openai_provider():
    """Demo using LLMClient with OpenAI provider."""
    print("\n" + "=" * 60)
    print("DEMO: LLMClient with OpenAI Provider")
    print("=" * 60)

    # Load config
    config_path = Path("mini_agent/config/config.yaml")
    with open(config_path, encoding="utf-8") as f:
        config = yaml.safe_load(f)

    # Initialize client with OpenAI provider
    client = LLMClient(
        api_key=config["api_key"],
        provider=LLMProvider.OPENAI,  # Specify OpenAI provider
        model=config.get("model", "MiniMax-M2"),
    )

    print(f"Provider: {client.provider}")
    print(f"API Base: {client.api_base}")

    # Simple question
    messages = [Message(role="user", content="Say 'Hello from OpenAI!'")]
    print(f"\n👤 User: {messages[0].content}")

    try:
        response = await client.generate(messages)
        if response.thinking:
            print(f"💭 Thinking: {response.thinking}")
        print(f"💬 Model: {response.content}")
        print("✅ OpenAI provider demo completed")
    except Exception as e:
        print(f"❌ Error: {e}")


async def demo_default_provider():
    """Demo using LLMClient with default provider."""
    print("\n" + "=" * 60)
    print("DEMO: LLMClient with Default Provider (Anthropic)")
    print("=" * 60)

    # Load config
    config_path = Path("mini_agent/config/config.yaml")
    with open(config_path, encoding="utf-8") as f:
        config = yaml.safe_load(f)

    # Initialize client without specifying provider (defaults to Anthropic)
    client = LLMClient(
        api_key=config["api_key"],
        model=config.get("model", "MiniMax-M2"),
    )

    print(f"Provider (default): {client.provider}")
    print(f"API Base: {client.api_base}")

    # Simple question
    messages = [Message(role="user", content="Say 'Hello with default provider!'")]
    print(f"\n👤 User: {messages[0].content}")

    try:
        response = await client.generate(messages)
        print(f"💬 Model: {response.content}")
        print("✅ Default provider demo completed")
    except Exception as e:
        print(f"❌ Error: {e}")


async def demo_provider_comparison():
    """Compare responses from both providers."""
    print("\n" + "=" * 60)
    print("DEMO: Provider Comparison")
    print("=" * 60)

    # Load config
    config_path = Path("mini_agent/config/config.yaml")
    with open(config_path, encoding="utf-8") as f:
        config = yaml.safe_load(f)

    # Create clients for both providers
    anthropic_client = LLMClient(
        api_key=config["api_key"],
        provider=LLMProvider.ANTHROPIC,
        model=config.get("model", "MiniMax-M2"),
    )

    openai_client = LLMClient(
        api_key=config["api_key"],
        provider=LLMProvider.OPENAI,
        model=config.get("model", "MiniMax-M2"),
    )

    # Same question for both
    messages = [Message(role="user", content="What is 2+2?")]
    print(f"\n👤 Question: {messages[0].content}\n")

    try:
        # Get response from Anthropic
        anthropic_response = await anthropic_client.generate(messages)
        print(f"🔵 Anthropic: {anthropic_response.content}")

        # Get response from OpenAI
        openai_response = await openai_client.generate(messages)
        print(f"🟢 OpenAI: {openai_response.content}")

        print("\n✅ Provider comparison completed")
    except Exception as e:
        print(f"❌ Error: {e}")


async def main():
    """Run all demos."""
    print("\n🚀 LLM Provider Selection Demo")
    print("This demo shows how to use LLMClient with different providers.")
    print("Make sure you have configured API key in config.yaml.")

    try:
        # Demo default provider
        await demo_default_provider()

        # Demo Anthropic provider
        await demo_anthropic_provider()

        # Demo OpenAI provider
        await demo_openai_provider()

        # Demo provider comparison
        await demo_provider_comparison()

        print("\n✅ All demos completed successfully!")

    except Exception as e:
        print(f"\n❌ Error: {e}")
        import traceback

        traceback.print_exc()


if __name__ == "__main__":
    asyncio.run(main())
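The commit also adds to_openai_schema() to the Tool base class (demonstrated in examples/06_tool_schema_demo.py, not shown in this diff). A sketch of what that conversion could look like, assuming hypothetical field names and a to_anthropic_schema() counterpart; only the to_openai_schema() method name and the ReadTool class name come from the commit, the rest is illustrative:

```python
class Tool:
    """Sketch of a Tool base class; the field names here are assumptions."""

    name: str = ""
    description: str = ""
    parameters: dict = {}

    def to_anthropic_schema(self) -> dict:
        # Anthropic tool format carries the JSON Schema under "input_schema".
        return {
            "name": self.name,
            "description": self.description,
            "input_schema": self.parameters,
        }

    def to_openai_schema(self) -> dict:
        # OpenAI function-calling format nests everything under "function".
        return {
            "type": "function",
            "function": {
                "name": self.name,
                "description": self.description,
                "parameters": self.parameters,
            },
        }


class ReadTool(Tool):
    name = "read_file"
    description = "Read a file from the workspace."
    parameters = {
        "type": "object",
        "properties": {"path": {"type": "string"}},
        "required": ["path"],
    }


print(ReadTool().to_openai_schema()["function"]["name"])  # read_file
```

With conversion living on the base class, the Agent and Logger can pass Tool objects directly and each LLM client picks the schema format it needs.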
