A KrakenD-based Agent Gateway implementation that serves as an egress API gateway, routing incoming requests to exposed agents within the agentic platform.
The following plugins are included in this repository:
The following tools are required for development:
- Docker: For containerization and local development
```bash
make plugins
```
This will compile the plugins and output them to `build/`.
Start the agent gateway using Docker Compose:
```bash
# Then start the services
docker compose up --build
```
Stop the services:
```bash
docker compose down
```
The gateway provides OpenAI-compatible endpoints (`/models` and `/chat/completions`) for agent access. For comprehensive documentation including request/response formats, configuration options, and protocol transformation details, see the OpenAI to A2A Plugin README.
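Because the endpoints are OpenAI-compatible, any HTTP client can talk to the gateway. As a minimal sketch (assuming the `localhost:10000` address used throughout this README, and the standard OpenAI list format for the response), the model listing can be prepared with Python's standard library:

```python
import json
import urllib.request

# Prepare a GET request for the OpenAI-compatible model listing.
# The address localhost:10000 matches the examples in this README.
req = urllib.request.Request("http://localhost:10000/models")

# With the gateway running, fetch and print the agent IDs:
#   with urllib.request.urlopen(req) as resp:
#       models = json.load(resp)
#       print([m["id"] for m in models["data"]])
```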
Quick Examples:
List available agents:
```bash
curl http://localhost:10000/models
```
Send a chat completion request:
```bash
curl http://localhost:10000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "local/mock-agent",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```
For detailed API documentation, agent routing behavior, and model parameter formats, refer to the plugin documentation linked above.
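The same request can be issued from any HTTP client. The sketch below builds (but does not send) the chat-completion request with Python's standard library, assuming the gateway's default `localhost:10000` address:

```python
import json
import urllib.request

# Build the same chat-completions request shown above with curl.
payload = {
    "model": "local/mock-agent",
    "messages": [{"role": "user", "content": "Hello"}],
}
req = urllib.request.Request(
    "http://localhost:10000/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the gateway running, send it with:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp))
```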
Verify the gateway proxy functionality with a JSON-RPC message:
```bash
curl http://localhost:10000/local/mock-agent \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "message/send",
    "params": {
      "message": {
        "role": "user",
        "parts": [
          {
            "kind": "text",
            "text": "Hello, mock agent!"
          }
        ],
        "messageId": "9229e770-767c-417b-a0b0-f0741243c589",
        "contextId": "abcd1234-5678-90ab-cdef-1234567890ab"
      },
      "metadata": {}
    }
  }' | jq
```
Effective from version: v0.5.0
The legacy per-agent chat completions endpoints have been removed in favor of the standard OpenAI-compatible global endpoint.
Old Endpoint Pattern (Removed):
```
POST /{agent-name}/chat/completions
```
New Endpoint Pattern:
```
POST /chat/completions
```
Before:
```bash
# Old endpoint - NO LONGER WORKS
curl http://localhost:10000/mock-agent/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [
      {
        "role": "user",
        "content": "Hello, how can you help me?"
      }
    ]
  }'
```
After:
```bash
# New endpoint - use model parameter to specify agent
curl http://localhost:10000/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "local/mock-agent",
    "messages": [
      {
        "role": "user",
        "content": "Hello, how can you help me?"
      }
    ]
  }'
```
- Endpoint URL: Use `/chat/completions` instead of `/{agent-name}/chat/completions`
- Model Parameter: The `model` field now specifies which agent to route to.
- Standardization: The new endpoint follows the OpenAI API specification exactly
- OpenAI Compatibility: Full compatibility with OpenAI client libraries and tools
- Simplified API: Single endpoint for all agents reduces API surface
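To illustrate the migration, a small hypothetical helper (not part of the gateway) can rewrite an old per-agent request into the new global form by moving the agent name into the `model` field. The `local/` provider prefix follows this README's examples and may differ in your deployment:

```python
# Hypothetical helper, not part of the gateway: rewrite an old
# per-agent chat-completions request for the new global endpoint.
def migrate_request(agent_name, body, provider="local"):
    """Return (new_path, new_body) targeting POST /chat/completions."""
    new_body = dict(body)
    # The agent is now selected via the "model" field instead of the path.
    new_body["model"] = f"{provider}/{agent_name}"
    return "/chat/completions", new_body

path, body = migrate_request(
    "mock-agent",
    {"model": "gpt-4", "messages": [{"role": "user", "content": "Hello"}]},
)
# path == "/chat/completions"; body["model"] == "local/mock-agent"
```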
See the Contribution Guide for details on contributing and the process for submitting pull requests.