Production-friendly OpenTelemetry GenAI instrumentation for Go. Add one middleware, one transport, and one helper to get a clean GenAI span tree in minutes.
Why teams adopt it fast:
- Drop-in HTTP + gRPC middleware and client interceptors.
- Span tree matches GenAI semconv for agents, inference, and tools.
- Conversation ID propagation for multi-turn tracing without long-lived traces.
- Sensitive content capture is opt-in with redaction and external hooks.
Span tree this library emits:
```text
inbound request span (otelhttp/otelgrpc)
└─ invoke_agent (INTERNAL)
   ├─ chat|generate_content|embeddings (CLIENT)
   └─ execute_tool <tool.name> (INTERNAL)
```
- Mirrors the OpenTelemetry GenAI semantic conventions for agent, inference, and tool spans.
- Works whether the model call is HTTP, gRPC, SDK, or an internal gateway.
- Keeps traces bounded while conversation IDs connect multi-turn flows.
GenAI semconv spec: https://opentelemetry.io/docs/specs/semconv/gen-ai/
Quick start:

```sh
go get github.com/pratikfandade/otel-ai-go
```

```go
cfg := config.DefaultConfig()

// Server side: agent spans
middleware := agent.NewMiddleware(cfg)
http.Handle("/chat", middleware.Wrap(chatHandler))

// Client side: inference spans
httpClient := &http.Client{Transport: client.NewTransport(cfg)}
```

- Wrap inbound handlers with `agent.NewMiddleware(cfg)` or gRPC server interceptors.
- Use `client.NewTransport(cfg)` for HTTP model calls.
- Wrap tool functions with `tool.Execute` for tool spans.
- Enable conversation IDs via request headers or metadata.
- Turn on content capture only when needed.
Server middleware (agent spans):

```go
cfg := config.DefaultConfig()
middleware := agent.NewMiddleware(cfg)
http.Handle("/chat", middleware.Wrap(chatHandler))
```

HTTP client transport (inference spans):

```go
cfg := config.DefaultConfig()
httpClient := &http.Client{
	Transport: client.NewTransport(cfg),
}
```

For internal LLM gateways, pass request metadata via context:
```go
ctx := client.WithCallInfo(ctx, client.CallInfo{
	Provider:  "internal-gateway",
	Operation: "chat",
	Model:     "gateway-model-1",
})
```
For gRPC clients, install the interceptors:

```go
conn, _ := grpc.DialContext(
	ctx,
	"localhost:50051",
	grpc.WithUnaryInterceptor(client.UnaryClientInterceptor(config.DefaultConfig())),
	grpc.WithStreamInterceptor(client.StreamClientInterceptor(config.DefaultConfig())),
)
```

Wrap tool functions with `tool.Execute` for tool spans:

```go
result, err := tool.Execute(ctx, tool.Call{
	Name:   "lookup_weather",
	Type:   "function",
	CallID: "call_123",
}, func(ctx context.Context) (any, error) {
	return map[string]any{"forecast": "rainy"}, nil
})
```

Content capture is opt-in, with field selection and redaction:

```go
cfg := config.DefaultConfig()
cfg.Content.CaptureEnabled = true
cfg.Content.CaptureFields = []string{
	config.CaptureFieldPrompt,
	config.CaptureFieldResponse,
}
cfg.Content.RedactFunc = func(value string) string {
	return "[redacted]"
}
```

Supported providers:
- OpenAI (chat, embeddings)
- Anthropic (messages)
- Gemini / Vertex AI (generateContent, embeddings)
- AWS Bedrock (invoke model)
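Per the GenAI semantic conventions linked above, inference spans for these providers carry attributes along these lines (an illustrative subset only; the linked spec is authoritative for names and values):

```text
gen_ai.operation.name   "chat" | "embeddings" | "generate_content"
gen_ai.request.model    requested model name, e.g. "gpt-4o-mini"
gen_ai.response.model   model name the provider actually served
```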
GenAI semantic conventions are still in Development. This library defaults to stable behavior and supports:
```sh
export OTEL_SEMCONV_STABILITY_OPT_IN=gen_ai_latest_experimental
```

Examples:
- Full stack span tree demo: `examples/full_stack/main.go`
- gRPC client call info: `examples/grpc_client/main.go`
- Target semconv: `gen_ai` v1.37.0
- Default behavior: stable emission
- Opt-in to experimental attributes with `OTEL_SEMCONV_STABILITY_OPT_IN=gen_ai_latest_experimental`