diff --git a/docs/capabilities.md b/docs/capabilities.md
index 428e7d1..fabc437 100644
--- a/docs/capabilities.md
+++ b/docs/capabilities.md
@@ -1,6 +1,6 @@
 # What You Can Do
 
-OpenTrace enables your AI assistant to answer questions about your system architecture. Here's what you can ask.
+OpenTrace answers questions about your system architecture — either through the built-in [Chat](chat/index.md) or via [MCP integrations](integrations/index.md) with your existing AI tools. Here's what you can ask.
 
 ## Discover Your System
 
diff --git a/docs/chat/index.md b/docs/chat/index.md
new file mode 100644
index 0000000..5d9b7b9
--- /dev/null
+++ b/docs/chat/index.md
@@ -0,0 +1,89 @@
+# Chat
+
+OpenTrace Chat is a built-in AI assistant that answers questions about your system architecture directly in the OpenTrace dashboard. No external tools or MCP configuration needed — just open a conversation and ask.
+
+## Getting Started
+
+1. Log in to your OpenTrace dashboard
+2. Click **Chat** in the sidebar navigation
+3. Start a new conversation and ask a question about your system
+
+!!! tip
+    If you don't see Chat in the sidebar, the feature may not yet be enabled for your organization. Contact your admin or reach out to support.
+
+## Conversations
+
+Chat organizes your interactions into **conversations**. Each conversation maintains its own context, so you can have separate threads for different topics.
+
+### Creating a Conversation
+
+Click the **New Chat** button in the chat sidebar to start a fresh conversation. Your conversation will be automatically titled based on your first message.
+
+### Switching Between Conversations
+
+The chat sidebar shows all your conversations. Click any conversation to switch to it. Your message history and context are preserved.
+
+### Deleting Conversations
+
+To keep your conversation list manageable, you can delete conversations you no longer need. Note that deleting a conversation permanently removes it along with all its messages.
+
+## Asking Questions
+
+Chat has access to your full OpenTrace knowledge graph. You can ask the same types of questions you'd ask through any MCP-connected AI assistant:
+
+| Category | Example |
+|----------|---------|
+| **Discovery** | "What services exist in my system?" |
+| **Dependencies** | "What does the checkout service depend on?" |
+| **Impact Analysis** | "What breaks if the database goes down?" |
+| **Connections** | "How does the frontend connect to payments?" |
+| **Investigations** | "Help me debug why orders are slow" |
+
+## Understanding Responses
+
+Chat responses include several elements beyond plain text.
+
+### Tool Calls
+
+When the AI queries your knowledge graph, you'll see **tool call indicators** showing which tools were used and what data was retrieved. Click a tool call to expand its details and see the raw results.
+
+### Artifacts
+
+Responses may include **artifact chips** — clickable references to nodes in your knowledge graph. These appear as inline links within the response text.
+
+Clicking an artifact chip opens the **Artifact Panel**, which displays a subgraph visualization centered on that node. This lets you visually explore the component and its immediate connections without leaving the chat.
+
+### Markdown Formatting
+
+Responses are rendered with full markdown support, including code blocks, tables, and lists.
+
+## Tips for Effective Questions
+
+**Be specific about services and components:**
+
+> "What does order-service depend on?" works better than "Tell me about dependencies"
+
+**Use the actual names from your system:**
+
+> Use "user-auth-service" if that's the name in your graph, not just "auth"
+
+**Ask follow-up questions to drill deeper:**
+
+> Start with "What services are involved in checkout?" then follow up with "How does checkout-api connect to the database?"
+
+## Chat vs. MCP Integrations
+
+OpenTrace offers two ways to interact with your architecture knowledge:
+
+| | Chat | MCP Integrations |
+|---|------|------------------|
+| **Where** | OpenTrace dashboard | Your existing AI tools (Claude, Copilot, etc.) |
+| **Setup** | None — built in | Requires MCP configuration and an API token |
+| **Best for** | Quick exploration, visual graph browsing | Deep coding sessions, IDE workflows |
+| **Graph visualization** | Inline artifact panel | Depends on the AI tool |
+
+Both approaches query the same knowledge graph and support the same types of questions. Choose whichever fits your workflow.
+
+## Chat from Slack
+
+You can also start OpenTrace Chat conversations directly from Slack. See the [Slack integration](slack.md) page for details.
diff --git a/docs/chat/slack.md b/docs/chat/slack.md
new file mode 100644
index 0000000..4282799
--- /dev/null
+++ b/docs/chat/slack.md
@@ -0,0 +1,40 @@
+# Slack
+
+Connect OpenTrace to Slack to ask questions about your system architecture directly from your workspace.
+
+## Overview
+
+The Slack integration lets you start OpenTrace Chat conversations from Slack threads. Ask a question in Slack, and the AI responds with answers grounded in your knowledge graph — without leaving your messaging workflow.
+
+## How It Works
+
+1. Mention OpenTrace in a Slack channel or thread
+2. The AI reads the thread context to understand your question
+3. OpenTrace queries your knowledge graph and responds in the thread
+4. The conversation is also visible in OpenTrace Chat for further exploration
+
+## Starting a Chat from Slack
+
+Mention OpenTrace in any channel or thread where the integration is installed. Your message and the surrounding thread context are used to inform the response.
+
+> @OpenTrace what services depend on the payment API?
+
+The AI will reply in the same thread with an answer based on your architecture graph.
+
+## Continuing Conversations
+
+Each Slack thread maps to an OpenTrace conversation. Follow-up messages in the same thread maintain context, so you can ask progressively deeper questions:
+
+> @OpenTrace what does checkout-api depend on?
+
+> @OpenTrace which of those are databases?
+
+> @OpenTrace what else connects to postgres-main?
+
+## Viewing in OpenTrace
+
+Conversations started from Slack also appear in the OpenTrace Chat sidebar. You can continue them in the dashboard if you want access to features like the artifact panel and graph visualization.
+
+## Setup
+
+To enable the Slack integration for your organization, contact your OpenTrace admin or visit the integrations page in the dashboard.
diff --git a/docs/getting-started.md b/docs/getting-started.md
index 4216316..e752cd4 100644
--- a/docs/getting-started.md
+++ b/docs/getting-started.md
@@ -4,9 +4,17 @@
 OpenTrace gives your AI assistant deep knowledge about your system architecture. It understands your services, how they connect, and how they depend on each other. This means you can ask questions about your system and get accurate, contextual answers.
 
-## Setup
-
-To connect OpenTrace to your AI assistant, add the following to your MCP configuration:
+## Try the Built-in Chat
+
+The fastest way to get started is the built-in [Chat](chat/index.md). Log in to your OpenTrace dashboard, open Chat from the sidebar, and start asking questions — no configuration needed.
+
+> "What services are in my system?"
+
+> "What does the payment service depend on?"
+
+## Connect Your AI Assistant
+
+To use OpenTrace with an external AI assistant (Claude, GitHub Copilot, etc.), add the following to your MCP configuration:
 
 ```json
 {
@@ -50,5 +58,6 @@ Your AI assistant now has the context it needs to give you meaningful answers ab
 
 ## Next Steps
 
+- [Chat](chat/index.md) - Use the built-in chat to explore your architecture
 - [What You Can Do](capabilities.md) - Full list of questions you can ask
 - [Example Workflows](workflows.md) - Common scenarios and how to approach them
diff --git a/docs/index.md b/docs/index.md
index be4f0b9..6bf89e8 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -4,7 +4,7 @@ OpenTrace gives your AI assistant deep understanding of your system architecture
 
 ## Quick Start
 
-Connect OpenTrace to your AI assistant and start asking questions:
+Ask questions about your architecture using the built-in [Chat](chat/index.md) or connect OpenTrace to your AI assistant:
 
 > "What services are in my system?"
 
@@ -22,6 +22,10 @@ Connect OpenTrace to your AI assistant and start asking questions:
 
   Setup OpenTrace and run your first queries
 
+- **[Chat](chat/index.md)**
+
+  Ask questions about your architecture directly in OpenTrace
+
 - **[Integrations](integrations/index.md)**
 
   Connect GitHub, GitLab, and AWS EKS
diff --git a/docs/workflows.md b/docs/workflows.md
index b9c3ffc..9d4ee2d 100644
--- a/docs/workflows.md
+++ b/docs/workflows.md
@@ -112,6 +112,30 @@ The AI traces deeper connections you might have missed.
 
 ---
 
+## Quick Architecture Q&A with Chat
+
+You need a fast answer about your system without switching tools. Use the built-in [Chat](chat/index.md):
+
+**1. Open Chat and ask your question**
+
+> "What services handle payments in our system?"
+
+The AI queries your knowledge graph and lists all payment-related services with their dependencies.
+
+**2. Click on an artifact to visualize**
+
+When the response mentions a service, click the artifact chip to open the graph visualization panel and see its connections.
+
+**3. Ask follow-up questions in the same conversation**
+
+> "Which of those services connect to external payment providers?"
+
+> "What would break if stripe-gateway goes down?"
+
+Chat maintains context across the conversation, so each follow-up builds on the previous answers.
+
+---
+
 ## Debugging Performance Issues
 
 The API is slow. Figure out why:
diff --git a/mkdocs.yml b/mkdocs.yml
index 6ef2030..5f33e96 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -69,6 +69,9 @@ extra:
 nav:
   - Home: index.md
   - Getting Started: getting-started.md
+  - Chat:
+      - Overview: chat/index.md
+      - Slack: chat/slack.md
   - Integrations:
       - Overview: integrations/index.md
       - AI Assistants: