docs: consolidate agent playbook and LLM reference pages #112
Merged
---
id: llm-api-reference
title: Building with AI Agents
slug: /api/llm-api-reference
---

# Building with AI Agents

This guide is for **developers** who want to integrate AI agents or LLMs with the Blink API. It covers agent discovery, the machine-readable schema formats you can feed to your agent, and practical integration patterns.

:::tip For AI agents
If you are an AI agent, skip this page and follow the [AI Agent API Playbook](/api/agent-playbook) directly.
:::

## Agent Discovery with llms.txt

The site publishes an [`llms.txt`](https://dev.blink.sv/llms.txt) file at the root. This lightweight, structured file gives any agent the endpoints, canonical source URLs, and hard rules in a single fetch:

```bash
curl -s https://dev.blink.sv/llms.txt
```

Point your agent at this URL first. It contains everything needed to bootstrap: the production/staging GraphQL endpoints, the auth header name, a prioritized source list, and the safety rules.
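
As a quick illustration, a few lines of Python can pull the linked sources out of an llms.txt-style file. The sample content and helper below are illustrative assumptions following the common llms.txt layout, not the actual contents of Blink's llms.txt:

```python
import re

# Hypothetical llms.txt-style content; the real file lives at
# https://dev.blink.sv/llms.txt and may be structured differently.
sample = """# Blink API
> GraphQL API for Bitcoin payments.

## Docs
- [Agent Playbook](https://dev.blink.sv/api/agent-playbook): required agent workflow
- [Schema JSON](https://dev.blink.sv/reference/graphql-api-for-llm.json): machine-readable schema
"""

def extract_links(text):
    # llms.txt convention lists sources as markdown links: "- [title](url): note"
    return re.findall(r"-\s*\[([^\]]+)\]\(([^)]+)\)", text)

for title, url in extract_links(sample):
    print(f"{title}: {url}")
```

An agent bootstrapping itself can fetch each extracted URL in the listed priority order.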

## Machine-Readable Schema Downloads

We provide the Blink GraphQL API schema in formats optimized for LLM consumption:

| Format | Best for | Download |
|--------|----------|----------|
| Enhanced LLM-friendly JSON | Most LLM apps, code generation | <a href="/reference/graphql-api-for-llm.json" download>graphql-api-for-llm.json</a> |
| OpenAPI specification | Function calling, automated integrations | <a href="/reference/graphql-openapi.json" download>graphql-openapi.json</a> |
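
For the function-calling use case, each OpenAPI operation can be mapped to a generic tool definition. The operation below is a hypothetical stand-in (the real operation names and request schemas come from `graphql-openapi.json`), and the target shape follows the widespread name/description/parameters tool convention:

```python
# NOTE: this operation is a made-up example; in practice, load the real
# operations from https://dev.blink.sv/reference/graphql-openapi.json.
sample_operation = {
    "operationId": "lnInvoiceCreate",
    "summary": "Create a lightning invoice for the given wallet",
    "requestBody": {
        "content": {
            "application/json": {
                "schema": {
                    "type": "object",
                    "properties": {
                        "walletId": {"type": "string"},
                        "amount": {"type": "integer"},
                    },
                    "required": ["walletId", "amount"],
                }
            }
        }
    },
}

def to_tool(op):
    """Convert one OpenAPI-style operation into a name/description/parameters dict."""
    content = op.get("requestBody", {}).get("content", {})
    schema = content.get("application/json", {}).get("schema", {"type": "object"})
    return {
        "name": op["operationId"],
        "description": op.get("summary", ""),
        "parameters": schema,
    }

tool = to_tool(sample_operation)
```

A list of such dicts can then be passed to any provider that accepts JSON-schema tool definitions.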

## Integration Patterns

### Pattern 1: System prompt + schema (framework-agnostic)

The simplest approach is to fetch the schema and playbook once, then include them in the system prompt of any LLM:

```bash
# Fetch the schema and playbook once
curl -s https://dev.blink.sv/reference/graphql-api-for-llm.json -o blink-schema.json
curl -s https://dev.blink.sv/api/agent-playbook -o playbook.md
```

Then in your system prompt:

```
You are a Blink API assistant. Follow the playbook rules below strictly.

<playbook>
{contents of playbook.md}
</playbook>

<api_schema>
{contents of blink-schema.json}
</api_schema>
```

This works with any LLM provider: OpenAI, Anthropic, open-source models, and so on.
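
One practical caveat: the enhanced schema JSON can be large, so it is worth sanity-checking the rough prompt size before each request. A minimal sketch, assuming the two files have already been fetched; the placeholder strings stand in for the real contents, and the 4-characters-per-token ratio is a crude heuristic, not an exact tokenizer:

```python
# Placeholders for the fetched playbook.md and blink-schema.json contents.
playbook = "...playbook contents..."
schema_json = "...schema contents..."

def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English/JSON text.
    return len(text) // 4

prompt = (
    f"<playbook>\n{playbook}\n</playbook>\n"
    f"<api_schema>\n{schema_json}\n</api_schema>"
)
budget = 100_000  # assumed context budget; adjust for your model

if estimate_tokens(prompt) > budget:
    raise ValueError("Schema too large for the model's context window")
```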

### Pattern 2: Python with httpx

```python
import httpx
import json

# Fetch the schema and playbook once at startup
schema = httpx.get("https://dev.blink.sv/reference/graphql-api-for-llm.json").json()
playbook = httpx.get("https://dev.blink.sv/llms.txt").text

system_prompt = f"""
You are a Blink API assistant.
Follow these rules:
{playbook}

API schema:
{json.dumps(schema, indent=2)}
"""

# Use with any LLM client, for example:
# client.chat(system=system_prompt, messages=[...])
```
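
To hand this prompt to a chat-style API, wrap it in the usual messages structure. The shape below follows the widespread role/content convention; the client call is left as a comment because it is provider-specific:

```python
# Build a chat request body around the system prompt. The string below is a
# stand-in so the example is self-contained; use the real system_prompt
# assembled from the playbook and schema.
system_prompt = "You are a Blink API assistant. ..."

def build_messages(system_prompt: str, user_question: str) -> list:
    # Standard role/content message list accepted by most chat APIs.
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_question},
    ]

messages = build_messages(system_prompt, "How do I create a lightning invoice?")
# response = client.chat.completions.create(model="...", messages=messages)
```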

### Pattern 3: Node.js

```javascript
// Run as an ES module (top-level await)
const schema = await fetch('https://dev.blink.sv/reference/graphql-api-for-llm.json').then(r => r.json());
const playbook = await fetch('https://dev.blink.sv/llms.txt').then(r => r.text());

const systemPrompt = `
You are a Blink API assistant.
Follow these rules:
${playbook}

API schema:
${JSON.stringify(schema, null, 2)}
`;

// Use with any LLM SDK, e.g. OpenAI, Anthropic, etc.
```

## Generating Updated Schemas

The API reference files are automatically updated when the GraphQL schema changes. To generate them manually:

```bash
# Generate all formats
./scripts/generate-api-reference-combined.sh
```

## Additional Resources

- [AI Agent API Playbook](/api/agent-playbook) - Concise instruction set consumed directly by AI agents
- [No API Key Operations](/api/no-api-key-operations) - Unauthenticated query/mutation discovery
- [GraphQL Introduction](/api/graphql-intro) - Learn the basics of the GraphQL API
- [Authentication](/api/auth) - How to authenticate with the Blink API
- [Postman Collection](/api/postman) - Test the API interactively