From e79fe2a4f19c341c30ee46f6438c83be593aaa3d Mon Sep 17 00:00:00 2001 From: Thomas Payet Date: Tue, 24 Jun 2025 13:55:28 +0200 Subject: [PATCH 1/3] Improve chat API documentation clarity MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Based on customer support feedback, this commit enhances the chat completions documentation with: - Added link to index conversation settings for better content optimization - Clarified that PATCH /chats/{workspace}/settings automatically creates workspaces - Added explicit privacy section stating no conversation data is stored - Improved workspace explanation in getting started guide - Added conversation management example showing stateless nature - Made documentation more welcoming and easier to understand These changes address common customer questions and make the complex chat feature more approachable. 🤖 Generated with [Claude Code](https://claude.ai/code) Co-Authored-By: Claude --- guides/ai/getting_started_with_chat.mdx | 76 +++++++++++++++++++++++++ reference/api/chats.mdx | 29 +++++++++- 2 files changed, 104 insertions(+), 1 deletion(-) diff --git a/guides/ai/getting_started_with_chat.mdx b/guides/ai/getting_started_with_chat.mdx index efb2a5430..12d6365e1 100644 --- a/guides/ai/getting_started_with_chat.mdx +++ b/guides/ai/getting_started_with_chat.mdx @@ -13,6 +13,7 @@ The chat completions feature is experimental and must be enabled before use. See ## Prerequisites Before starting, ensure you have: + - Meilisearch instance running (v1.15.1 or later) - An API key from an LLM provider (OpenAI, Azure OpenAI, Mistral, Gemini, or access to a vLLM server) - At least one index with searchable content @@ -20,6 +21,16 @@ Before starting, ensure you have: ## Quick start +### Understanding workspaces + +Think of workspaces as different "assistants" you can create for various purposes. Each workspace can have its own personality (system prompt) and capabilities. The best part? **Workspaces are created automatically** when you configure them – no separate creation step needed! + +For example: + +- `customer-support` - A helpful assistant for customer queries +- `product-search` - An expert at finding the perfect product +- `docs-helper` - A technical assistant for documentation + ### Enable the chat completions feature First, enable the chat completions experimental feature: @@ -143,6 +154,7 @@ Workspaces allow you to create isolated chat configurations for different use ca - **Documentation**: Tune for technical Q&A Each workspace maintains its own: + - LLM provider configuration - System prompt @@ -267,6 +279,70 @@ except Exception as error: +## Managing conversations + +Since Meilisearch keeps your data private and doesn't store conversations, you'll need to manage conversation history in your application. Here's a simple approach: + + + +```javascript JavaScript +// Store conversation history in your app +const conversation = []; + +// Add user message +conversation.push({ role: 'user', content: 'What is Meilisearch?' 
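+// (Reminder: Meilisearch keeps no conversation state between requests,
+//  so this client-side array is the only record of the chat history.)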
}); + +// Get response and add to history +const response = await client.chat.completions.create({ + model: 'gpt-3.5-turbo', + messages: conversation, + stream: true, +}); + +// Add assistant response to history +let assistantMessage = ''; +for await (const chunk of response) { + assistantMessage += chunk.choices[0]?.delta?.content || ''; +} +conversation.push({ role: 'assistant', content: assistantMessage }); + +// Use the full conversation for follow-up questions +conversation.push({ role: 'user', content: 'Can it handle typos?' }); +// ... continue the conversation +``` + +```python Python +# Store conversation history in your app +conversation = [] + +# Add user message +conversation.append({"role": "user", "content": "What is Meilisearch?"}) + +# Get response and add to history +response = client.chat.completions.create( + model="gpt-3.5-turbo", + messages=conversation, + stream=True, +) + +# Add assistant response to history +assistant_message = "" +for chunk in response: + if chunk.choices[0].delta.content is not None: + assistant_message += chunk.choices[0].delta.content +conversation.append({"role": "assistant", "content": assistant_message}) + +# Use the full conversation for follow-up questions +conversation.append({"role": "user", "content": "Can it handle typos?"}) +# ... continue the conversation +``` + + + + +Remember: Each request is independent, so always send the full conversation history if you want the AI to remember previous exchanges. + + ## Next steps - Explore [advanced chat API features](/reference/api/chats) diff --git a/reference/api/chats.mdx b/reference/api/chats.mdx index 802dbe7be..22f016976 100644 --- a/reference/api/chats.mdx +++ b/reference/api/chats.mdx @@ -8,6 +8,10 @@ import { RouteHighlighter } from '/snippets/route_highlighter.mdx'; The `/chats` route enables AI-powered conversational search by integrating Large Language Models (LLMs) with your Meilisearch data. This feature allows users to ask questions in natural language and receive contextual answers based on your indexed content. + +To optimize how your content is presented to the LLM, configure the [conversation settings for each index](/reference/api/settings#conversation). This allows you to customize descriptions, document templates, and search parameters for better AI responses. + + This is an experimental feature. Use the Meilisearch Cloud UI or the experimental features endpoint to activate it: @@ -19,6 +23,7 @@ curl \ "chatCompletions": true }' ``` + ## Chat completions workspace object @@ -39,6 +44,10 @@ curl \ Configure the LLM provider and settings for a chat workspace. + +If the specified workspace doesn't exist, this endpoint will automatically create it for you. No need to explicitly create workspaces beforehand! + + ```json { "source": "openAi", @@ -82,7 +91,6 @@ Configure the LLM provider and settings for a chat workspace. | **`searchQParam`** | String | A prompt to explain what the `q` parameter of the search function does and how to use it | | **`searchIndexUidParam`** | String | A prompt to explain what the `indexUid` parameter of the search function does and how to use it | - ### Request body ```json @@ -391,6 +399,19 @@ curl \ } ``` +## Privacy and data storage + + +🔒 **Your conversations are private**: Meilisearch does not store any conversation history or context between requests. Each chat completion request is stateless and independent. Any conversation continuity must be managed by your application. 
+ + +This design ensures: + +- Complete privacy of user conversations +- No data retention of questions or answers +- Full control over conversation history in your application +- Compliance with data protection regulations + ## Authentication The chat feature integrates with Meilisearch's authentication system: @@ -549,11 +570,13 @@ This tool reports real-time progress of internal search operations. When declare **Purpose**: Provides transparency about search operations and reduces perceived latency by showing users what's happening behind the scenes. **Arguments**: + - `call_id`: Unique identifier to track the search operation - `function_name`: Name of the internal function being executed (e.g., "_meiliSearchInIndex") - `function_parameters`: JSON-encoded string containing search parameters like `q` (query) and `index_uid` **Example Response**: + ```json { "function": { @@ -570,12 +593,14 @@ Since the `/chats/{workspace}/chat/completions` endpoint is stateless, this tool **Purpose**: Maintains conversation context for better response quality in subsequent requests by preserving tool calls and results. **Arguments**: + - `role`: Message author role ("user" or "assistant") - `content`: Message content (for tool results) - `tool_calls`: Array of tool calls made by the assistant - `tool_call_id`: ID of the tool call this message responds to **Example Response**: + ```json { "function": { @@ -592,10 +617,12 @@ This tool provides the source documents that were used by the LLM to generate re **Purpose**: Shows users which documents were used to generate responses, improving trust and enabling source verification. **Arguments**: + - `call_id`: Matches the `call_id` from `_meiliSearchProgress` to associate queries with results - `documents`: JSON object containing the source documents with only displayed attributes **Example Response**: + ```json { "function": { From 32df4c30b4d5ca52edb2aa7394ca6e3cb63eb715 Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" Date: Tue, 24 Jun 2025 11:55:54 +0000 Subject: [PATCH 2/3] Update code samples [skip ci] --- snippets/samples/code_samples_typo_tolerance_guide_5.mdx | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/snippets/samples/code_samples_typo_tolerance_guide_5.mdx b/snippets/samples/code_samples_typo_tolerance_guide_5.mdx index 8f5efdef6..6733ece79 100644 --- a/snippets/samples/code_samples_typo_tolerance_guide_5.mdx +++ b/snippets/samples/code_samples_typo_tolerance_guide_5.mdx @@ -15,6 +15,12 @@ client.index('movies').updateTypoTolerance({ }) ``` +```python Python +client.index('movies').update_typo_tolerance({ + 'disableOnNumbers': True +}) +``` + ```php PHP $client->index('movies')->updateTypoTolerance([ 'disableOnNumbers' => true From 8c943351c85e517f703a089e18a28efe4ffb1b7d Mon Sep 17 00:00:00 2001 From: Thomas Payet Date: Tue, 24 Jun 2025 14:08:30 +0200 Subject: [PATCH 3/3] Fix duplicate section and replace Capsule with Note component MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - Removed duplicate "Understanding workspaces" section in getting_started_with_chat.mdx - Replaced non-standard Capsule component with built-in Note component in chats.mdx These changes address PR feedback to prevent duplicate anchors and ensure compatibility with the documentation build system. 
🤖 Generated with [Claude Code](https://claude.ai/code) Co-Authored-By: Claude --- guides/ai/getting_started_with_chat.mdx | 13 ------------- reference/api/chats.mdx | 4 ++-- 2 files changed, 2 insertions(+), 15 deletions(-) diff --git a/guides/ai/getting_started_with_chat.mdx b/guides/ai/getting_started_with_chat.mdx index 12d6365e1..65bf05027 100644 --- a/guides/ai/getting_started_with_chat.mdx +++ b/guides/ai/getting_started_with_chat.mdx @@ -145,19 +145,6 @@ curl \ }' ``` -## Understanding workspaces - -Workspaces allow you to create isolated chat configurations for different use cases: - -- **Customer support**: Configure with support-focused prompts -- **Product search**: Optimize for e-commerce queries -- **Documentation**: Tune for technical Q&A - -Each workspace maintains its own: - -- LLM provider configuration -- System prompt - ## Building a chat interface with OpenAI SDK Since Meilisearch's chat endpoint is OpenAI-compatible, you can use the official OpenAI SDK: diff --git a/reference/api/chats.mdx b/reference/api/chats.mdx index 22f016976..00e46b4a6 100644 --- a/reference/api/chats.mdx +++ b/reference/api/chats.mdx @@ -401,9 +401,9 @@ curl \ ## Privacy and data storage - + 🔒 **Your conversations are private**: Meilisearch does not store any conversation history or context between requests. Each chat completion request is stateless and independent. Any conversation continuity must be managed by your application. - + This design ensures: