From e491574e3575921021abbf7beeafd293d8e73453 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Tue, 23 Dec 2025 14:04:40 +0100 Subject: [PATCH 01/19] docs: add community plugins docs --- docs/community-adapters/guide.md | 122 +++++++++++++++++++++++++++++++ docs/config.json | 9 +++ 2 files changed, 131 insertions(+) create mode 100644 docs/community-adapters/guide.md diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md new file mode 100644 index 00000000..a648d82b --- /dev/null +++ b/docs/community-adapters/guide.md @@ -0,0 +1,122 @@ +--- +title: "Community Adapters Guide" +slug: /community-adapters/guide +order: 1 +--- + +# Community Adapters Guide + +In this guide, you'll learn how to create and contribute community adapters for the TanStack AI ecosystem. Community adapters allow you to extend the functionality of TanStack AI by integrating with various services, APIs, or custom logic. + +## What is a Community Adapter? + +A community adapter is a reusable module that connects TanStack AI with external services or APIs. These adapters can handle tasks such as connecting to different AI models, managing data sources, or implementing custom tools. + +These are not maintained by the core TanStack AI team but are contributed by the community. They can be shared and reused across different projects. + +## Creating a Community Adapter + +To create a community adapter, follow these steps: +1. **Set Up Your Project**: The best way to do that is to check out our internal adapter implementations in the [TanStack AI GitHub repository](https://github.com/tanstack/ai/tree/main/packages/typescript). You can use these as a reference for your own adapter and the most detailed implementation is the [OpenAI adapter](https://github.com/tanstack/ai/tree/main/packages/typescript/ai-openai). +2. **Implement the model metadata**: Check out how we define the adapter metadata in the [OpenAI model metadata](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/model-meta.ts). This includes defining the models name, +input and output modality support, what features are supported (like streaming, tools, etc), costs (if known), and any other relevant information. +3. **Define the model per functionality arrays**: After you define the model metadata, you need to implement arrays for different functionalities the model supports. Generally you want to do something like this: +```typescript +export const OPENAI_CHAT_MODELS = [ + // Frontier models + GPT5_2.name, + GPT5_2_PRO.name, + GPT5_2_CHAT.name, + GPT5_1.name, + GPT5_1_CODEX.name, + GPT5.name, + GPT5_MINI.name, + GPT5_NANO.name, + GPT5_PRO.name, + GPT5_CODEX.name, + // ...other models +] as const +export const OPENAI_IMAGE_MODELS = [ + GPT_IMAGE_1.name, + GPT_IMAGE_1_MINI.name, + DALL_E_3.name, + DALL_E_2.name, +] as const + +export const OPENAI_VIDEO_MODELS = [SORA2.name, SORA2_PRO.name] as const +``` +4. **Define the model provider options**: Every model has different configuration options that users can set when using the model. After you define the model metadata, you need to implement the provider options the model supports. 
Generally you want to do something like this: +```typescript +export type OpenAIChatModelProviderOptionsByName = { + [GPT5_2.name]: OpenAIBaseOptions & + OpenAIReasoningOptions & + OpenAIStructuredOutputOptions & + OpenAIToolsOptions & + OpenAIStreamingOptions & + OpenAIMetadataOptions + [GPT5_2_CHAT.name]: OpenAIBaseOptions & + OpenAIReasoningOptions & + OpenAIStructuredOutputOptions & + OpenAIToolsOptions & + OpenAIStreamingOptions & + OpenAIMetadataOptions + // ... repeat for each model +} + +``` +5. **Define the model input modalities**: Every model usually supports different input modalities (like text, images, etc). After you define the model metadata, you need to implement the input modalities the model supports. Generally you want to +do something like this: +```typescript +export type OpenAIModelInputModalitiesByName = { + [GPT5_2.name]: typeof GPT5_2.supports.input + [GPT5_2_PRO.name]: typeof GPT5_2_PRO.supports.input + [GPT5_2_CHAT.name]: typeof GPT5_2_CHAT.supports.input + // ... repeat for each model +} +``` +6. **Define your model options**: After you define the model metadata, you need to implement the model options the model supports. Generally you want to do something like this (you can see an example for OpenAI models [here](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/text/text-provider-options.ts)): +```typescript +export interface OpenAIBaseOptions { + // base options that every chat model supports +} + +// Feature fragments that can be stitched per-model + +/** + * Reasoning options for models + */ +export interface OpenAIReasoningOptions { + //... +} + +/** + * Structured output options for models. + */ +export interface OpenAIStructuredOutputOptions { + //... +} +``` + +What you are going for is very specific to the adapter you are building so there is no one-size-fits-all example here. +But the general rule of thumb is that you have the base options that every model supports and then you have feature fragments that can be stitched together per-model. + +Here's an example of one of the gpt models that supports every feature: +```typescript +export type OpenAIChatModelProviderOptionsByName = { + [GPT5_2.name]: OpenAIBaseOptions & + OpenAIReasoningOptions & + OpenAIStructuredOutputOptions & + OpenAIToolsOptions & + OpenAIStreamingOptions & + OpenAIMetadataOptions +} +``` + +7. **Implement the model logic**: Finally, you need to implement the actual logic for your adapter. This includes handling requests to the external service, processing responses, and integrating with TanStack AI's APIs. You can refer to the [OpenAI adapter implementation](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/adapters/text.ts) for a detailed example. What's important here is to make sure you properly type the inputs and outputs based on the model options you defined earlier. Every functionality is split per adapter, so make sure to implement the text adapter, chat adapter, image adapter, etc as needed. + +8. **Publish your package and open up a PR**: Once your adapter is complete and tested, you can publish it as an npm package and open up a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls) to have it listed in the community adapters section of the documentation. Add it into [here](https://github.com/TanStack/ai/tree/main/docs/community-adapters). + +9. 
**Run the script to configure the docs**: After adding your adapter in make sure to run `pnpm run sync-docs-config` in the root of the TanStack AI monorepo to have your adapter show up in the docs, after that open up a PR to have it merged +and that is it! + +10. **Maintain your adapter**: As a community adapter, it's important to keep your adapter up-to-date with any changes in the external service or TanStack AI's APIs. Monitor issues and feedback from users to ensure your adapter remains functional and useful. If you add any changes or new features open up a new PR towards the docs to have them reflected there as well. \ No newline at end of file diff --git a/docs/config.json b/docs/config.json index 375489c0..51f7caf0 100644 --- a/docs/config.json +++ b/docs/config.json @@ -142,6 +142,15 @@ } ] }, + { + "label": "Community Adapters", + "children": [ + { + "label": "Community Adapters Guide", + "to": "community-adapters/guide" + } + ] + }, { "label": "Class References", "collapsible": true, From 94587f6305c098d532623064bc57c42b7a8408e2 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:46:57 +0100 Subject: [PATCH 02/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index a648d82b..d942b95a 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -6,7 +6,9 @@ order: 1 # Community Adapters Guide -In this guide, you'll learn how to create and contribute community adapters for the TanStack AI ecosystem. Community adapters allow you to extend the functionality of TanStack AI by integrating with various services, APIs, or custom logic. +This guide explains how to create and contribute community adapters for the TanStack AI ecosystem. + +Community adapters extend TanStack AI by integrating external services, APIs, or custom model logic. They are authored and maintained by the community and can be reused across projects. ## What is a Community Adapter? From cb5c4e4b63f93fd91e6a29be019a5c94cfe3f530 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:47:05 +0100 Subject: [PATCH 03/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 10 ++++++++-- 1 file changed, 8 insertions(+), 2 deletions(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index d942b95a..8fc9a579 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -12,9 +12,15 @@ Community adapters extend TanStack AI by integrating external services, APIs, or ## What is a Community Adapter? -A community adapter is a reusable module that connects TanStack AI with external services or APIs. These adapters can handle tasks such as connecting to different AI models, managing data sources, or implementing custom tools. +A community adapter is a reusable module that connects TanStack AI to an external provider or system. -These are not maintained by the core TanStack AI team but are contributed by the community. They can be shared and reused across different projects. +Common use cases include: +- Integrating third-party AI model providers +- Implementing custom inference or routing logic +- Exposing provider-specific tools or capabilities +- Connecting to non-LLM AI services (e.g. 
images, embeddings, video) + +Community adapters are **not maintained by the core TanStack AI team**, but can be and reused across different projects. ## Creating a Community Adapter From 265cda16afa9ed6d1015e7ac5e7e81a43ab5f651 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:47:12 +0100 Subject: [PATCH 04/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 8fc9a579..0d342abe 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -25,7 +25,12 @@ Community adapters are **not maintained by the core TanStack AI team**, but can ## Creating a Community Adapter To create a community adapter, follow these steps: -1. **Set Up Your Project**: The best way to do that is to check out our internal adapter implementations in the [TanStack AI GitHub repository](https://github.com/tanstack/ai/tree/main/packages/typescript). You can use these as a reference for your own adapter and the most detailed implementation is the [OpenAI adapter](https://github.com/tanstack/ai/tree/main/packages/typescript/ai-openai). +### 1. Set up your project + +Start by reviewing the [existing internal adapter implementations in the TanStack AI GitHub repository](https://github.com/tanstack/ai/tree/main/packages/typescript). These define the expected structure, conventions, and integration patterns. + +For a complete, detailed reference, use the [OpenAI adapter]((https://github.com/tanstack/ai/tree/main/packages/typescript/ai-openai), which is the most fully featured implementation. + 2. **Implement the model metadata**: Check out how we define the adapter metadata in the [OpenAI model metadata](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/model-meta.ts). This includes defining the models name, input and output modality support, what features are supported (like streaming, tools, etc), costs (if known), and any other relevant information. 3. **Define the model per functionality arrays**: After you define the model metadata, you need to implement arrays for different functionalities the model supports. Generally you want to do something like this: From 2e36caf6f72cfcb25e0425b3d52c950a3e48254c Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:47:17 +0100 Subject: [PATCH 05/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 16 ++++++++++++++-- 1 file changed, 14 insertions(+), 2 deletions(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 0d342abe..ecf99630 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -31,8 +31,20 @@ Start by reviewing the [existing internal adapter implementations in the TanStac For a complete, detailed reference, use the [OpenAI adapter]((https://github.com/tanstack/ai/tree/main/packages/typescript/ai-openai), which is the most fully featured implementation. -2. **Implement the model metadata**: Check out how we define the adapter metadata in the [OpenAI model metadata](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/model-meta.ts). This includes defining the models name, -input and output modality support, what features are supported (like streaming, tools, etc), costs (if known), and any other relevant information. 
+### Define model metadata + +Model metadata describes each model’s capabilities and constraints and is used by TanStack AI for compatibility checks and feature selection. + +Your metadata should define, at a minimum: + +- Model name and identifier +- Supported input and output modalities +- Supported features (e.g. streaming, tools, structured output) +- Pricing or cost information (if available) +- Any provider-specific notes or limitations + +Refer to the [OpenAI adapter’s model metadata](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/model-meta.ts) for a concrete example. + 3. **Define the model per functionality arrays**: After you define the model metadata, you need to implement arrays for different functionalities the model supports. Generally you want to do something like this: ```typescript export const OPENAI_CHAT_MODELS = [ From 9bf2ec405264f582541c3a4eda7d36a7d6bcac8f Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:47:28 +0100 Subject: [PATCH 06/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index ecf99630..e155ad42 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -137,6 +137,8 @@ export type OpenAIChatModelProviderOptionsByName = { } ``` +There is no single correct composition; this structure should reflect the capabilities of the provider you are integrating. + 7. **Implement the model logic**: Finally, you need to implement the actual logic for your adapter. This includes handling requests to the external service, processing responses, and integrating with TanStack AI's APIs. You can refer to the [OpenAI adapter implementation](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/adapters/text.ts) for a detailed example. What's important here is to make sure you properly type the inputs and outputs based on the model options you defined earlier. Every functionality is split per adapter, so make sure to implement the text adapter, chat adapter, image adapter, etc as needed. 8. **Publish your package and open up a PR**: Once your adapter is complete and tested, you can publish it as an npm package and open up a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls) to have it listed in the community adapters section of the documentation. Add it into [here](https://github.com/TanStack/ai/tree/main/docs/community-adapters). From a05a29e1584d29dc43f8d30c0666640c108657d6 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:47:35 +0100 Subject: [PATCH 07/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 20 +++++++++++++++++++- 1 file changed, 19 insertions(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index e155ad42..6882d666 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -139,7 +139,25 @@ export type OpenAIChatModelProviderOptionsByName = { There is no single correct composition; this structure should reflect the capabilities of the provider you are integrating. -7. **Implement the model logic**: Finally, you need to implement the actual logic for your adapter. This includes handling requests to the external service, processing responses, and integrating with TanStack AI's APIs. 
You can refer to the [OpenAI adapter implementation](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/adapters/text.ts) for a detailed example. What's important here is to make sure you properly type the inputs and outputs based on the model options you defined earlier. Every functionality is split per adapter, so make sure to implement the text adapter, chat adapter, image adapter, etc as needed. +### 7. Implement adapter logic + +Finally, implement the adapter’s runtime logic. + +This includes: +- Sending requests to the external service +- Handling streaming and non-streaming responses +- Mapping provider responses to TanStack AI types +- Enforcing model-specific options and constraints + +Adapters are implemented per capability, so only implement what your provider supports: + +- Text adapter +- Chat adapter +- Image adapter +- Embeddings adapter +- Video adapter + +Refer to the [OpenAI adapter](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/adapters/text.ts) for a complete, end-to-end implementation example. 8. **Publish your package and open up a PR**: Once your adapter is complete and tested, you can publish it as an npm package and open up a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls) to have it listed in the community adapters section of the documentation. Add it into [here](https://github.com/TanStack/ai/tree/main/docs/community-adapters). From 43d5b4a7228389ad40f3ade538f797f04610e9c9 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:47:42 +0100 Subject: [PATCH 08/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 6882d666..3a217dbb 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -159,7 +159,12 @@ Adapters are implemented per capability, so only implement what your provider su Refer to the [OpenAI adapter](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/adapters/text.ts) for a complete, end-to-end implementation example. -8. **Publish your package and open up a PR**: Once your adapter is complete and tested, you can publish it as an npm package and open up a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls) to have it listed in the community adapters section of the documentation. Add it into [here](https://github.com/TanStack/ai/tree/main/docs/community-adapters). +### 8. Publish and submit a PR + +Once your adapter is complete: +1. Publish it as an npm package +2. Open a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls) +3. Add your adapter to the [Community Adapters list in the documentation](https://github.com/TanStack/ai/tree/main/docs/community-adapters) 9. **Run the script to configure the docs**: After adding your adapter in make sure to run `pnpm run sync-docs-config` in the root of the TanStack AI monorepo to have your adapter show up in the docs, after that open up a PR to have it merged and that is it! 
From 9546d8313e62c956ee5874ce25e650b8aeaefe3b Mon Sep 17 00:00:00 2001
From: Alem Tuzlak
Date: Wed, 24 Dec 2025 11:47:50 +0100
Subject: [PATCH 09/19] Update docs/community-adapters/guide.md

Co-authored-by: Sarah
---
 docs/community-adapters/guide.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md
index 3a217dbb..8cc202e2 100644
--- a/docs/community-adapters/guide.md
+++ b/docs/community-adapters/guide.md
@@ -24,7 +24,8 @@

 ## Creating a Community Adapter

-To create a community adapter, follow these steps:
+Follow the steps below to build a well-structured, type-safe adapter.
+
 ### 1. Set up your project

From 764805843e80c007cc851667188558b2266c2998 Mon Sep 17 00:00:00 2001
From: Alem Tuzlak
Date: Wed, 24 Dec 2025 11:47:55 +0100
Subject: [PATCH 10/19] Update docs/community-adapters/guide.md

Co-authored-by: Sarah
---
 docs/community-adapters/guide.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md
index 8cc202e2..fc78755d 100644
--- a/docs/community-adapters/guide.md
+++ b/docs/community-adapters/guide.md
@@ -167,7 +167,8 @@ Once your adapter is complete:
 2. Open a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls)
 3. Add your adapter to the [Community Adapters list in the documentation](https://github.com/TanStack/ai/tree/main/docs/community-adapters)

-9. **Run the script to configure the docs**: After adding your adapter in make sure to run `pnpm run sync-docs-config` in the root of the TanStack AI monorepo to have your adapter show up in the docs, after that open up a PR to have it merged
-and that is it!
+### 9. Sync documentation configuration
+
+After adding your adapter, run `pnpm run sync-docs-config` in the root of the TanStack AI monorepo. This ensures your adapter appears correctly in the documentation navigation. Open a PR with the generated changes.

 10. **Maintain your adapter**: As a community adapter, it's important to keep your adapter up-to-date with any changes in the external service or TanStack AI's APIs. Monitor issues and feedback from users to ensure your adapter remains functional and useful. If you add any changes or new features open up a new PR towards the docs to have them reflected there as well.
\ No newline at end of file

From 867d413c80094cefc8f1057eb61b91bbfcacfa18 Mon Sep 17 00:00:00 2001
From: Alem Tuzlak
Date: Wed, 24 Dec 2025 11:48:00 +0100
Subject: [PATCH 11/19] Update docs/community-adapters/guide.md

Co-authored-by: Sarah
---
 docs/community-adapters/guide.md | 13 ++++++++++++-
 1 file changed, 12 insertions(+), 1 deletion(-)

diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md
index fc78755d..f75500bb 100644
--- a/docs/community-adapters/guide.md
+++ b/docs/community-adapters/guide.md
@@ -171,4 +171,15 @@

 After adding your adapter, run `pnpm run sync-docs-config` in the root of the TanStack AI monorepo. This ensures your adapter appears correctly in the documentation navigation. Open a PR with the generated changes.

-10. 
**Maintain your adapter**: As a community adapter, it's important to keep your adapter up-to-date with any changes in the external service or TanStack AI's APIs. Monitor issues and feedback from users to ensure your adapter remains functional and useful. If you add any changes or new features open up a new PR towards the docs to have them reflected there as well. \ No newline at end of file +### 10. Maintain your adapter + +As a community adapter author, you are responsible for ongoing maintenance. + +This includes: + +- Tracking upstream provider API changes +- Keeping compatibility with TanStack AI releases +- Addressing issues and feedback from users +- Updating documentation when features change + +If you add new features or breaking changes, open a follow-up PR to keep the docs in sync. \ No newline at end of file From 7d410fb1db25e7d7d212a4625ec9ae5ca5463279 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:48:09 +0100 Subject: [PATCH 12/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 6 +++++- 1 file changed, 5 insertions(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index f75500bb..8cc23818 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -46,7 +46,11 @@ Your metadata should define, at a minimum: Refer to the [OpenAI adapter’s model metadata](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/model-meta.ts) for a concrete example. -3. **Define the model per functionality arrays**: After you define the model metadata, you need to implement arrays for different functionalities the model supports. Generally you want to do something like this: +### 3. Define model capability arrays + +After defining metadata, group models by supported functionality using exported arrays. These arrays allow TanStack AI to automatically select compatible models for a given task. + +Example: ```typescript export const OPENAI_CHAT_MODELS = [ // Frontier models From 87edd1f5c2cdea22087a3d9511476d026611a07d Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:48:18 +0100 Subject: [PATCH 13/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 8 +++++++- 1 file changed, 7 insertions(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 8cc23818..125eb031 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -75,7 +75,13 @@ export const OPENAI_IMAGE_MODELS = [ export const OPENAI_VIDEO_MODELS = [SORA2.name, SORA2_PRO.name] as const ``` -4. **Define the model provider options**: Every model has different configuration options that users can set when using the model. After you define the model metadata, you need to implement the provider options the model supports. Generally you want to do something like this: +Each array should only include models that fully support the associated functionality. + +### 4. Define model provider options + +Each model exposes a different set of configurable options. These options must be typed per model name so that users only see valid configuration options. 
+ +Example: ```typescript export type OpenAIChatModelProviderOptionsByName = { [GPT5_2.name]: OpenAIBaseOptions & From 8c02442af50319606d8942d7f5f5f97f7f4c001b Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:48:27 +0100 Subject: [PATCH 14/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 9 +++++++-- 1 file changed, 7 insertions(+), 2 deletions(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 125eb031..b71c40ba 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -100,8 +100,13 @@ export type OpenAIChatModelProviderOptionsByName = { } ``` -5. **Define the model input modalities**: Every model usually supports different input modalities (like text, images, etc). After you define the model metadata, you need to implement the input modalities the model supports. Generally you want to -do something like this: +This ensures strict type safety and feature correctness at compile time. + +### 5. Define supported input modalities + +Models typically support different input modalities (e.g. text, images, audio). These must be defined per model to prevent invalid usage. + +Example: ```typescript export type OpenAIModelInputModalitiesByName = { [GPT5_2.name]: typeof GPT5_2.supports.input From bd3df0411d936bec69d4f89c4d247984874227ab Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:48:35 +0100 Subject: [PATCH 15/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 11 ++++++++++- 1 file changed, 10 insertions(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index b71c40ba..2eaed17d 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -115,7 +115,16 @@ export type OpenAIModelInputModalitiesByName = { // ... repeat for each model } ``` -6. **Define your model options**: After you define the model metadata, you need to implement the model options the model supports. Generally you want to do something like this (you can see an example for OpenAI models [here](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/text/text-provider-options.ts)): + +### 6. Define model option fragments + +Model options should be composed from reusable fragments rather than duplicated per model. + +A common pattern is: +- Base options shared by all models +- Feature fragments that are stitched together per model + +Example (based on [OpenAI models](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/text/text-provider-options.ts)): ```typescript export interface OpenAIBaseOptions { // base options that every chat model supports From 8da35746e3b4bdbd44d1ddf6b60aea7877e589e7 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 11:48:43 +0100 Subject: [PATCH 16/19] Update docs/community-adapters/guide.md Co-authored-by: Sarah --- docs/community-adapters/guide.md | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 2eaed17d..889e2cdb 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -147,10 +147,9 @@ export interface OpenAIStructuredOutputOptions { } ``` -What you are going for is very specific to the adapter you are building so there is no one-size-fits-all example here. 
-But the general rule of thumb is that you have the base options that every model supports and then you have feature fragments that can be stitched together per-model. -Here's an example of one of the gpt models that supports every feature: +Models can then opt into only the features they support: + ```typescript export type OpenAIChatModelProviderOptionsByName = { [GPT5_2.name]: OpenAIBaseOptions & From 748d01a86e3fb4e6ef4a89434e0088d5ccc64d59 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 15:33:17 +0100 Subject: [PATCH 17/19] Update docs/community-adapters/guide.md Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com> --- docs/community-adapters/guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 889e2cdb..44464d68 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -30,7 +30,7 @@ Follow the steps below to build a well-structured, type-safe adapter. Start by reviewing the [existing internal adapter implementations in the TanStack AI GitHub repository](https://github.com/tanstack/ai/tree/main/packages/typescript). These define the expected structure, conventions, and integration patterns. -For a complete, detailed reference, use the [OpenAI adapter]((https://github.com/tanstack/ai/tree/main/packages/typescript/ai-openai), which is the most fully featured implementation. +For a complete, detailed reference, use the [OpenAI adapter](https://github.com/tanstack/ai/tree/main/packages/typescript/ai-openai), which is the most fully featured implementation. ### Define model metadata From fafd507bcb6c446a53d623a3b5c8146f52c473a8 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 15:33:44 +0100 Subject: [PATCH 18/19] Update docs/community-adapters/guide.md Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com> --- docs/community-adapters/guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 44464d68..252c1d5d 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -20,7 +20,7 @@ Common use cases include: - Exposing provider-specific tools or capabilities - Connecting to non-LLM AI services (e.g. images, embeddings, video) -Community adapters are **not maintained by the core TanStack AI team**, but can be and reused across different projects. +Community adapters are **not maintained by the core TanStack AI team**, and can be reused across different projects. ## Creating a Community Adapter From 1e1f46d3538bb5692ec60ac0649d9d1c531d4c74 Mon Sep 17 00:00:00 2001 From: Alem Tuzlak Date: Wed, 24 Dec 2025 15:34:07 +0100 Subject: [PATCH 19/19] Update docs/community-adapters/guide.md Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com> --- docs/community-adapters/guide.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/community-adapters/guide.md b/docs/community-adapters/guide.md index 252c1d5d..22168564 100644 --- a/docs/community-adapters/guide.md +++ b/docs/community-adapters/guide.md @@ -32,7 +32,7 @@ Start by reviewing the [existing internal adapter implementations in the TanStac For a complete, detailed reference, use the [OpenAI adapter](https://github.com/tanstack/ai/tree/main/packages/typescript/ai-openai), which is the most fully featured implementation. 
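
+If it helps to see the overall shape before diving into the individual pieces, a community adapter package typically exposes its model metadata, capability arrays, option types, and adapter implementations from a single entry point. A minimal sketch — the `Acme*` names and file layout below are illustrative placeholders, not actual TanStack AI APIs:
+
+```typescript
+// index.ts — hypothetical entry point of a community adapter package
+export { ACME_CHAT_MODELS, ACME_IMAGE_MODELS } from './model-meta'
+export type {
+  AcmeChatModelProviderOptionsByName,
+  AcmeModelInputModalitiesByName,
+} from './text/text-provider-options'
+export { createAcmeTextAdapter } from './adapters/text'
+```
+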
-### Define model metadata +### 2. Define model metadata Model metadata describes each model’s capabilities and constraints and is used by TanStack AI for compatibility checks and feature selection.
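
+
+For orientation, a single metadata entry might look roughly like the sketch below. The exact field names are assumptions, not the actual TanStack AI types — mirror the shapes in the internal adapters' `model-meta.ts` files rather than this sketch. Only the `name` and `supports.input`/`supports.output` patterns are taken from the examples earlier in this guide:
+
+```typescript
+// model-meta.ts — hypothetical metadata entry for one chat model
+export const ACME_CHAT_V1 = {
+  name: 'acme-chat-v1',
+  supports: {
+    input: ['text', 'image'],  // accepted input modalities
+    output: ['text'],          // produced output modalities
+  },
+  features: ['streaming', 'tools', 'structuredOutput'], // assumed feature flags
+  cost: { inputPerMTok: 1.0, outputPerMTok: 3.0 },      // USD per 1M tokens, if known
+} as const
+```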