docs: add community plugins docs #181
Conversation
Walkthrough
Added a new "Community Adapters Guide" documentation page and updated docs navigation to include a "Community Adapters" section linking to that guide. No executable code or public API changes.
Estimated code review effort: 2 (Simple) | ~10 minutes
Pre-merge checks and finishing touches: Failed checks (2 warnings), Passed checks (1 passed)
Finishing touches: Generate unit tests (beta)
Recent review details: Configuration used: defaults. Review profile: CHILL. Plan: Pro.
Files selected for processing (2)
Files skipped from review as they are similar to previous changes (1)
Additional comments (1)
Actionable comments posted: 1
Nitpick comments (5)
docs/community-adapters/guide.md (5)
24-47: Clarify that model names are illustrative examples.
The code examples reference models like `GPT5_2`, `SORA2`, and other variants that don't currently exist. Consider adding a note that these are illustrative placeholders, and adapter creators should replace them with actual model names from their target service.
Suggested clarification: add a note before or after the code block:

````diff
 3. **Define the model per functionality arrays**: After you define the model metadata, you need to implement arrays for different functionalities the model supports. Generally you want to do something like this:
+
+> **Note**: The model names in these examples (GPT5_2, SORA2, etc.) are illustrative. Replace them with the actual model identifiers from your target service.
+
 ```typescript
````
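To make that suggestion concrete, a minimal sketch of per-functionality model arrays might look like the following. All model names here (such as `acme-text-1`) are invented placeholders, not real TanStack AI or provider identifiers:

```typescript
// Hypothetical per-functionality model arrays; replace the placeholder
// names with your provider's real model identifiers.
const TEXT_MODELS = ["acme-text-1", "acme-text-1-mini"] as const;
const IMAGE_MODELS = ["acme-image-1"] as const;

// Derive union types so model names are checked at compile time.
type TextModel = (typeof TEXT_MODELS)[number];
type ImageModel = (typeof IMAGE_MODELS)[number];

// Runtime guard: narrows an arbitrary string to a known text model.
function isTextModel(model: string): model is TextModel {
  return (TEXT_MODELS as readonly string[]).includes(model);
}
```

The `as const` arrays double as both runtime lists and compile-time unions, which is one way to keep the illustrative names in sync between docs examples and type checks.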
48-77: Consider varying the instructional phrasing.
Steps 4, 5, and 6 all follow the same pattern: "After you define the model metadata, you need to implement..." followed by "Generally you want to do something like this." Varying this language would improve readability and engagement.
Example variations:
4. **Define the model provider options**: Configure the options each model accepts:
5. **Specify input modalities**: Map which input types (text, images, audio, etc.) each model supports:
6. **Configure model-specific options**: Set up the configuration interface for your models:
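As an illustration of the input-modalities step, a modality map could be sketched like this. The model names and the `Modality` union are hypothetical placeholders, not the actual TanStack AI types:

```typescript
// Hypothetical modality map; model names and modality strings are
// illustrative placeholders, not real TanStack AI identifiers.
type Modality = "text" | "image" | "audio";

const MODEL_INPUT_MODALITIES: Record<string, readonly Modality[]> = {
  "acme-chat-1": ["text", "image"],
  "acme-transcribe-1": ["audio"],
};

// Checks whether a model accepts a given input modality.
function acceptsInput(model: string, modality: Modality): boolean {
  return MODEL_INPUT_MODALITIES[model]?.includes(modality) ?? false;
}
```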
115-120: Consider adding packaging and distribution best practices.The guide covers implementation details but doesn't mention packaging considerations like module exports, tree-shaking support, or package structure. Adding a section on these topics would help community adapter creators follow the established patterns in official adapters.
Based on learnings, official adapters use tree-shakeable exports with clear subpath patterns.
Suggested addition
Consider adding a step between 7 and 8:
8. **Configure your package for optimal distribution**:
   - Use clear subpath exports in `package.json` (e.g., `your-package/adapters/text`)
   - Enable tree-shaking by exporting individual adapter modules separately
   - Follow TypeScript best practices for type exports
   - Example package.json exports:

   ```json
   {
     "exports": {
       "./adapters/text": "./dist/adapters/text.js",
       "./adapters/image": "./dist/adapters/image.js"
     }
   }
   ```

117-120: **Clarify the PR submission process.**
Steps 8-9 could be clearer about what community contributors are submitting to the TanStack AI repository. Since community adapters are separate npm packages, clarify that the PR to TanStack AI should only include documentation, not the adapter implementation code.
Suggested clarification:

```diff
-8. **Publish your package and open up a PR**: Once your adapter is complete and tested, you can publish it as an npm package and open up a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls) to have it listed in the community adapters section of the documentation. Add it into [here](https://github.com/TanStack/ai/tree/main/docs/community-adapters).
+8. **Publish your package and open up a PR**: Once your adapter is complete and tested:
+   - Publish it as an npm package to your own registry
+   - Create documentation for your adapter (following the structure of other adapter docs)
+   - Open a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls) that adds only your documentation file to the [community adapters docs folder](https://github.com/TanStack/ai/tree/main/docs/community-adapters)
+   - The adapter code itself remains in your own repository/package
```
115-115: Consider expanding on implementing different adapter types.The guide mentions "text adapter, chat adapter, image adapter, etc" but only links to the text adapter implementation. While the OpenAI adapter serves as a reference, briefly mentioning what differentiates these adapter types could help guide creators.
Example addition

```diff
 7. **Implement the model logic**: Finally, you need to implement the actual logic for your adapter. This includes handling requests to the external service, processing responses, and integrating with TanStack AI's APIs. You can refer to the [OpenAI adapter implementation](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/adapters/text.ts) for a detailed example. What's important here is to make sure you properly type the inputs and outputs based on the model options you defined earlier. Every functionality is split per adapter, so make sure to implement the text adapter, chat adapter, image adapter, etc as needed.
+
+   Different adapter types handle different capabilities:
+   - **Text/Chat adapters**: Handle text generation and conversation
+   - **Image adapters**: Handle image generation from prompts
+   - **Embedding adapters**: Convert text to vector embeddings
+   - **Audio adapters**: Handle transcription and text-to-speech
+
+   Implement only the adapter types your service supports.
```
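The per-capability split can be sketched with invented, simplified interfaces. These are not the actual TanStack AI adapter types, only an illustration of why each capability is implemented separately:

```typescript
// Invented, simplified capability interfaces -- not the actual
// TanStack AI types, just an illustration of the per-capability split.
interface TextAdapter {
  generateText(prompt: string): Promise<string>;
}

interface ImageAdapter {
  generateImage(prompt: string): Promise<Uint8Array>;
}

// A provider implements only the capabilities its service supports;
// a text-only service would implement TextAdapter and skip ImageAdapter.
const textAdapter: TextAdapter = {
  async generateText(prompt) {
    // A real adapter would call the provider's HTTP API here.
    return `stub response for: ${prompt}`;
  },
};
```

Keeping each capability behind its own interface is also what makes the separate-export, tree-shakeable packaging discussed later possible.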
Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Files selected for processing (2)
docs/community-adapters/guide.md
docs/config.json
Additional context used
Learnings (2)
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/index.ts : Export tree-shakeable adapters with clear subpath exports in package.json (e.g., `tanstack/ai/adapters`, `tanstack/ai-openai/adapters`) to minimize bundle size
Applied to files:
docs/community-adapters/guide.md
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/adapters/*.ts : Create individual adapter implementations for each provider capability (text, embed, summarize, image) with separate exports to enable tree-shaking
Applied to files:
docs/community-adapters/guide.md
LanguageTool
docs/community-adapters/guide.md
[style] ~48-~48: You have already used this phrasing in nearby sentences. Consider replacing it to add variety to your writing.
Context: ...tions the model supports. Generally you want to do something like this: ```typescript e...
(REP_WANT_TO_VB)
[style] ~67-~67: You have already used this phrasing in nearby sentences. Consider replacing it to add variety to your writing.
Context: ...fter you define the model metadata, you need to implement the input modalities the mode...
(REP_NEED_TO_VB)
[style] ~67-~67: You have already used "Generally" in other nearby sentences. Consider replacing or removing it to add variety to your writing.
Context: ...he input modalities the model supports. Generally you want to do something like this: ```...
(REP_SENT_START_ADVERBLY)
[style] ~67-~67: You have already used this phrasing in nearby sentences. Consider replacing it to add variety to your writing.
Context: ...ities the model supports. Generally you want to do something like this: ```typescript e...
(REP_WANT_TO_VB)
[style] ~77-~77: You have already used this phrasing in nearby sentences. Consider replacing it to add variety to your writing.
Context: ...fter you define the model metadata, you need to implement the model options the model s...
(REP_NEED_TO_VB)
[style] ~77-~77: You have already used "Generally" in other nearby sentences. Consider replacing or removing it to add variety to your writing.
Context: ...t the model options the model supports. Generally you want to do something like this (you...
(REP_SENT_START_ADVERBLY)
[style] ~77-~77: You have already used this phrasing in nearby sentences. Consider replacing it to add variety to your writing.
Context: ...tions the model supports. Generally you want to do something like this (you can see an ...
(REP_WANT_TO_VB)
markdownlint-cli2 (0.18.1)
docs/community-adapters/guide.md
77-77: Link text should be descriptive
(MD059, descriptive-link-text)
117-117: Link text should be descriptive
(MD059, descriptive-link-text)
Additional comments (2)
docs/config.json (1)
145-153: LGTM!The navigation entry is well-structured and appropriately positioned after the Adapters section, providing clear access to the community adapters documentation.
docs/community-adapters/guide.md (1)
1-16: LGTM!The introduction clearly defines community adapters and appropriately sets expectations that these are community-maintained extensions.
docs/community-adapters/guide.md
Outdated
    // ... repeat for each model
    }
    ```
    6. **Define your model options**: After you define the model metadata, you need to implement the model options the model supports. Generally you want to do something like this (you can see an example for OpenAI models [here](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/text/text-provider-options.ts)):
Replace non-descriptive link text for better accessibility.
Using "here" as link text is not descriptive and creates accessibility issues. Links should describe their destination.
Suggested improvements

Line 77:

```diff
-Generally you want to do something like this (you can see an example for OpenAI models [here](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/text/text-provider-options.ts)):
+Generally you want to do something like this (see the [OpenAI text provider options implementation](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/text/text-provider-options.ts)):
```

Line 117:

```diff
-8. **Publish your package and open up a PR**: Once your adapter is complete and tested, you can publish it as an npm package and open up a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls) to have it listed in the community adapters section of the documentation. Add it into [here](https://github.com/TanStack/ai/tree/main/docs/community-adapters).
+8. **Publish your package and open up a PR**: Once your adapter is complete and tested, you can publish it as an npm package and open up a PR to the [TanStack AI repository](https://github.com/TanStack/ai/pulls) to have it listed in the community adapters section of the documentation. Add your documentation file to the [community adapters docs folder](https://github.com/TanStack/ai/tree/main/docs/community-adapters).
```

Also applies to: 117-117
Tools
LanguageTool
[style] ~77-~77: You have already used this phrasing in nearby sentences. Consider replacing it to add variety to your writing.
Context: ...fter you define the model metadata, you need to implement the model options the model s...
(REP_NEED_TO_VB)
[style] ~77-~77: You have already used "Generally" in other nearby sentences. Consider replacing or removing it to add variety to your writing.
Context: ...t the model options the model supports. Generally you want to do something like this (you...
(REP_SENT_START_ADVERBLY)
[style] ~77-~77: You have already used this phrasing in nearby sentences. Consider replacing it to add variety to your writing.
Context: ...tions the model supports. Generally you want to do something like this (you can see an ...
(REP_WANT_TO_VB)
markdownlint-cli2 (0.18.1)
77-77: Link text should be descriptive
(MD059, descriptive-link-text)
Prompt for AI Agents
In docs/community-adapters/guide.md around lines 77 and 117, replace
non-descriptive link text "here" with descriptive, accessible link text (e.g.,
"OpenAI model options example" or "example of OpenAI text provider options") so
the link describes its destination; update both occurrences to use clear,
specific phrases that convey the target and keep the existing URL unchanged.
Co-authored-by: Sarah <hello@sarahgerrard.me>
Actionable comments posted: 3
Nitpick comments (1)
docs/community-adapters/guide.md (1)
166-184: Consider enhancing adapter implementation guidance with tree-shakeable architecture emphasis.The implementation section provides good guidance on adapter logic, but it could be strengthened by explicitly mentioning the tree-shakeable architecture pattern. Based on the project's architecture learnings, adapters should be exported with separate imports for each capability (text, image, embeddings, etc.) to enable tree-shaking and reduce bundle size.
Consider adding guidance like:
Suggested enhancement

```diff
 Adapters are implemented per capability, so only implement what your provider supports:
 - Text adapter
 - Chat adapter
 - Image adapter
 - Embeddings adapter
 - Video adapter
+
+**Each adapter should be exported as a separate subpath import** (e.g., `@myorg/ai-provider/adapters/text`, `@myorg/ai-provider/adapters/image`) to enable tree-shaking and allow users to bundle only the adapters they need.
+
 Refer to the [OpenAI adapter](https://github.com/TanStack/ai/blob/main/packages/typescript/ai-openai/src/adapters/text.ts) for a complete, end-to-end implementation example.
```

Based on learnings, this enhancement would align the guide more closely with TanStack AI's architecture patterns for tree-shakeable adapter design.
Review details
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Files selected for processing (1)
docs/community-adapters/guide.md
Additional context used
Learnings (8)
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/adapters/*.ts : Create individual adapter implementations for each provider capability (text, embed, summarize, image) with separate exports to enable tree-shaking
Applied to files:
docs/community-adapters/guide.md
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Ensure provider-agnostic design where the core AI SDK remains independent of any specific AI provider implementation
Applied to files:
docs/community-adapters/guide.md
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to **/*.{ts,tsx} : Use tree-shakeable adapter architecture for provider implementations - export specialized adapters (text, embedding, summarize, image) as separate imports from `/adapters` subpath rather than monolithic adapters
Applied to files:
docs/community-adapters/guide.md
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Implement framework integrations using the headless `tanstack/ai-client` for state management with framework-specific hooks (useChat) on top
Applied to files:
docs/community-adapters/guide.md
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/index.ts : Export tree-shakeable adapters with clear subpath exports in package.json (e.g., `tanstack/ai/adapters`, `tanstack/ai-openai/adapters`) to minimize bundle size
Applied to files:
docs/community-adapters/guide.md
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to packages/typescript/*/src/model-meta.ts : Maintain model metadata files that define provider options and capabilities per model for per-model type safety
Applied to files:
docs/community-adapters/guide.md
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Applies to **/*.{ts,tsx} : Use type-safe per-model configuration with provider options typed based on selected model to ensure compile-time safety
Applied to files:
docs/community-adapters/guide.md
Learning: 2025-12-13T17:09:09.794Z
Learnt from: CR
Repo: TanStack/ai PR: 0
File: CLAUDE.md:0-0
Timestamp: 2025-12-13T17:09:09.794Z
Learning: Maintain type safety through multimodal content support (image, audio, video, document) with model capability awareness
Applied to files:
docs/community-adapters/guide.md
markdownlint-cli2 (0.18.1)
docs/community-adapters/guide.md
33-33: Bare URL used
(MD034, no-bare-urls)
Additional comments (1)
docs/community-adapters/guide.md (1)
1-5: Documentation structure and accessibility are generally well done.The guide provides a comprehensive, step-by-step approach to creating community adapters. The content aligns well with TanStack AI's architecture patterns, code examples are concrete, and links to reference implementations (OpenAI adapter) are helpful. After addressing the critical link syntax error on line 33, the grammar issue on line 23, and the missing section number on line 35, this documentation will be ready for publication.
|
@LadyBluenotes thank you Sarah, implemented all the changes |
|
@AlemTuzlak CodeRabbit picked up on two errors I made (a section number and a duplicate "and"). If you could address those, I can approve so we can merge this :)
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
Changes
Checklist
`pnpm run test:pr`.
Release Impact
Summary by CodeRabbit
Tip: You can customize this high-level summary in your review settings.