Add Mistral AI provider #31
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: 0dc40ab3e4
  nvidia: 'https://integrate.api.nvidia.com/v1/chat/completions',
  github: 'https://models.inference.ai.azure.com/v1/chat/completions',
  boa: 'https://api.bayofassets.com/v1/chat/completions',
+ mistral: 'https://api.mistral.ai/v1/chat/completions',
Normalize Mistral content before returning from callLLM
Adding `mistral` to `OPENAI_COMPAT_ENDPOINTS` routes it through the generic OpenAI branch in `callLLM`, which returns `d?.choices?.[0]?.message?.content` verbatim; Mistral can return that field as an array of content parts (as already handled in `providers/mistral.ts`). In that case, downstream planner/retry paths that call `raw.trim()` will throw `raw.trim is not a function`, causing failed turns and unintended fallback behavior whenever Mistral is selected.
Useful? React with 👍 / 👎.
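A minimal sketch of the suggested normalization. The helper name and the content-part shape below are assumptions (the repo's actual types aren't shown in this excerpt); the idea is simply to coerce `message.content` to a plain string before `callLLM` returns it, so downstream `raw.trim()` calls stay safe:

```typescript
// Hypothetical helper: coerce an OpenAI-compatible message.content value
// (plain string OR array of content parts, as Mistral may return) into a
// single string. Names and the ContentPart shape are illustrative.
interface ContentPart {
  type: string;
  text?: string;
}

function normalizeContent(
  content: string | ContentPart[] | null | undefined
): string {
  if (typeof content === 'string') return content;
  if (Array.isArray(content)) {
    // Keep only textual parts; other part types contribute nothing here.
    return content
      .map((part) => (typeof part.text === 'string' ? part.text : ''))
      .join('');
  }
  return '';
}
```

With a guard like this applied in the generic OpenAI branch, a content-part array from Mistral degrades to the concatenated text rather than crashing the planner/retry path.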
  nvidia: 'https://integrate.api.nvidia.com/v1/chat/completions',
  github: 'https://models.inference.ai.azure.com/chat/completions',
  boa: 'https://api.bayofassets.com/v1/chat/completions',
+ mistral: 'https://api.mistral.ai/v1/chat/completions',
Parse Mistral content parts in provider-racing helper
By adding `mistral` to `COMPAT_ENDPOINTS`, `fetchProviderResponse` now returns Mistral responses through the same unnormalized `message.content` path used for string-only providers. If Mistral returns content-part arrays, `raceProviders` calls `winner.text.trim()` / `result.text.trim()` and treats the call as failed, so pinned or raced Mistral providers can be skipped even when the API returned a valid answer.
Useful? React with 👍 / 👎.
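The same normalization applies in the racing path. A sketch, assuming the function name and content-part shape (neither appears verbatim in this excerpt): extract a trimmed string whether the provider returned plain text or an array of parts, so a valid Mistral answer is never misread as empty/failed by `raceProviders`:

```typescript
// Hypothetical extraction for the provider-racing helper: produce the
// trimmed text regardless of whether message.content was a string or an
// array of content parts. extractText and ContentPart are illustrative.
interface ContentPart {
  type: string;
  text?: string;
}

function extractText(
  content: string | ContentPart[] | null | undefined
): string {
  if (typeof content === 'string') return content.trim();
  if (Array.isArray(content)) {
    return content
      .map((part) => (typeof part.text === 'string' ? part.text : ''))
      .join('')
      .trim();
  }
  return '';
}
```

If `fetchProviderResponse` sets `text` via something like this, the existing `winner.text.trim()` checks keep working unchanged for both string and content-part responses.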
Summary
- Adds `https://api.mistral.ai/v1/chat/completions` to the provider add, API onboarding, provider validation, and streaming endpoint maps
- Adds `MISTRAL_API_KEY` to `.env.example` and a disabled default `mistral` config entry using `mistral-large-latest`

Notes
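For illustration, the disabled default entry described above might look roughly like this. The key names and config shape are assumptions (the repo's actual config schema is not shown in this PR excerpt); only the model, endpoint, env var, and disabled-by-default behavior come from the summary:

```typescript
// Hypothetical shape of the disabled default Mistral config entry.
// Field names are illustrative, not the repo's actual schema.
const mistralConfig = {
  provider: 'mistral',
  model: 'mistral-large-latest',
  endpoint: 'https://api.mistral.ai/v1/chat/completions',
  apiKeyEnv: 'MISTRAL_API_KEY', // read from .env, per .env.example
  enabled: false, // shipped disabled until a key is configured
};
```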
Validation
- `npx tsc --noEmit --target ES2020 --module commonjs --moduleResolution node --esModuleInterop --allowSyntheticDefaultImports --skipLibCheck --strict false --lib ES2020,DOM providers/mistral.ts`
- `npx esbuild providers/mistral.ts --bundle --platform=node --target=node18 --outfile=/tmp/aiden-mistral-provider.js`
- `git diff --check`

Existing local build blockers observed
- `npm ci` currently fails before installing because `package.json` and `package-lock.json` are out of sync (`aiden-os@3.17.0` missing from the lockfile).
- `npm run build` stops on an existing minimatch import/type issue in `core/toolRegistry.ts`.
- `npm run build:api` also stops on unresolved `@aws-sdk/client-s3` from `unzipper` in this local install.

Closes #25