feat: add MiniMax provider support (chat + TTS)#34

Open
octo-patch wants to merge 2 commits into calesthio:main from octo-patch:feature/add-minimax-provider

Conversation


@octo-patch octo-patch commented Apr 17, 2026

Summary

  • MiniMax TTS (tools/audio/minimax_tts.py): cost-effective text-to-speech using speech-2.8-hd / speech-2.8-turbo models, 6 English voice IDs, hex-encoded audio via api.minimax.io/v1/t2a_v2
  • MiniMax Chat (lib/providers/minimax.py): OpenAI-compatible LLM provider for MiniMax-M2.7 and MiniMax-M2.7-highspeed, activated by setting llm.provider: minimax in config.yaml
  • Provider factory (lib/providers/__init__.py): build_provider(llm_config) routes to the correct backend; lib/providers/base.py defines the shared interface
  • 25 unit tests for the chat provider + 21 unit tests for the TTS provider — all mocked, no real API keys required
  • Docs updated in docs/PROVIDERS.md and .env.example
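Per the summary above, the chat provider is activated through `config.yaml`. A minimal sketch of what that fragment might look like (only the `llm.provider: minimax` key is confirmed by this PR; the surrounding layout and the `model` key are assumptions):

```yaml
# Hypothetical config.yaml fragment; only llm.provider is confirmed by the PR.
llm:
  provider: minimax
  model: MiniMax-M2.7
```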

Both tools share a single MINIMAX_API_KEY. Temperature is clamped to the range (0.0, 1.0], since MiniMax rejects temperature=0. The base URL is always https://api.minimax.io (not api.minimax.chat).
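The temperature clamp described above could be sketched like this (a hypothetical helper, not the PR's actual code; the `0.01` floor is an assumed minimum):

```python
def clamp_temperature(value: float, minimum: float = 0.01) -> float:
    """Clamp a requested temperature into MiniMax's (0.0, 1.0] range.

    MiniMax rejects temperature=0, so values at or below zero are raised
    to a small positive floor, and values above 1.0 are capped at 1.0.
    The 0.01 floor is an illustrative choice, not taken from this PR.
    """
    return max(minimum, min(value, 1.0))
```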


Test plan

  • pytest tests/tools/test_minimax_tts.py — 21 passed
  • pytest tests/tools/test_minimax_provider.py — 25 passed
  • ✅ Full suite pytest tests/contracts/ tests/tools/ — 303 passed, 3 pre-existing PyTorch failures unrelated to this PR

- Add MiniMax TTS provider (tools/audio/minimax_tts.py) using
  the T2A v2 API (https://api.minimax.io/v1/t2a_v2)
- Supports models speech-2.8-hd (default) and speech-2.8-turbo
- Auto-discovered by the TTS selector — no manual registration needed
- Requires MINIMAX_API_KEY environment variable
- Add 21 unit tests (tests/tools/test_minimax_tts.py)
- Document MINIMAX_API_KEY in .env.example and docs/PROVIDERS.md
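Since the T2A v2 endpoint returns hex-encoded audio (per the summary above), decoding might look like the following sketch. The response field names (`"data"`, `"audio"`) are assumptions for illustration, not confirmed against the API:

```python
import binascii


def decode_hex_audio(response_json: dict) -> bytes:
    """Recover raw audio bytes from a hex-encoded T2A v2 response.

    The "data"/"audio" field names are hypothetical; consult the
    MiniMax T2A v2 API reference for the actual response schema.
    """
    hex_audio = response_json["data"]["audio"]
    return binascii.unhexlify(hex_audio)
```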
@octo-patch octo-patch requested a review from calesthio as a code owner April 17, 2026 11:21
- Add lib/providers/base.py with BaseLLMProvider abstract base class
- Add lib/providers/minimax.py with MiniMaxProvider (OpenAI-compatible)
  - Supports MiniMax-M2.7 and MiniMax-M2.7-highspeed models
  - Default temperature 1.0 (MiniMax range: (0.0, 1.0])
  - Uses https://api.minimax.io/v1/chat/completions endpoint
- Add lib/providers/__init__.py with build_provider() factory function
  - Instantiates the correct provider from LLMConfig.provider field
- Add 25 unit tests in tests/tools/test_minimax_provider.py
- Update docs/PROVIDERS.md with MiniMax Chat section
- Update .env.example comment to reflect chat + TTS usage

API reference: https://platform.minimax.io/docs/api-reference/text-openai-api
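The `build_provider()` factory described above could look roughly like this. The class bodies and the `LLMConfig` shape are assumptions based on the PR summary, not the actual implementation:

```python
from dataclasses import dataclass


@dataclass
class LLMConfig:
    """Hypothetical config shape; only the .provider field is confirmed."""
    provider: str
    model: str


class BaseLLMProvider:
    """Stand-in for lib/providers/base.py's shared interface."""
    def __init__(self, config: LLMConfig) -> None:
        self.config = config


class MiniMaxProvider(BaseLLMProvider):
    """Stand-in for the OpenAI-compatible MiniMax chat provider."""


_PROVIDERS = {"minimax": MiniMaxProvider}


def build_provider(llm_config: LLMConfig) -> BaseLLMProvider:
    """Route LLMConfig.provider to the matching backend class."""
    try:
        cls = _PROVIDERS[llm_config.provider]
    except KeyError:
        raise ValueError(f"Unknown LLM provider: {llm_config.provider!r}") from None
    return cls(llm_config)
```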
@octo-patch octo-patch changed the title feat: add MiniMax TTS provider feat: add MiniMax provider support (chat + TTS) Apr 17, 2026