
Add MiniMax as first-class chat model provider#1529

Open
octo-patch wants to merge 1 commit into microsoft:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add MiniMaxChatTarget, a new prompt target for MiniMax AI chat models via their OpenAI-compatible API
  • Support the MiniMax-M2.7, M2.7-highspeed, M2.5, and M2.5-highspeed models (up to 1M token context)
  • Include temperature clamping to [0, 1], <think> tag stripping for reasoning models, JSON mode, and token usage tracking
  • Add 40 unit tests + 3 integration tests
  • Add environment variable configuration to .env_example and documentation to the prompt targets overview

Details

New Files

  • pyrit/prompt_target/minimax/__init__.py - Package init
  • pyrit/prompt_target/minimax/minimax_chat_target.py - Main implementation (~250 lines)
  • tests/unit/target/test_minimax_chat_target.py - 40 unit tests
  • tests/integration/targets/test_minimax_chat_target_integration.py - 3 integration tests

Modified Files

  • pyrit/prompt_target/__init__.py - Register MiniMaxChatTarget
  • .env_example - Add MINIMAX_API_KEY, MINIMAX_CHAT_ENDPOINT, MINIMAX_CHAT_MODEL
  • doc/code/targets/0_prompt_targets.md - Add MiniMax to targets table
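
The .env_example additions might look like the following sketch. The values are illustrative placeholders; the endpoint and model names are taken from the defaults noted in the implementation approach:

```shell
# MiniMax configuration (illustrative values)
MINIMAX_API_KEY="your-minimax-api-key"             # required
MINIMAX_CHAT_ENDPOINT="https://api.minimax.io/v1"  # optional, defaults to this endpoint
MINIMAX_CHAT_MODEL="MiniMax-M2.7"                  # optional, pick any supported model
```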

Implementation Approach

MiniMaxChatTarget extends PromptChatTarget directly (rather than OpenAITarget) to provide a clean, focused implementation:

  • Uses AsyncOpenAI client pointing at https://api.minimax.io/v1 (OpenAI-compatible)
  • Temperature clamping: MiniMax accepts a [0, 1] range, whereas OpenAI accepts [0, 2]; values above 1.0 are clamped to 1.0 with a warning
  • Think tag stripping: MiniMax M2.5+ models may include <think>...</think> reasoning tags in responses; these are automatically stripped
  • JSON mode: Supports the json_object response format and falls back to it from json_schema, since MiniMax does not support structured output schemas
  • Environment variables: MINIMAX_API_KEY (required), MINIMAX_CHAT_ENDPOINT (optional), MINIMAX_CHAT_MODEL (optional)
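
To illustrate the temperature-clamping and think-tag-stripping behaviors described above, here is a minimal sketch. The helper names and exact warning text are hypothetical; the actual methods inside MiniMaxChatTarget may differ:

```python
import re
import warnings

# MiniMax's documented temperature range is [0, 1] (vs. OpenAI's [0, 2]).
MINIMAX_MAX_TEMPERATURE = 1.0

def clamp_temperature(temperature: float) -> float:
    """Clamp a requested temperature into MiniMax's [0, 1] range, warning on overflow."""
    if temperature > MINIMAX_MAX_TEMPERATURE:
        warnings.warn(
            f"Temperature {temperature} exceeds MiniMax's maximum of 1.0; clamping."
        )
        return MINIMAX_MAX_TEMPERATURE
    return max(0.0, temperature)

# M2.5+ reasoning models may prepend <think>...</think> blocks to responses.
_THINK_TAG_PATTERN = re.compile(r"<think>.*?</think>\s*", re.DOTALL)

def strip_think_tags(text: str) -> str:
    """Remove <think>...</think> reasoning blocks from a model response."""
    return _THINK_TAG_PATTERN.sub("", text)
```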

Usage Example

from pyrit.prompt_target import MiniMaxChatTarget

target = MiniMaxChatTarget(
    api_key="your-minimax-api-key",
    model_name="MiniMax-M2.7",
    temperature=0.7,
)
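
The json_schema-to-json_object fallback mentioned in the implementation approach could be sketched roughly as below. This is an illustrative helper, not the PR's actual code; the function name and dict shapes are assumptions modeled on the OpenAI response_format convention:

```python
from typing import Any, Optional

def adapt_response_format(response_format: Optional[dict[str, Any]]) -> Optional[dict[str, Any]]:
    """Downgrade structured-output requests for MiniMax.

    MiniMax supports plain JSON mode (json_object) but not structured
    output schemas (json_schema), so schema requests fall back to
    json_object; anything else passes through unchanged.
    """
    if response_format is None:
        return None
    if response_format.get("type") == "json_schema":
        # Schema is dropped; MiniMax will return free-form JSON instead.
        return {"type": "json_object"}
    return response_format
```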

Test Plan

  • 40 unit tests covering initialization, validation, request building, message construction, error handling, and token usage
  • 3 integration tests covering basic completion, multi-turn conversation, and JSON mode (these require MINIMAX_API_KEY to be set)
  • All unit tests pass locally

Add MiniMaxChatTarget for MiniMax AI's OpenAI-compatible chat completions API,
supporting M2.7, M2.7-highspeed, M2.5, and M2.5-highspeed models with
temperature clamping [0,1], thinking tag stripping, JSON mode, and token usage
tracking.

- New prompt target: pyrit/prompt_target/minimax/ (MiniMaxChatTarget)
- 40 unit tests + 3 integration tests
- Environment config in .env_example
- Documentation in prompt targets overview
