feat: add MiniMax as built-in LLM provider (M2.7 default)#1374

Open
octo-patch wants to merge 2 commits into simonw:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch octo-patch commented Mar 17, 2026

Summary

  • Add MiniMax as a built-in LLM provider with M2.7 as the default model
  • Register MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, and MiniMax-M2.5-highspeed models
  • Implement MiniMaxChat/MiniMaxAsyncChat classes with temperature clamping and JSON mode support
  • Aliases: minimax → M2.7, minimax-fast → M2.7-highspeed
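The temperature clamping mentioned above can be sketched as follows. This is an illustrative stand-in, not the PR's actual code: the function name `clamp_temperature` and the lower bound `0.01` are assumptions based on the commit message's "(0, 1]" constraint.

```python
# Hypothetical sketch of clamping a requested temperature into MiniMax's
# accepted (0, 1] range. MIN_TEMPERATURE is an assumed stand-in for whatever
# small positive floor the real plugin uses, since 0 itself is excluded.
MIN_TEMPERATURE = 0.01
MAX_TEMPERATURE = 1.0

def clamp_temperature(value):
    """Clamp a temperature into (0, 1]; pass None through untouched."""
    if value is None:
        return None  # let the API apply its own default
    return max(MIN_TEMPERATURE, min(MAX_TEMPERATURE, value))

print(clamp_temperature(0))    # 0.01 (0 is outside the open lower bound)
print(clamp_temperature(2.0))  # 1.0
print(clamp_temperature(0.7))  # 0.7
```

Clamping rather than rejecting keeps `llm -o temperature 0` usable instead of surfacing an API error to the user.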

Changes

  • llm/default_plugins/minimax_models.py — New MiniMax provider plugin
  • llm/plugins.py — Register MiniMax as default plugin
  • tests/test_minimax_models.py — 31 unit tests + 3 integration tests
  • README.md — Add MiniMax usage examples
  • pyproject.toml — Update description
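The alias scheme described in the summary (minimax → M2.7, minimax-fast → M2.7-highspeed) can be modeled with a small self-contained sketch. The real plugin registers models through llm's plugin hooks; the `ModelRegistry` class here is a hypothetical stand-in used only to show how aliases resolve to canonical model IDs.

```python
# Self-contained sketch of alias resolution; the class and method names are
# illustrative, not the llm library's actual registration API.
class ModelRegistry:
    def __init__(self):
        self._models = {}

    def register(self, model_id, aliases=()):
        """Map the canonical ID and every alias to the canonical ID."""
        self._models[model_id] = model_id
        for alias in aliases:
            self._models[alias] = model_id

    def resolve(self, name):
        return self._models[name]

registry = ModelRegistry()
registry.register("MiniMax-M2.7", aliases=("minimax", "m2.7"))
registry.register("MiniMax-M2.7-highspeed", aliases=("minimax-fast",))
registry.register("MiniMax-M2.5", aliases=("m2.5",))
registry.register("MiniMax-M2.5-highspeed", aliases=("m2.5-highspeed",))

print(registry.resolve("minimax"))       # MiniMax-M2.7
print(registry.resolve("minimax-fast"))  # MiniMax-M2.7-highspeed
```

Pointing the short `minimax` alias at the newest flagship means existing invocations pick up M2.7 automatically, while the dated IDs stay available for pinning.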

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities. Adding it as a built-in provider makes it easy for users to access MiniMax models without installing plugins.

Testing

  • All 31 unit tests pass
  • Integration tested against the live MiniMax API

PR Bot and others added 2 commits March 17, 2026 19:09

Register MiniMax-M2.5 and MiniMax-M2.5-highspeed as default models
with OpenAI-compatible API integration. Includes temperature clamping
(0, 1] per MiniMax API constraints, model aliases (minimax, m2.5,
minimax-fast, m2.5-highspeed), and 30 tests (27 unit + 3 integration).

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to model list
- Set MiniMax-M2.7 as default model (aliases: minimax, m2.7)
- Keep all previous models (M2.5, M2.5-highspeed) as alternatives
- Update related tests and README examples
@octo-patch octo-patch changed the title Add MiniMax as built-in LLM provider feat: add MiniMax as built-in LLM provider (M2.7 default) Mar 18, 2026