
feat: upgrade MiniMax default model to M2.7#2316

Open
octo-patch wants to merge 2 commits into arc53:main from octo-patch:feature/upgrade-minimax-m27

Conversation

@octo-patch

Summary

  • Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to the model selection list
  • Set MiniMax-M2.7 as the new default model
  • Retain all previous models (M2.5, M2.5-highspeed) as available alternatives
  • Update documentation example model reference
  • Update and add related unit tests

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities, replacing M2.5 as the recommended default.

Changes

  • application/core/model_configs.py: Add M2.7 and M2.7-highspeed models at the top of MINIMAX_MODELS list
  • docs/content/Models/cloud-providers.mdx: Update example model from M2.5 to M2.7
  • tests/llm/test_minimax_llm.py: Update existing tests to use M2.7, add 3 new tests for model list verification
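The registry change described above can be sketched as follows. This is a hypothetical illustration, not the repository's actual code: the real entry structure in `application/core/model_configs.py` may use different keys, and the assumption that the first list entry acts as the default is inferred from the PR description. The 204K context window figure comes from the earlier MiniMax provider commit.

```python
# Hypothetical sketch of the MINIMAX_MODELS change; the real dict keys
# in application/core/model_configs.py may differ.
MINIMAX_MODELS = [
    # New M2.7 entries added at the top of the list, assuming the
    # first entry is treated as the default model.
    {"id": "MiniMax-M2.7", "context_window": 204_000},
    {"id": "MiniMax-M2.7-highspeed", "context_window": 204_000},
    # Previous models retained as available alternatives.
    {"id": "MiniMax-M2.5", "context_window": 204_000},
    {"id": "MiniMax-M2.5-highspeed", "context_window": 204_000},
]

DEFAULT_MINIMAX_MODEL = MINIMAX_MODELS[0]["id"]
```

Prepending rather than replacing keeps existing user configurations that reference M2.5 working unchanged.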

Testing

  • All 13 unit tests passing (10 updated + 3 new)
  • New tests verify: M2.7 models present, M2.7 is first in list, old M2.5 models still available
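The three new checks could look roughly like this. The real tests live in `tests/llm/test_minimax_llm.py` and presumably exercise the actual registry; the inline `MINIMAX_MODELS` list here is a stand-in for illustration, and the test names are invented.

```python
import unittest

# Hypothetical stand-in for the model registry the real tests import.
MINIMAX_MODELS = [
    "MiniMax-M2.7",
    "MiniMax-M2.7-highspeed",
    "MiniMax-M2.5",
    "MiniMax-M2.5-highspeed",
]


class TestMiniMaxModelList(unittest.TestCase):
    def test_m27_models_present(self):
        self.assertIn("MiniMax-M2.7", MINIMAX_MODELS)
        self.assertIn("MiniMax-M2.7-highspeed", MINIMAX_MODELS)

    def test_m27_is_first(self):
        # Assumes the first entry acts as the default model.
        self.assertEqual(MINIMAX_MODELS[0], "MiniMax-M2.7")

    def test_m25_models_retained(self):
        self.assertIn("MiniMax-M2.5", MINIMAX_MODELS)
        self.assertIn("MiniMax-M2.5-highspeed", MINIMAX_MODELS)
```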
PR Bot added 2 commits March 19, 2026 01:26
Add MiniMax (https://www.minimaxi.com) as a native provider for both LLM inference and text-to-speech, giving users access to the MiniMax-M2.5 and MiniMax-M2.5-highspeed models (204K context window) via the OpenAI-compatible API at api.minimax.io.

Changes:

  • LLM provider (`application/llm/minimax.py`): extends OpenAILLM with the MiniMax base URL, temperature clamping to (0, 1], and response_format passthrough disabled
  • TTS provider (`application/tts/minimax_tts.py`): speech-2.8-hd model with hex-to-base64 audio decoding
  • Model registry: MiniMax-M2.5 and MiniMax-M2.5-highspeed with tool calling and image attachment support
  • Settings: MINIMAX_API_KEY environment variable with normalizer
  • Setup scripts: MiniMax option in both bash and PowerShell wizards
  • Documentation: cloud-providers.mdx table and README feature list
  • Tests: 10 LLM unit tests + 4 TTS unit tests, all passing
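Two behaviors this commit describes, temperature clamping and hex-to-base64 audio decoding, can be sketched as below. These helper names and signatures are hypothetical; the real implementations live in `application/llm/minimax.py` and `application/tts/minimax_tts.py` and may be structured differently. The lower bound of 0.01 is an assumption for keeping the value strictly above zero.

```python
import base64


def clamp_temperature(temperature: float) -> float:
    """Clamp a sampling temperature into MiniMax's accepted (0, 1] range.

    The lower bound of 0.01 is an assumed epsilon to keep the value
    strictly positive, since 0 is outside the accepted range.
    """
    return min(max(temperature, 0.01), 1.0)


def hex_audio_to_base64(hex_audio: str) -> str:
    """Convert MiniMax TTS audio from a hex string to base64.

    The TTS response carries audio as a hex-encoded string; decode it
    to raw bytes, then re-encode as base64 for downstream consumers.
    """
    raw = bytes.fromhex(hex_audio)
    return base64.b64encode(raw).decode("ascii")
```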
  • Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to the model list
  • Set MiniMax-M2.7 as the default model
  • Keep all previous models (M2.5, M2.5-highspeed) as alternatives
  • Update documentation example model reference
  • Update and add related unit tests
@vercel

vercel bot commented Mar 18, 2026

Someone is attempting to deploy a commit to the Arc53 Team on Vercel.

A member of the Team first needs to authorize it.

