Commit graph

2 commits

Author: Octopus
SHA1: 87ff7f2fcf
Date: 2026-03-18 05:05:56 -05:00
Message: feat: upgrade MiniMax default model to M2.7

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to the model list
- Set MiniMax-M2.7 as the default model
- Keep all previous models as alternatives
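The model-list change described in this commit can be sketched as a minimal config fragment. The variable names and list structure below are illustrative assumptions, not the repository's actual code; only the model names come from the commit message:

```python
# Hypothetical model registry reflecting the change the commit describes.
# Model names mirror the commit message; the structure is an assumption.
MODELS = [
    "MiniMax-M2.7",            # newly added, now the default
    "MiniMax-M2.7-highspeed",  # newly added
    "MiniMax-M2.5",            # previous default, kept as an alternative
    "MiniMax-M2.5-highspeed",  # kept as an alternative
]
DEFAULT_MODEL = MODELS[0]  # MiniMax-M2.7
```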
Author: PR Bot
SHA1: 034fe4a2af
Date: 2026-03-16 00:56:05 +08:00
Message: Add MiniMax to LLM Providers guardrails and fix patterns

Add MiniMax (MiniMax-M2.5, MiniMax-M2.5-highspeed) as a supported LLM
provider in the WFGY troubleshooting ecosystem. MiniMax offers an
OpenAI-compatible API with a 204K context window, making it relevant for
RAG and agent workflows.

Changes:
- New minimax.md with provider-specific guardrails, fix patterns, and
  known quirks (temperature > 0 constraint, long-context drift, Chinese
  tokenizer considerations, OpenAI SDK base_url configuration)
- Updated LLM_Providers/README.md orientation table and keywords
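The base_url configuration and the temperature > 0 quirk mentioned in this commit can be sketched as follows. The endpoint URL is a placeholder, and the clamping value is an assumption for illustration, not MiniMax's documented behavior:

```python
# Sketch: building an OpenAI-compatible chat-completion request for a
# MiniMax-style endpoint. BASE_URL is a hypothetical placeholder.
BASE_URL = "https://api.minimax.example/v1"  # assumption, not the real endpoint

def build_chat_request(model, messages, temperature=0.7):
    """Return a request payload for an OpenAI-compatible /chat/completions call.

    The provider reportedly rejects temperature == 0, so we clamp to a
    small positive value (0.01 is an illustrative choice).
    """
    if temperature <= 0:
        temperature = 0.01
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
    }

req = build_chat_request(
    "MiniMax-M2.5",
    [{"role": "user", "content": "Hello"}],
    temperature=0.0,  # would be rejected; clamped to 0.01 by the sketch
)
```

With the OpenAI SDK, the same idea would be pointing the client at the provider via its `base_url` constructor argument and passing the clamped temperature through.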