Add MiniMax (MiniMax-M2.5, MiniMax-M2.5-highspeed) as a supported LLM
provider in the WFGY troubleshooting ecosystem. MiniMax exposes an
OpenAI-compatible API with a 204K-token context window, which makes it
relevant for RAG and agent workflows.

Changes:
- New minimax.md with provider-specific guardrails, fix patterns, and
known quirks (temperature > 0 constraint, long-context drift, Chinese
tokenizer considerations, OpenAI SDK base_url configuration)
- Updated LLM_Providers/README.md orientation table and keywords
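The base_url configuration and temperature constraint called out above can be sketched as follows. This is a hedged illustration, not verified provider code: the endpoint URL, model name, and default temperature are assumptions to be replaced with values from MiniMax's own documentation.

```python
import json

# Assumed endpoint for MiniMax's OpenAI-compatible API; confirm
# the real value in the MiniMax docs before use.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"

def build_chat_request(prompt: str, model: str = "MiniMax-M2.5",
                       temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat.completions payload.

    MiniMax rejects temperature == 0 (the "temperature > 0"
    constraint noted above), so guard against it here.
    """
    if temperature <= 0:
        raise ValueError("MiniMax requires temperature > 0")
    return {
        "model": model,
        "temperature": temperature,
        "messages": [{"role": "user", "content": prompt}],
    }

# With the official OpenAI SDK, the same payload would be sent via:
#   client = OpenAI(base_url=MINIMAX_BASE_URL, api_key="...")
#   client.chat.completions.create(**build_chat_request("ping"))
print(json.dumps(build_chat_request("ping"), indent=2))
```

The guard mirrors the quirk documented in minimax.md: a payload copied from an OpenAI setup with `temperature=0` fails fast locally instead of at the API.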