Add MiniMax (MiniMax-M2.5 and MiniMax-M2.5-highspeed) as a supported LLM provider. MiniMax offers an OpenAI-compatible API with a 204K context window.

Changes:
- Add MiniMax provider routing in the factory (via the OpenAI-compatible client)
- Add MiniMax API endpoint and key handling in OpenAIClient
- Add MiniMax model validation in validators
- Add MiniMax models to the CLI quick/deep thinking selection
- Add MiniMax to provider selection in the CLI
- Update .env.example with MINIMAX_API_KEY
- Update README with MiniMax documentation
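Since MiniMax exposes an OpenAI-compatible API, the routing described above can be sketched as a small factory that reuses the generic OpenAI-style client configuration. This is a minimal illustration only: the function names, the base URL, and the config shape are assumptions, not the project's actual identifiers.

```python
# Sketch of MiniMax provider routing through an OpenAI-compatible client.
# The endpoint, helper names, and config layout below are illustrative
# assumptions, not the repository's real code.
import os

MINIMAX_BASE_URL = "https://api.minimax.io/v1"  # assumed endpoint
MINIMAX_MODELS = {"MiniMax-M2.5", "MiniMax-M2.5-highspeed"}


def validate_minimax_model(model: str) -> str:
    """Reject model names the MiniMax route does not support."""
    if model not in MINIMAX_MODELS:
        raise ValueError(f"Unsupported MiniMax model: {model}")
    return model


def make_client_config(provider: str, model: str) -> dict:
    """Route the 'minimax' provider onto the OpenAI-compatible client,
    pulling the key from MINIMAX_API_KEY as in .env.example."""
    if provider == "minimax":
        return {
            "base_url": MINIMAX_BASE_URL,
            "api_key": os.environ.get("MINIMAX_API_KEY", ""),
            "model": validate_minimax_model(model),
        }
    raise ValueError(f"Unknown provider: {provider}")
```

The config dict returned here would be passed to the shared OpenAI-compatible client constructor, so MiniMax needs no dedicated client class.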