- Add MiniMax as a new LLM provider via the OpenAI-compatible API
- Support MiniMax-M2.7 (default), MiniMax-M2.7-highspeed, and the legacy M2.5 models
- Wire MiniMax into the factory, validator, CLI model selection, and provider list
- Update the README with MiniMax API key docs and provider references