- Add MiniMax as a new LLM provider via an OpenAI-compatible API
- Support MiniMax-M2.7 (default), MiniMax-M2.7-highspeed, and the legacy M2.5 model
- Wire MiniMax into the factory, validator, CLI model selection, and provider list
- Update the README with MiniMax API key docs and provider references
Provider package files:

- TODO.md
- __init__.py
- anthropic_client.py
- base_client.py
- factory.py
- google_client.py
- openai_client.py
- validators.py