- Add MiniMax as a new LLM provider via an OpenAI-compatible API
- Support MiniMax-M2.7 (default), MiniMax-M2.7-highspeed, and the legacy M2.5 model
- Wire MiniMax into the factory, validator, CLI model selection, and provider list
- Update README with MiniMax API key docs and provider references
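The wiring described above can be sketched as a small provider registry. All names here (`PROVIDERS`, `MiniMaxProvider`, `create_provider`, and the base URL) are illustrative assumptions, not the project's actual identifiers:

```python
# Sketch: registering MiniMax in a provider factory with model validation.
# The class/function names and endpoint URL are assumptions for illustration.

MINIMAX_MODELS = [
    "MiniMax-M2.7",            # default
    "MiniMax-M2.7-highspeed",
    "MiniMax-M2.5",            # legacy
]
DEFAULT_MINIMAX_MODEL = "MiniMax-M2.7"


class MiniMaxProvider:
    """Talks to MiniMax through its OpenAI-compatible endpoint."""

    base_url = "https://api.minimax.io/v1"  # assumed endpoint

    def __init__(self, api_key: str, model: str = DEFAULT_MINIMAX_MODEL):
        # Validator step: reject model names the provider does not know.
        if model not in MINIMAX_MODELS:
            raise ValueError(f"Unknown MiniMax model: {model}")
        self.api_key = api_key
        self.model = model


# Factory step: the provider list maps a CLI name to its implementation.
PROVIDERS = {"minimax": MiniMaxProvider}


def create_provider(name: str, api_key: str, model: str = None):
    """Look up a provider by name and construct it with an optional model."""
    cls = PROVIDERS[name]
    return cls(api_key, model) if model else cls(api_key)
```

Because the endpoint is OpenAI-compatible, the actual request layer could reuse an existing OpenAI client pointed at `base_url` rather than a bespoke HTTP implementation.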