Adds MiniMax as a natively supported LLM provider via its OpenAI-compatible API. Includes model catalog entries for all four variants and retry handling for null-`choices` responses caused by MiniMax's content moderation filter.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
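The retry behavior described above can be sketched as a small helper that re-issues a chat-completion call while the response carries no `choices` (the shape MiniMax returns when its moderation filter suppresses output). The helper name, signature, and the fake response object below are illustrative assumptions, not the PR's actual code:

```python
from typing import Any, Callable

def complete_with_retry(create: Callable[[], Any], max_retries: int = 3) -> Any:
    """Invoke an OpenAI-style chat-completion call, retrying when the
    provider returns a response whose `choices` field is null or empty
    (e.g. MiniMax's content moderation filter). Illustrative sketch only.
    """
    last: Any = None
    for _ in range(max_retries):
        last = create()
        # A normal response has a non-empty `choices` list; moderated
        # MiniMax responses come back with `choices` set to null.
        if getattr(last, "choices", None):
            return last
    raise RuntimeError(f"no choices after {max_retries} attempts")

# Usage against an OpenAI-compatible client would look roughly like:
#   client = OpenAI(base_url=MINIMAX_BASE_URL, api_key=MINIMAX_API_KEY)
#   resp = complete_with_retry(
#       lambda: client.chat.completions.create(model=model, messages=messages)
#   )
# where MINIMAX_BASE_URL points at MiniMax's OpenAI-compatible endpoint.
```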
Files touched:

- agents
- dataflows
- graph
- llm_clients
- __init__.py
- default_config.py