Adds MiniMax as a natively supported LLM provider via its OpenAI-compatible
API. Includes model catalog entries for all four variants and retry handling
for null-choices responses caused by MiniMax's content moderation filter.
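The retry handling described above can be sketched as follows. This is a minimal illustration, not the commit's actual code: the client is assumed to expose an OpenAI-compatible `chat.completions.create` method, and the retry count and backoff values are placeholders.

```python
import time


def complete_with_retry(client, max_retries=3, backoff=1.0, **kwargs):
    """Retry a chat completion when MiniMax's moderation filter
    returns a 200 response whose `choices` field is null instead
    of raising an API error.

    `client` is assumed to be an OpenAI-compatible client object;
    `max_retries` and `backoff` are illustrative defaults.
    """
    for attempt in range(max_retries):
        response = client.chat.completions.create(**kwargs)
        if response.choices:  # normal, non-moderated response
            return response
        # null/empty choices: back off exponentially, then retry
        time.sleep(backoff * (2 ** attempt))
    raise RuntimeError("MiniMax returned null choices after all retries")
```

Because the moderation filter yields a syntactically valid response rather than an HTTP error, a plain status-code retry loop would not catch this case; the check has to inspect `choices` itself.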
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
- Add .env.example file with API key placeholders
- Update README.md with .env file setup instructions
- Add dotenv loading in main.py for environment variables
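A sketch of what the .env loading in main.py might look like. The real commit presumably uses the python-dotenv package's `load_dotenv()`; this version hand-rolls the parser so the example stays dependency-free, and mirrors python-dotenv's default of not overwriting variables already set in the environment.

```python
import os
from pathlib import Path


def load_env_file(path=".env"):
    """Minimal .env loader: reads KEY=VALUE lines into os.environ.

    Blank lines, '#' comments, and lines without '=' are skipped.
    Existing environment variables are left untouched, matching
    python-dotenv's default behaviour.
    """
    env_path = Path(path)
    if not env_path.exists():
        return
    for line in env_path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # setdefault: real environment variables win over .env values
        os.environ.setdefault(key.strip(), value.strip())
```

Calling `load_env_file()` early in main.py, before any provider client is constructed, lets the API key placeholders from .env.example take effect once the user copies them into a real .env file.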
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
- Added support for running the CLI and the Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch via the LLM_PROVIDER environment variable
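The conditional launch could look roughly like this. It is a sketch under stated assumptions: the env var name LLM_PROVIDER comes from the commit, but the sentinel value `"ollama"` and the launcher function name are hypothetical.

```python
import os
import subprocess


def maybe_start_ollama():
    """Start a local Ollama server only when the configured provider
    requires one; cloud providers (OpenAI, MiniMax, ...) skip it.

    Returns the server's Popen handle, or None when no local server
    is needed. The "ollama" value is an assumed convention.
    """
    if os.environ.get("LLM_PROVIDER", "").lower() != "ollama":
        return None
    # `ollama serve` blocks, so run it as a background child process
    return subprocess.Popen(["ollama", "serve"])
```

In a Docker entrypoint, gating the server this way keeps the same image usable for both local-model and API-backed configurations without shipping two variants.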