Adds MiniMax as a natively supported LLM provider via its OpenAI-compatible
API. Includes retry handling for null-choices responses caused by MiniMax's
content moderation filter.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
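The retry behavior described above can be sketched as follows. This is an illustrative example, not the project's actual implementation: `complete_with_retry` and its callable interface are hypothetical, and it assumes only that an OpenAI-compatible response object exposes a `.choices` attribute, which MiniMax's moderation filter may leave null or empty instead of raising an error.

```python
import time


def complete_with_retry(create_fn, max_retries=3, backoff=0.5):
    """Call an OpenAI-compatible completion function, retrying when the
    response comes back with null or empty `choices` (MiniMax's content
    moderation filter drops choices rather than returning an API error).

    `create_fn` is any zero-argument callable returning a response object
    with a `.choices` attribute -- a hypothetical wrapper for this sketch.
    """
    for attempt in range(max_retries):
        time.sleep(backoff * attempt)  # no delay before the first attempt
        response = create_fn()
        if getattr(response, "choices", None):
            return response
    raise RuntimeError(
        f"no choices after {max_retries} attempts (likely content moderation)"
    )
```

A caller would wrap the actual API call in a closure, e.g. `complete_with_retry(lambda: client.chat.completions.create(...))`.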
- Add http_client and http_async_client parameters to all LLM clients
- OpenAIClient, GoogleClient, AnthropicClient now support custom httpx clients
- Fixes SSL certificate verification errors on Windows Conda environments
- Users can now pass a custom httpx.Client with verify=False or a custom CA bundle
Fixes #369
- Add StatsCallbackHandler for tracking LLM calls, tool calls, and tokens
- Integrate callbacks into TradingAgentsGraph and all LLM clients
- Dynamic agent/report counts based on selected analysts
- Fix report completion counting (tied to agent completion)
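A stats callback of the kind described above can be sketched minimally as follows. The class and method names (`on_llm_call`, `on_tool_call`) are illustrative; the project's actual callback interface may differ.

```python
from collections import Counter


class StatsCallbackHandler:
    """Sketch of a handler that tallies LLM calls, tool calls, and token
    usage as the graph runs. Hypothetical interface, not the real one."""

    def __init__(self):
        self.counts = Counter()

    def on_llm_call(self, prompt_tokens=0, completion_tokens=0):
        # One LLM invocation; accumulate token usage alongside the count.
        self.counts["llm_calls"] += 1
        self.counts["prompt_tokens"] += prompt_tokens
        self.counts["completion_tokens"] += completion_tokens

    def on_tool_call(self, tool_name):
        # Track both the total and a per-tool breakdown.
        self.counts["tool_calls"] += 1
        self.counts[f"tool:{tool_name}"] += 1
```

The graph and each LLM client would invoke these hooks at the corresponding points, so a single handler instance sees the whole run.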