TradingAgents/cli
octo-patch 46bd80347a feat: add MiniMax as LLM provider
Add MiniMax (MiniMax-M2.5 and MiniMax-M2.5-highspeed) as a supported
LLM provider. MiniMax exposes an OpenAI-compatible API with a 204K-token
context window.

Changes:
- Add MiniMax provider routing in factory (via OpenAI-compatible client)
- Add MiniMax API endpoint and key handling in OpenAIClient
- Add MiniMax model validation in validators
- Add MiniMax models to CLI quick/deep thinking selection
- Add MiniMax to provider selection in CLI
- Update .env.example with MINIMAX_API_KEY
- Update README with MiniMax documentation
2026-03-15 19:18:24 +08:00
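The factory routing and model validation described above can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: the endpoint URL, the `PROVIDERS` table, and the `resolve_provider` helper are all hypothetical, and the MiniMax base URL in particular is an assumption to be checked against MiniMax's documentation. The key idea from the commit is that MiniMax needs no new client class, only a base-URL/API-key mapping onto the existing OpenAI-compatible client, plus a model-name check.

```python
import os

# Hypothetical sketch of the provider routing added in this commit.
# MiniMax reuses the OpenAI-compatible client, so the factory only has
# to map a provider name to a base URL and an API-key env variable.
# The MiniMax base_url below is an assumption; verify against docs.
PROVIDERS = {
    "openai": {"base_url": "https://api.openai.com/v1",
               "key_env": "OPENAI_API_KEY"},
    "minimax": {"base_url": "https://api.minimax.io/v1",
                "key_env": "MINIMAX_API_KEY"},
}

# Models named in the commit message.
MINIMAX_MODELS = {"MiniMax-M2.5", "MiniMax-M2.5-highspeed"}

def resolve_provider(provider: str, model: str) -> dict:
    """Return settings for constructing an OpenAI-compatible client."""
    try:
        cfg = PROVIDERS[provider.lower()]
    except KeyError:
        raise ValueError(f"unknown provider: {provider}") from None
    # Mirror the "MiniMax model validation" bullet: reject unknown models.
    if provider.lower() == "minimax" and model not in MINIMAX_MODELS:
        raise ValueError(f"unsupported MiniMax model: {model}")
    return {
        "base_url": cfg["base_url"],
        "api_key": os.getenv(cfg["key_env"], ""),  # e.g. MINIMAX_API_KEY
        "model": model,
    }
```

The returned dict would then feed the existing OpenAI-compatible client constructor unchanged, which is why the commit touches only the factory, validators, and CLI selection lists rather than adding a new client implementation.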
static chore(release): v0.1.0 – initial public release of TradingAgents 2025-06-05 04:27:57 -07:00
__init__.py chore(release): v0.1.0 – initial public release of TradingAgents 2025-06-05 04:27:57 -07:00
announcements.py feat: add announcements panel fetching from api.tauric.ai/v1/announcements 2026-02-03 22:27:20 +00:00
config.py feat: add announcements panel fetching from api.tauric.ai/v1/announcements 2026-02-03 22:27:20 +00:00
main.py feat: add post-analysis report saving and fix display truncation 2026-02-03 22:27:20 +00:00
models.py chore(release): v0.1.0 – initial public release of TradingAgents 2025-06-05 04:27:57 -07:00
stats_handler.py feat: add footer statistics tracking with LangChain callbacks 2026-02-03 22:27:20 +00:00
utils.py feat: add MiniMax as LLM provider 2026-03-15 19:18:24 +08:00