TradingAgents/tradingagents/llm_clients
Latest commit 24e97fb703 by Jiaxu Liu (2026-03-23 13:50:55 +00:00)

fix: use Responses API for Copilot models that don't support chat/completions

gpt-5.4, gpt-5.4-mini, and codex variants only support /responses,
not /chat/completions on the Copilot endpoint. Auto-detect and set
use_responses_api=True for these models.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
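The auto-detection described in the commit message can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: the constant `RESPONSES_ONLY_PREFIXES` and the function `needs_responses_api` are hypothetical names, and the prefix list is inferred from the models named in the message.

```python
# Hypothetical sketch of the commit's auto-detection: gpt-5.4, gpt-5.4-mini,
# and codex variants only accept /responses on the Copilot endpoint, so flag
# them so the client sets use_responses_api=True. Names are illustrative.

RESPONSES_ONLY_PREFIXES = ("gpt-5.4", "codex")

def needs_responses_api(model: str) -> bool:
    """True if the Copilot endpoint serves this model only via /responses."""
    # Case-insensitive prefix match; str.startswith accepts a tuple of prefixes.
    return model.lower().startswith(RESPONSES_ONLY_PREFIXES)
```

A client constructor could then check `needs_responses_api(model)` once and route requests to `/responses` instead of `/chat/completions` for the flagged models.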
TODO.md              feat: add multi-provider LLM support with factory pattern                      2026-02-03 22:27:20 +00:00
__init__.py          feat: add multi-provider LLM support with factory pattern                      2026-02-03 22:27:20 +00:00
anthropic_client.py  feat: add Anthropic effort level support for Claude models                     2026-03-22 21:57:05 +00:00
base_client.py       chore: consolidate install, fix CLI portability, normalize LLM responses       2026-03-22 21:38:01 +00:00
factory.py           feat: add GitHub Copilot provider with OAuth auth via gh CLI                   2026-03-23 13:18:55 +00:00
google_client.py     chore: consolidate install, fix CLI portability, normalize LLM responses       2026-03-22 21:38:01 +00:00
openai_client.py     fix: use Responses API for Copilot models that don't support chat/completions  2026-03-23 13:50:55 +00:00
validators.py        feat: add GitHub Copilot provider with OAuth auth via gh CLI                   2026-03-23 13:18:55 +00:00
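Several of the listed files share the subject "add multi-provider LLM support with factory pattern". A factory for provider clients, in its simplest form, might look like the sketch below; the class and function names mirror the file names above but are assumptions, not the repository's actual API.

```python
# Minimal sketch of a multi-provider client factory, assuming per-provider
# client classes like those suggested by the files above. All names here
# (BaseClient, create_client, _PROVIDERS, ...) are illustrative.

class BaseClient:
    def __init__(self, model: str):
        self.model = model

class OpenAIClient(BaseClient): ...
class AnthropicClient(BaseClient): ...
class GoogleClient(BaseClient): ...

# Registry mapping a provider key to its client class.
_PROVIDERS = {
    "openai": OpenAIClient,
    "anthropic": AnthropicClient,
    "google": GoogleClient,
}

def create_client(provider: str, model: str) -> BaseClient:
    """Instantiate the client class registered for `provider`."""
    try:
        return _PROVIDERS[provider](model)
    except KeyError:
        raise ValueError(f"Unknown provider: {provider!r}") from None
```

The registry dict keeps provider selection in one place, so adding a new backend (e.g. the Copilot provider from the commits above) is one class plus one registry entry rather than a chain of conditionals.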