TradingAgents/tradingagents/llm_clients
阳虎 0c2b04a04d fix: support custom Ollama URL via base_url or OLLAMA_HOST env var
- Support base_url parameter for Docker/remote Ollama deployments
- Support OLLAMA_HOST environment variable fallback
- Normalize URL to ensure /v1 suffix for OpenAI-compatible API
- Fixes #396 (Docker Ollama connection issues)

Usage:
  - In config: set llm_config with base_url='http://host.docker.internal:11434'
  - Or env var: export OLLAMA_HOST=http://192.168.1.100:11434
2026-03-19 17:11:39 +08:00
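
The URL resolution this commit describes (explicit base_url, then OLLAMA_HOST fallback, then a local default, always normalized to end in /v1) amounts to a small helper. The following is a minimal sketch under those assumptions, not the actual code in openai_client.py; resolve_ollama_base_url and DEFAULT_OLLAMA_URL are hypothetical names used only for illustration.

```python
import os

# Assumed local default; adjust to your deployment.
DEFAULT_OLLAMA_URL = "http://localhost:11434/v1"

def resolve_ollama_base_url(base_url: str | None = None) -> str:
    """Resolve the Ollama endpoint for the OpenAI-compatible API.

    Precedence: explicit base_url argument > OLLAMA_HOST env var > local default.
    The returned URL always ends with /v1, which Ollama's OpenAI-compatible
    endpoint expects.
    """
    url = base_url or os.environ.get("OLLAMA_HOST") or DEFAULT_OLLAMA_URL
    url = url.rstrip("/")
    # Append the /v1 suffix if it is missing.
    if not url.endswith("/v1"):
        url = f"{url}/v1"
    return url

# Examples (hypothetical):
#   resolve_ollama_base_url("http://host.docker.internal:11434")
#     -> "http://host.docker.internal:11434/v1"
#   With OLLAMA_HOST=http://192.168.1.100:11434 set:
#   resolve_ollama_base_url()
#     -> "http://192.168.1.100:11434/v1"
```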
TODO.md feat: add multi-provider LLM support with factory pattern 2026-02-03 22:27:20 +00:00
__init__.py feat: add multi-provider LLM support with factory pattern 2026-02-03 22:27:20 +00:00
anthropic_client.py fix: add http_client support for SSL certificate customization 2026-03-16 07:41:20 +08:00
base_client.py feat: add multi-provider LLM support with factory pattern 2026-02-03 22:27:20 +00:00
factory.py fix: add http_client support for SSL certificate customization 2026-03-16 07:41:20 +08:00
google_client.py fix: add http_client support for SSL certificate customization 2026-03-16 07:41:20 +08:00
openai_client.py fix: support custom Ollama URL via base_url or OLLAMA_HOST env var 2026-03-19 17:11:39 +08:00
validators.py chore: update model lists, bump to v0.2.1, fix package build 2026-03-15 23:34:50 +00:00
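
For orientation, the "multi-provider LLM support with factory pattern" commits listed above suggest a base client interface (base_client.py) with provider-specific implementations selected by a factory (factory.py). The sketch below illustrates that pattern only; all class and function names here are hypothetical, not the repository's actual API.

```python
from abc import ABC, abstractmethod

class BaseLLMClient(ABC):
    """Common interface a provider-specific client would implement (hypothetical)."""

    def __init__(self, **options):
        # Provider-specific options (api_key, base_url, http_client, ...) would go here.
        self.options = options

    @abstractmethod
    def chat(self, messages: list[dict], **kwargs) -> str:
        ...

class OpenAIClient(BaseLLMClient):
    def chat(self, messages, **kwargs):
        raise NotImplementedError  # would call an OpenAI-compatible endpoint

class AnthropicClient(BaseLLMClient):
    def chat(self, messages, **kwargs):
        raise NotImplementedError  # would call the Anthropic API

class GoogleClient(BaseLLMClient):
    def chat(self, messages, **kwargs):
        raise NotImplementedError  # would call the Gemini API

# The factory maps a provider name from the config to a concrete client class.
_PROVIDERS = {
    "openai": OpenAIClient,
    "anthropic": AnthropicClient,
    "google": GoogleClient,
}

def create_llm_client(provider: str, **options) -> BaseLLMClient:
    try:
        return _PROVIDERS[provider](**options)
    except KeyError:
        raise ValueError(f"Unknown LLM provider: {provider!r}") from None
```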