TradingAgents/tradingagents/llm_clients
OpenClaw Assistant 3e509bfa32 feat: add llama.cpp local LLM support via .env configuration
Add 'llamacpp' as a new provider for running TradingAgents fully
offline with a local llama-server (llama.cpp).

Changes:
- factory.py: register 'llamacpp' provider alongside openai/ollama
- validators.py: accept any model name for llamacpp (like ollama)
- openai_client.py: llamacpp branch sets base_url from env/config and
  uses a placeholder api_key so the client raises no missing-key error
- default_config.py: load .env via python-dotenv (optional dep);
  LLM_PROVIDER, BACKEND_URL, DEEP_THINK_LLM, QUICK_THINK_LLM are
  all overridable via environment variables
- .env.example: document llamacpp setup alongside cloud providers
- .gitignore: ensure .env is ignored, .env.example is tracked
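
The openai_client.py change can be sketched roughly as follows (a minimal sketch; the function name, config keys, and placeholder key value are assumptions for illustration, not the repo's actual code):

```python
import os

def llamacpp_client_kwargs(config: dict) -> dict:
    """Sketch of the llamacpp branch: resolve base_url and api_key
    for an OpenAI-compatible llama-server endpoint."""
    # Environment variable wins over config; the default matches
    # llama-server's usual local port.
    base_url = os.environ.get(
        "BACKEND_URL", config.get("backend_url", "http://localhost:8080/v1")
    )
    # llama-server needs no auth, but the OpenAI SDK rejects an empty
    # api_key, so a placeholder keeps it from raising.
    api_key = os.environ.get("OPENAI_API_KEY") or "sk-no-key-required"
    return {"base_url": base_url, "api_key": api_key}
```

The returned dict would be passed straight to the OpenAI-compatible client constructor, which is why no llamacpp-specific client class is needed.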

Fully backward-compatible: OpenAI remains the default when no
.env is present. Also works for LM Studio, vLLM, or any other
OpenAI-compatible local server via BACKEND_URL + LLM_PROVIDER=openai.
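
A minimal .env for the llamacpp setup might look like this (variable names come from the commit above; the URL and model names are placeholders, and validators.py accepts any model name for llamacpp):

```ini
LLM_PROVIDER=llamacpp
BACKEND_URL=http://localhost:8080/v1
DEEP_THINK_LLM=local-model
QUICK_THINK_LLM=local-model
```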

Tested with: llama.cpp llama-server + Qwen3.5-35B-A3B-Q3_K_M
2026-03-21 10:26:48 +01:00
TODO.md feat: add multi-provider LLM support with factory pattern 2026-02-03 22:27:20 +00:00
__init__.py feat: add multi-provider LLM support with factory pattern 2026-02-03 22:27:20 +00:00
anthropic_client.py fix: add http_client support for SSL certificate customization 2026-03-16 07:41:20 +08:00
base_client.py feat: add multi-provider LLM support with factory pattern 2026-02-03 22:27:20 +00:00
factory.py feat: add llama.cpp local LLM support via .env configuration 2026-03-21 10:26:48 +01:00
google_client.py fix: add http_client support for SSL certificate customization 2026-03-16 07:41:20 +08:00
openai_client.py feat: add llama.cpp local LLM support via .env configuration 2026-03-21 10:26:48 +01:00
validators.py feat: add llama.cpp local LLM support via .env configuration 2026-03-21 10:26:48 +01:00