Commit Graph

6 Commits

OpenClaw Assistant 3e509bfa32
feat: add llama.cpp local LLM support via .env configuration
Add 'llamacpp' as a new provider for running TradingAgents fully
offline with a local llama-server (llama.cpp).

Changes:
- factory.py: register 'llamacpp' provider alongside openai/ollama
- validators.py: accept any model name for llamacpp (like ollama)
- openai_client.py: llamacpp branch sets base_url from env/config,
  uses placeholder api_key so no auth error is raised
- default_config.py: load .env via python-dotenv (optional dep);
  LLM_PROVIDER, BACKEND_URL, DEEP_THINK_LLM, QUICK_THINK_LLM are
  all overridable via environment variables
- .env.example: document llamacpp setup alongside cloud providers
- .gitignore: ensure .env is ignored, .env.example is tracked

Fully backward-compatible: OpenAI remains the default when no
.env is present. Also works for LM Studio, vLLM, or any other
OpenAI-compatible local server via BACKEND_URL + LLM_PROVIDER=openai.

Tested with: llama.cpp llama-server + Qwen3.5-35B-A3B-Q3_K_M
2026-03-21 10:26:48 +01:00
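The .env-driven overrides described in 3e509bfa32 can be sketched as below. The environment variable names (LLM_PROVIDER, BACKEND_URL, DEEP_THINK_LLM, QUICK_THINK_LLM) and the optional python-dotenv dependency come from the commit message; the dict layout and fallback defaults are assumptions, not the actual contents of default_config.py.

```python
import os

# python-dotenv is an optional dependency: fall back gracefully when absent.
try:
    from dotenv import load_dotenv
    load_dotenv()  # reads a .env file from the working directory, if present
except ImportError:
    pass

# Hypothetical config dict mirroring the commit message; OpenAI stays the
# default when no .env or environment overrides are present.
DEFAULT_CONFIG = {
    "llm_provider": os.environ.get("LLM_PROVIDER", "openai"),
    "backend_url": os.environ.get("BACKEND_URL", "https://api.openai.com/v1"),
    "deep_think_llm": os.environ.get("DEEP_THINK_LLM", "o4-mini"),
    "quick_think_llm": os.environ.get("QUICK_THINK_LLM", "gpt-4o-mini"),
}
```

With this shape, pointing BACKEND_URL at a local llama-server (or LM Studio/vLLM) and setting LLM_PROVIDER is enough to run fully offline, since every value is overridable per-environment.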
Yijia Xiao 80aab35119
docs: update README for v0.2.0 release
- TradingAgents v0.2.0 release
- Trading-R1 announcement
- Multi-provider LLM documentation
2026-02-04 00:13:10 +00:00
Yijia Xiao 102b026d23
refactor: clean up codebase and streamline documentation
- Remove debug prints from vendor routing (interface.py)
- Simplify vendor fallback to only handle rate limits
- Reorder CLI provider menu: OpenAI, Google, Anthropic, xAI, OpenRouter, Ollama
- Remove dead files: local.py, reddit_utils.py, openai.py, google.py, googlenews_utils.py, yfin_utils.py
2026-02-03 22:27:20 +00:00
luohy15 7fc9c28a94
Add environment variable configuration support
- Add .env.example file with API key placeholders
- Update README.md with .env file setup instructions
- Add dotenv loading in main.py for environment variables

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-26 23:58:51 +08:00
Yijia Xiao 26c5ba5a78
Revert "Docker support and Ollama support (#47)" (#57)
This reverts commit 78ea029a0b.
2025-06-26 00:07:58 -04:00
Geeta Chauhan 78ea029a0b
Docker support and Ollama support (#47)
- Added support for running CLI and Ollama server via Docker
- Introduced tests for local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
2025-06-25 23:57:05 -04:00
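The "conditional Ollama server launch via LLM_PROVIDER" from 78ea029a0b could look roughly like the sketch below. `ollama serve` is the real Ollama CLI command, but the helper name and the decision logic here are assumptions, not the reverted commit's actual Docker entrypoint.

```python
import os
import subprocess

def ollama_launch_command(provider=None):
    """Hypothetical helper: return the command to start a local Ollama
    server, or None when the configured provider does not need one."""
    provider = provider or os.environ.get("LLM_PROVIDER", "openai")
    if provider.lower() != "ollama":
        return None
    return ["ollama", "serve"]

# Launch only when configured; any other provider skips the server entirely.
cmd = ollama_launch_command()
if cmd:
    subprocess.Popen(cmd)
```

Gating on LLM_PROVIDER keeps the same container image usable for both cloud providers and local Ollama without separate entrypoints.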