- AnthropicClient now accepts a base_url param or the ANTHROPIC_BASE_URL env var
  to connect to custom endpoints (subscription proxies, self-hosted gateways, etc.)
- Add ANTHROPIC_BASE_URL to .env.example
- Document custom base URL usage in README
https://claude.ai/code/session_01AbNbJYL7gHy47BQ9BJJTAv
- Change default llm_provider from "openai" to "anthropic"
- Update default models: deep_think_llm to claude-sonnet-4-6, quick_think_llm to claude-haiku-4-5
- Update backend_url to Anthropic API endpoint
- Reorder provider lists to show Anthropic first in CLI and docs
- Update README examples and .env.example to reflect new defaults
https://claude.ai/code/session_01AbNbJYL7gHy47BQ9BJJTAv
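A minimal sketch of the new defaults described in this commit; the dict shape and key names mirror the bullet points above but are illustrative, not necessarily the project's actual config schema:

```python
# Hypothetical defaults mapping reflecting the switch from OpenAI to Anthropic.
# Key names follow the identifiers mentioned in the commit message.
DEFAULT_CONFIG = {
    "llm_provider": "anthropic",            # was "openai"
    "deep_think_llm": "claude-sonnet-4-6",
    "quick_think_llm": "claude-haiku-4-5",
    "backend_url": "https://api.anthropic.com/v1",
}
```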
- Add .env.example file with API key placeholders
- Update README.md with .env file setup instructions
- Add dotenv loading in main.py for environment variables
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
- Added support for running CLI and Ollama server via Docker
- Introduced tests for local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via the LLM_PROVIDER env var