Support base_url parameter for Docker/remote Ollama deployments

- Support OLLAMA_HOST environment variable fallback
- Normalize URL to ensure /v1 suffix for OpenAI-compatible API
- Fixes #396 (Docker Ollama connection issues)

Usage:

- In config: set llm_config with base_url='http://host.docker.internal:11434'
- Or env var: export OLLAMA_HOST=http://192.168.1.100:11434
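A minimal sketch of the resolution order described above: explicit config wins, then the OLLAMA_HOST environment variable, then a local default, with the URL normalized to end in /v1. The helper name `resolve_ollama_base_url` and the default constant are illustrative assumptions, not necessarily the repo's actual API:

```python
import os

# Assumed default; Ollama conventionally listens on localhost:11434.
DEFAULT_OLLAMA_URL = "http://localhost:11434"


def resolve_ollama_base_url(base_url: str | None = None) -> str:
    """Resolve the Ollama endpoint (hypothetical helper).

    Precedence: explicit base_url > OLLAMA_HOST env var > local default.
    """
    url = base_url or os.environ.get("OLLAMA_HOST") or DEFAULT_OLLAMA_URL
    # Normalize: strip trailing slashes, then ensure the /v1 suffix that
    # Ollama's OpenAI-compatible API expects.
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url


# Examples (assuming OLLAMA_HOST is unset):
# resolve_ollama_base_url()
#   -> "http://localhost:11434/v1"
# resolve_ollama_base_url("http://host.docker.internal:11434")
#   -> "http://host.docker.internal:11434/v1"
```

Note this sketch assumes the configured value includes a scheme (http://...); handling bare host:port strings would need an extra normalization step.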
Directory contents:

- TODO.md
- __init__.py
- anthropic_client.py
- base_client.py
- factory.py
- google_client.py
- openai_client.py
- validators.py