- Support base_url parameter for Docker/remote Ollama deployments
- Support OLLAMA_HOST environment variable fallback
- Normalize URL to ensure /v1 suffix for OpenAI-compatible API
- Fixes #396 (Docker Ollama connection issues)

Usage:
- In config: set llm_config with base_url='http://host.docker.internal:11434'
- Or env var: export OLLAMA_HOST=http://192.168.1.100:11434