Latest commit:

- Added support for running CLI and Ollama server via Docker
- Introduced tests for local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via `LLM_PROVIDER`

Contents:

| Name |
|---|
| agents |
| dataflows |
| graph |
| default_config.py |