- Added support for running the CLI and Ollama server via Docker
- Introduced tests for the local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via `LLM_PROVIDER`
Package contents:

- static/
- __init__.py
- main.py
- models.py
- utils.py
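The conditional server launch described above could be sketched as follows. Only the `LLM_PROVIDER` variable name comes from these notes; the `"ollama"` value, the `ollama serve` command, and the `maybe_start_ollama` helper are assumptions for illustration, not the actual implementation.

```python
import os
import subprocess

def maybe_start_ollama(env=None):
    """Start a local Ollama server only when LLM_PROVIDER selects it.

    Hypothetical helper: the LLM_PROVIDER variable is taken from the
    changelog; the "ollama" value and `ollama serve` command are assumed.
    """
    if env is None:
        env = os.environ
    if env.get("LLM_PROVIDER", "").lower() == "ollama":
        # Launch the server as a background child process.
        return subprocess.Popen(["ollama", "serve"])
    # Any other provider: no local server is started.
    return None
```

With this shape, a remote provider (e.g. `LLM_PROVIDER=openai`) skips the local server entirely, while `LLM_PROVIDER=ollama` spawns it before the CLI runs.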