- Added support for running the CLI and an Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup (see the test sketch at the end of this section)
- Enabled conditional Ollama server launch via `LLM_PROVIDER` (a sketch of this follows below)
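
A minimal sketch of how the conditional launch could look, assuming `LLM_PROVIDER` is compared against the value `ollama` and the local server is spawned with the stock `ollama serve` command; the function name `maybe_start_ollama` and the accepted provider value are assumptions for illustration, not names taken from the diff:

```python
import os
import subprocess
from typing import Optional


def maybe_start_ollama() -> Optional[subprocess.Popen]:
    """Start `ollama serve` in the background only when LLM_PROVIDER selects it."""
    # Treating "ollama" as the triggering value of LLM_PROVIDER is an
    # assumption; the PR description does not spell out the accepted values.
    if os.environ.get("LLM_PROVIDER", "").lower() != "ollama":
        return None  # some other provider (e.g. a hosted API) is configured
    # `ollama serve` is the stock CLI command that starts the local server.
    return subprocess.Popen(["ollama", "serve"])


if __name__ == "__main__":
    proc = maybe_start_ollama()
    print("Ollama started" if proc else "Skipped: LLM_PROVIDER is not 'ollama'")
```

In a Docker entrypoint this kind of guard lets the same image serve both setups: with `LLM_PROVIDER=ollama` the container brings up its own model server, otherwise it stays a thin CLI client.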
Files touched:

- agent_states.py
- agent_utils.py
- memory.py
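
A hedged sketch of what the local-embeddings test could look like, assuming it talks to a running Ollama server through its `/api/embeddings` endpoint with an embedding model such as `nomic-embed-text` already pulled; the model name, the default URL, and the skip condition are assumptions, not details from the PR:

```python
import os

import pytest
import requests

# Default Ollama address; override via OLLAMA_HOST when testing in Docker.
OLLAMA_URL = os.environ.get("OLLAMA_HOST", "http://localhost:11434")


@pytest.mark.skipif(
    os.environ.get("LLM_PROVIDER") != "ollama",
    reason="local embeddings test only runs against an Ollama server",
)
def test_local_embeddings_shape():
    # POST /api/embeddings is Ollama's embeddings endpoint; it returns a
    # JSON body with an "embedding" list of floats.
    resp = requests.post(
        f"{OLLAMA_URL}/api/embeddings",
        json={"model": "nomic-embed-text", "prompt": "hello world"},
        timeout=30,
    )
    resp.raise_for_status()
    vector = resp.json()["embedding"]
    assert isinstance(vector, list) and len(vector) > 0
```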