- Added support for running the CLI and the Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch via `LLM_PROVIDER`
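The conditional launch described above could look like the following container entrypoint sketch. Only `LLM_PROVIDER` is taken from the changelog; the `ollama` value, the `ollama serve` invocation, and the helper function name are illustrative assumptions, not the repository's actual code.

```shell
#!/bin/sh
# Hypothetical entrypoint sketch: start the local Ollama server only when
# the configured provider asks for it, so other providers (e.g. a hosted
# API) do not pay the cost of running a model server in the container.
start_server_if_needed() {
    if [ "$1" = "ollama" ]; then
        # Run the server in the background; the CLI can then reach it
        # on localhost:11434, Ollama's default port.
        ollama serve &
        echo "ollama started"
    else
        echo "skipped"
    fi
}

# Default to a non-local provider when the variable is unset.
start_server_if_needed "${LLM_PROVIDER:-openai}"
```

Putting the check in the entrypoint keeps a single Docker image usable for both local and remote LLM backends.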
- `analysts/`
- `managers/`
- `researchers/`
- `risk_mgmt/`
- `trader/`
- `utils/`
- `__init__.py`