- Added `LLMFactory` for provider-agnostic LLM creation
- Supports OpenAI, Ollama (local/free), Anthropic, Google, Groq, Azure, Together, HuggingFace, OpenRouter
- Updated the memory system to be provider-agnostic
- Fixed Ollama integration with tool-calling support (llama3.2, llama3.1, mistral-nemo, qwen2.5)
- Added comprehensive documentation and examples
- Updated the CLI with new Ollama model selections
- 100% backward compatible: OpenAI remains the default
- Verified working with tests
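The factory pattern described above can be sketched as follows. This is a minimal illustration only: the actual `LLMFactory` API, its method names, and the default model strings (other than the Ollama models listed above) are assumptions, not the project's real implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical config object returned by the factory; the real project
# would return a provider-specific LLM client instead.
@dataclass
class LLMConfig:
    provider: str
    model: str

# Illustrative per-provider defaults; only the Ollama entry reflects a
# model name mentioned in the changelog (llama3.2 supports tool calling).
_DEFAULT_MODELS = {
    "openai": "gpt-4o",       # placeholder default
    "ollama": "llama3.2",
    "anthropic": "claude-3-5-sonnet",  # placeholder default
}

class LLMFactory:
    """Sketch of a provider-agnostic factory (hypothetical API)."""

    @staticmethod
    def create(provider: str = "openai", model: Optional[str] = None) -> LLMConfig:
        if provider not in _DEFAULT_MODELS:
            raise ValueError(f"Unsupported provider: {provider}")
        # Fall back to the provider's default model when none is given,
        # keeping OpenAI as the overall default for backward compatibility.
        return LLMConfig(provider=provider, model=model or _DEFAULT_MODELS[provider])

llm = LLMFactory.create("ollama")
print(llm.model)  # llama3.2
```

Keeping `provider="openai"` as the default argument is what preserves backward compatibility: existing call sites that never pass a provider behave exactly as before.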
Package layout:

- `analysts/`
- `managers/`
- `researchers/`
- `risk_mgmt/`
- `trader/`
- `utils/`
- `__init__.py`