- Added LLMFactory for provider-agnostic LLM creation
- Supports OpenAI, Ollama (local/free), Anthropic, Google, Groq, Azure, Together, HuggingFace, OpenRouter
- Updated the memory system to be provider-agnostic
- Fixed Ollama integration with tool-calling support (llama3.2, llama3.1, mistral-nemo, qwen2.5)
- Added comprehensive documentation and examples
- Updated the CLI with new Ollama model selections
- 100% backward compatible; OpenAI remains the default
- Verified working with tests
Files:

- agent_states.py
- agent_utils.py
- core_stock_tools.py
- fundamental_data_tools.py
- memory.py
- news_data_tools.py
- technical_indicators_tools.py
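The provider-agnostic factory described in the change list might be sketched as below. This is a minimal illustration, not the project's actual API: the class layout, method signature, default model names, and the Ollama base URL are all assumptions.

```python
# Hypothetical sketch of a provider-agnostic LLM factory; all names and
# default models below are illustrative assumptions, not the real project code.
from dataclasses import dataclass
from typing import Optional


@dataclass
class LLMConfig:
    provider: str                    # e.g. "openai", "ollama"
    model: str                       # concrete model identifier
    base_url: Optional[str] = None   # e.g. a local Ollama endpoint


class LLMFactory:
    # Illustrative per-provider default models (assumed, not verified).
    _DEFAULTS = {
        "openai": "gpt-4o-mini",
        "ollama": "llama3.2",
        "anthropic": "claude-3-5-sonnet-latest",
        "google": "gemini-1.5-flash",
        "groq": "llama-3.1-70b-versatile",
    }

    @classmethod
    def create(cls, provider: str = "openai",
               model: Optional[str] = None) -> LLMConfig:
        """Build a config for the requested provider, falling back to
        that provider's default model when none is given."""
        if provider not in cls._DEFAULTS:
            raise ValueError(f"Unsupported provider: {provider!r}")
        # Local Ollama typically serves on port 11434 (assumed default here).
        base_url = "http://localhost:11434" if provider == "ollama" else None
        return LLMConfig(provider=provider,
                         model=model or cls._DEFAULTS[provider],
                         base_url=base_url)
```

Keeping OpenAI as the default `provider` argument is what makes the change backward compatible: existing callers that never pass a provider get the same behavior as before.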