- Add explicit OPENROUTER_API_KEY environment variable handling
- Add HTTP-Referer and X-Title headers for OpenRouter attribution
- Fix case sensitivity for provider names (ollama now case-insensitive)
- Add embedding fallback to OpenAI when using OpenRouter (since OpenRouter lacks an embeddings API)
- Add comprehensive test suite (30 tests) for OpenRouter integration
- Update README.md and PROJECT.md with OpenRouter configuration docs
- Add CHANGELOG.md documenting the changes
Patterns borrowed from ~/.claude/lib/genai_validate.py for multi-provider support; the key pieces are sketched below.
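A minimal sketch of that wiring, assuming the OpenAI-compatible Python SDK; `make_llm_client`, `make_embedding_client`, the referer URL, and the app title are illustrative placeholders, not the repo's actual names:

```python
import os

from openai import OpenAI

def make_llm_client(provider: str) -> OpenAI:
    # Provider matching is case-insensitive (e.g. "OpenRouter" == "openrouter").
    provider = provider.strip().lower()
    if provider == "openrouter":
        api_key = os.environ.get("OPENROUTER_API_KEY")
        if not api_key:
            raise RuntimeError("OPENROUTER_API_KEY is not set")
        return OpenAI(
            base_url="https://openrouter.ai/api/v1",
            api_key=api_key,
            # Optional attribution headers documented by OpenRouter.
            default_headers={
                "HTTP-Referer": "https://github.com/your-org/your-repo",  # placeholder
                "X-Title": "Your App Name",  # placeholder
            },
        )
    raise ValueError(f"Unsupported provider: {provider!r}")

def make_embedding_client(provider: str) -> OpenAI:
    # OpenRouter exposes no embeddings endpoint, so embeddings
    # fall back to OpenAI regardless of the chat provider.
    if provider.strip().lower() == "openrouter":
        return OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    return OpenAI()
```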
Closes #1

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Switch to Anthropic Claude as LLM provider in main.py
- Add get_google_global_news() wrapper so its signature matches the vendor interface
- Add get_fundamentals() to yfinance vendor
- Register yfinance and google as vendors for fundamentals and global news
- Make memory system fail gracefully when OpenAI embeddings unavailable
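A rough sketch of the registration and graceful-degradation path, assuming the OpenAI Python SDK; the registry shape, `embeddings_available`, and the stubbed vendor functions are illustrative, not the actual module layout:

```python
import logging

from openai import OpenAI

def get_fundamentals(ticker: str) -> dict: ...        # provided by the yfinance vendor
def get_google_global_news(query: str) -> list: ...   # wrapper with the expected signature

# Hypothetical registry shape: data category -> vendor name -> callable.
VENDORS = {
    "fundamentals": {"yfinance": get_fundamentals},
    "global_news": {"google": get_google_global_news},
}

def embeddings_available() -> bool:
    # Probe OpenAI embeddings once; if the call fails (missing key, no
    # network), the memory system is disabled instead of crashing the run.
    try:
        OpenAI().embeddings.create(model="text-embedding-3-small", input="ping")
        return True
    except Exception as exc:
        logging.warning("Embeddings unavailable, memory disabled: %s", exc)
        return False
```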
- Replace hardcoded column indices with column name lookup
- Add mapping for all supported indicators to their expected CSV column names
- Handle missing columns gracefully with descriptive error messages
- Strip whitespace when parsing headers for reliable column matching
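A minimal sketch of the name-based lookup described above; the mapping values are illustrative, since the exact CSV column names depend on the data vendor:

```python
import csv

# Hypothetical mapping of indicator names to the CSV columns they live in.
INDICATOR_COLUMNS = {
    "sma": "SMA",
    "ema": "EMA",
    "macd": "MACD",
    "rsi": "RSI",
    "atr": "ATR",
}

def read_indicator(path: str, indicator: str) -> list[float]:
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = [h.strip() for h in next(reader)]  # strip whitespace for reliability
        column = INDICATOR_COLUMNS.get(indicator)
        if column is None or column not in header:
            raise ValueError(
                f"Column for indicator {indicator!r} not found; "
                f"available columns: {header}"
            )
        idx = header.index(column)  # look up by name, not hardcoded index
        return [float(row[idx]) for row in reader if row]
```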
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
- Replace FinnHub with Alpha Vantage API in README documentation
- Implement comprehensive Alpha Vantage modules:
  - Stock data (daily OHLCV with date filtering; see the sketch after this list)
  - Technical indicators (SMA, EMA, MACD, RSI, Bollinger Bands, ATR)
  - Fundamental data (overview, balance sheet, cash flow, income statement)
  - News and sentiment data with insider transactions
- Update news analyst tools to use ticker-based news search
- Integrate Alpha Vantage vendor methods into interface routing
- Maintain backward compatibility with existing vendor system
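For reference, a hedged sketch of the daily OHLCV fetch with date filtering; `get_daily_ohlcv` and the `ALPHA_VANTAGE_API_KEY` variable name are assumptions, though `TIME_SERIES_DAILY` and the response key are Alpha Vantage's documented names:

```python
import os

import requests

def get_daily_ohlcv(symbol: str, start: str, end: str) -> dict:
    """Fetch daily OHLCV from Alpha Vantage, filtered to [start, end] (YYYY-MM-DD)."""
    resp = requests.get(
        "https://www.alphavantage.co/query",
        params={
            "function": "TIME_SERIES_DAILY",
            "symbol": symbol,
            "outputsize": "full",
            "apikey": os.environ["ALPHA_VANTAGE_API_KEY"],  # assumed env var name
        },
        timeout=30,
    )
    resp.raise_for_status()
    series = resp.json().get("Time Series (Daily)", {})
    # ISO dates compare correctly as strings, so lexicographic filtering works.
    return {d: v for d, v in series.items() if start <= d <= end}
```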
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
- Add support for running the CLI and Ollama server via Docker
- Introduce tests for the local embeddings model and the standalone Docker setup
- Enable conditional Ollama server launch via LLM_PROVIDER
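A small sketch of the conditional launch, assuming the check lives in Python rather than in a shell entrypoint; `maybe_start_ollama` is a hypothetical name, while `ollama serve` is the real CLI command:

```python
import os
import shutil
import subprocess
from typing import Optional

def maybe_start_ollama() -> Optional[subprocess.Popen]:
    # Only launch the bundled Ollama server when the provider asks for it.
    if os.environ.get("LLM_PROVIDER", "").strip().lower() != "ollama":
        return None
    if shutil.which("ollama") is None:
        raise RuntimeError("LLM_PROVIDER=ollama but the ollama binary is not on PATH")
    return subprocess.Popen(["ollama", "serve"])
```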