Features:
- API key management, with keys stored client-side in browser localStorage
- Model selection for Deep Think (Opus) and Quick Think (Sonnet/Haiku)
- Configurable max debate rounds (1-5)
- Full analysis pipeline visualization with 9-step progress tracking
- Agent reports display (Market, News, Social, Fundamentals analysts)
- Investment debate viewer (Bull vs Bear with Research Manager decision)
- Risk debate viewer (Aggressive vs Conservative vs Neutral)
- Data sources tracking panel
- Dark mode support throughout
- Bulk "Analyze All" functionality for all 50 stocks
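The debate-round setting and nine-step pipeline above can be sketched as simple validation and progress logic. This is an illustrative Python sketch; the step names and helper functions are assumptions, not the app's actual labels:

```python
# Illustrative sketch of the analysis settings and progress tracking;
# PIPELINE_STEPS names and clamp_rounds are hypothetical.
PIPELINE_STEPS = [
    "Market Analyst", "News Analyst", "Social Analyst", "Fundamentals Analyst",
    "Bull Researcher", "Bear Researcher", "Research Manager",
    "Risk Debate", "Final Decision",
]

def clamp_rounds(requested: int, lo: int = 1, hi: int = 5) -> int:
    """Keep the max-debate-rounds setting within the supported 1-5 range."""
    return max(lo, min(hi, requested))

def progress(completed_steps: int) -> float:
    """Fraction of the 9-step pipeline finished, for the progress display."""
    return completed_steps / len(PIPELINE_STEPS)
```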
Backend:
- Add analysis config parameters to API endpoints
- Support provider/model selection in analysis requests
- Improve Indian market data integration
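The analysis config parameters above might take a shape like the following. This is a hedged sketch; the field names (`provider`, `deep_model`, `quick_model`, `max_debate_rounds`) and defaults are assumptions for illustration, not the project's actual request schema:

```python
from dataclasses import dataclass

# Hypothetical shape of the analysis request payload; field names and
# defaults are illustrative assumptions.
@dataclass
class AnalysisConfig:
    ticker: str
    provider: str = "anthropic"        # LLM provider selection
    deep_model: str = "claude-opus"    # "Deep Think" model
    quick_model: str = "claude-haiku"  # "Quick Think" model
    max_debate_rounds: int = 3         # configurable, 1-5

    def __post_init__(self) -> None:
        if not 1 <= self.max_debate_rounds <= 5:
            raise ValueError("max_debate_rounds must be between 1 and 5")
```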
Documentation:
- Comprehensive README with 10 feature screenshots
- API endpoint documentation
- Project structure guide
- Getting started instructions
- Add React + Vite + Tailwind CSS frontend for Nifty50 recommendations
- Add FastAPI backend for serving stock recommendations
- Add Indian market data sources (jugaad_data, markets API)
- Add Nifty50 stock recommender modules
- Update dataflows for Indian market support
- Apply miscellaneous utility and configuration fixes
- Replace hardcoded column indices with column name lookup
- Add mapping for all supported indicators to their expected CSV column names
- Handle missing columns gracefully with descriptive error messages
- Strip whitespace from header parsing for reliability
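The column-lookup change above can be sketched as follows. The indicator names, the mapping contents, and the function name are illustrative assumptions (the real table maps all supported indicators):

```python
import csv
import io

# Hypothetical indicator-to-column mapping; illustrative entries only.
INDICATOR_COLUMNS = {
    "rsi": "RSI",
    "macd": "MACD",
    "close_50_sma": "50 SMA",
}

def read_indicator(csv_text: str, indicator: str, date: str) -> str:
    """Look up an indicator value by column *name* rather than by a
    hardcoded column index."""
    column = INDICATOR_COLUMNS.get(indicator)
    if column is None:
        raise KeyError(f"Unsupported indicator: {indicator}")
    reader = csv.reader(io.StringIO(csv_text))
    # Strip whitespace from headers so " RSI " still matches "RSI".
    header = [h.strip() for h in next(reader)]
    if column not in header:
        # Fail with a descriptive message instead of a bare IndexError.
        raise ValueError(f"Column '{column}' not found in CSV header: {header}")
    idx = header.index(column)
    for row in reader:
        if row and row[0].strip() == date:
            return row[idx].strip()
    raise ValueError(f"No row found for date {date}")
```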
- Replace FinnHub with Alpha Vantage API in README documentation
- Implement comprehensive Alpha Vantage modules:
  - Stock data (daily OHLCV with date filtering)
  - Technical indicators (SMA, EMA, MACD, RSI, Bollinger Bands, ATR)
  - Fundamental data (overview, balance sheet, cash flow, income statement)
  - News and sentiment data with insider transactions
- Update news analyst tools to use ticker-based news search
- Integrate Alpha Vantage vendor methods into interface routing
- Maintain backward compatibility with existing vendor system
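The stock-data module's date filtering can be sketched as below. This is a minimal sketch assuming the standard shape of Alpha Vantage's `TIME_SERIES_DAILY` JSON response; the function names are illustrative, not the project's actual API:

```python
import json
from urllib.request import urlopen

BASE_URL = "https://www.alphavantage.co/query"

def fetch_daily(symbol: str, api_key: str) -> dict:
    """Fetch daily OHLCV from Alpha Vantage's TIME_SERIES_DAILY endpoint."""
    url = (f"{BASE_URL}?function=TIME_SERIES_DAILY&symbol={symbol}"
           f"&outputsize=full&apikey={api_key}")
    with urlopen(url) as resp:
        return json.load(resp)

def filter_by_date(payload: dict, start: str, end: str) -> dict:
    """Keep only trading days within [start, end]. Dates are ISO-8601
    strings, so plain string comparison orders them correctly."""
    series = payload.get("Time Series (Daily)", {})
    return {day: bars for day, bars in series.items() if start <= day <= end}
```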
- Add support for running the CLI and Ollama server via Docker
- Add tests for the local embeddings model and standalone Docker setup
- Enable conditional Ollama server launch via LLM_PROVIDER