- asyncpg + psycopg2-binary: same fix as gex-llm; the service had
DATABASE_URL and an alembic env.py but no Postgres driver installed
- sse-starlette: imported by app.py but never declared; the running
container only had it from an earlier manual pip install. Rebuilding
the image from the canonical requirements dropped it and crashed the
container with a ModuleNotFoundError on startup.
Bundles a pre-existing uncommitted restructuring of requirements.txt
into pinned + categorized groups (Core, LLM Clients, Data, Analysis,
Database, CLI & UI, Testing, Utilities).
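The restructured requirements.txt looks roughly like this; the categories are from the commit, but the packages shown per group and the pin placeholders are illustrative, not the actual file contents:

```text
# Core
fastapi==<pinned version>
sse-starlette==<pinned version>

# Data
alpaca-py==<pinned version>

# Database
asyncpg==<pinned version>
psycopg2-binary==<pinned version>
```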
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Alpaca's free tier allows 10k API calls/min and 7 years of history via the IEX feed.
Hybrid approach: Alpaca for price bars, snapshots, sector ETF perf,
and moving averages; yfinance for fundamentals (PE, margins, 13F).
- Add alpaca_data.py: bars, snapshots, MAs, sector ETF perf, news
- Update get_macro_indicators: sector ETF performance via Alpaca
- Update get_sector_rotation: compute relative strength vs SPY
- Update entry timing node: Alpaca MAs from actual bar data
- Add alpaca-py to requirements.txt
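A minimal sketch of the relative-strength calculation the sector rotation node now performs; the function name and inputs are illustrative, not the actual alpaca_data.py API:

```python
def relative_strength(etf_closes, spy_closes):
    """Sector ETF return minus SPY return over the same window.

    etf_closes / spy_closes: chronological lists of closing prices
    (e.g. daily bars fetched via alpaca-py). A positive result means
    the sector is outperforming SPY over the window.
    """
    etf_ret = etf_closes[-1] / etf_closes[0] - 1.0
    spy_ret = spy_closes[-1] / spy_closes[0] - 1.0
    return etf_ret - spy_ret
```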
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Swap Chainlit chatbot UI for a minimal FastAPI service with:
- POST /analyze to start analysis
- GET /analyze/{id}/stream for SSE progress events
- GET /health for Railway healthcheck
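The wire format that /analyze/{id}/stream emits is plain Server-Sent Events framing; sse-starlette handles this internally, so the helper below is only an illustration of what goes over the wire:

```python
import json

def format_sse(data: dict, event: str = "") -> str:
    """Frame a progress payload as one Server-Sent Event block.

    An SSE event is one or more `field: value` lines terminated by a
    blank line; clients parse the stream on those blank-line boundaries.
    """
    lines = []
    if event:
        lines.append(f"event: {event}")
    lines.append(f"data: {json.dumps(data)}")
    return "\n".join(lines) + "\n\n"
```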
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Added support for running the CLI and an Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
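The conditional launch can be sketched as a small guard on the environment variable; this is an assumption about the mechanism (the function name and launch site are hypothetical, and the real logic may live in the container entrypoint instead):

```python
import os
import subprocess

def maybe_start_ollama():
    """Start `ollama serve` in the background only when LLM_PROVIDER
    selects the local Ollama provider.

    Returns the subprocess.Popen handle when launched, else None.
    """
    if os.environ.get("LLM_PROVIDER") == "ollama":
        # Background server; the app then talks to it over localhost.
        return subprocess.Popen(["ollama", "serve"])
    return None
```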