Commit Graph

5 Commits

Author SHA1 Message Date
dtarkent2-sys 8c48c3cffd WIP: local TradingAgents customizations through 2026-04-13
Bulk commit of accumulated local changes on the dtarkent2-sys fork.
Spans agents, dataflows, llm_clients, graph orchestration, CLI, and
docs. Primary work areas:

- llm_clients/ — multi-LLM client layer (anthropic, google, openai,
  factory, base, validators) for swappable provider support
- dataflows/alpaca_data.py — Alpaca integration alongside existing
  alpha_vantage and y_finance flows
- agents/structured/ — portfolio, scoring, and tier1/2/3 layers
- agents/analysts, researchers, risk_mgmt — local prompt and logic
  customizations
- graph/ — orchestration tweaks (parallel_analysts, propagation,
  reflection, signal_processing, trading_graph)
- Alembic scaffolding inherited from the prior commit
- Chainlit web UI design notes in docs/plans/

This is a single WIP snapshot to preserve work before any upstream
merge. History can be cleaned up with interactive rebase later.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 22:01:00 -04:00
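The llm_clients layer above mentions anthropic/google/openai modules plus a factory and a shared base for swappable provider support. A minimal registry-based sketch of that pattern (all class and function names here are illustrative assumptions, not the fork's actual API):

```python
# Hypothetical sketch of a provider factory for swappable LLM clients.
# Names (BaseLLMClient, make_client, ...) are illustrative, not the fork's code.
from abc import ABC, abstractmethod


class BaseLLMClient(ABC):
    """Common interface every provider-specific client implements."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...


class OpenAIClient(BaseLLMClient):
    def complete(self, prompt: str) -> str:
        return f"[openai] {prompt}"  # stub; a real client would call the provider API


class AnthropicClient(BaseLLMClient):
    def complete(self, prompt: str) -> str:
        return f"[anthropic] {prompt}"  # stub


_REGISTRY = {"openai": OpenAIClient, "anthropic": AnthropicClient}


def make_client(provider: str) -> BaseLLMClient:
    """Instantiate a client by provider name, failing loudly on unknown names."""
    try:
        return _REGISTRY[provider.lower()]()
    except KeyError:
        raise ValueError(f"unknown LLM provider: {provider!r}") from None
```

The registry keeps callers decoupled from concrete providers: adding a client (e.g. a google module) is one dictionary entry, and the rest of the graph orchestration only sees the base interface.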
dtarkent2-sys e0ed485098 Fix .dockerignore: don't exclude requirements.txt
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 03:22:31 +00:00
dtarkent2-sys 3ac1c5ad3d Harden security, fix memory leak, clean up deps
- Add API key auth (AGENTS_API_KEY env var) on /analyze endpoints
- Add CORS_ORIGINS env var instead of a hardcoded wildcard
- Add memory cleanup (30min TTL) and concurrency semaphore (max 3)
- Add 10-minute analysis timeout
- Fix ticker validation (alphanumeric check)
- Remove unused deps (redis, backtrader, parsel, rich, typer, questionary)
- Fix pyproject.toml: replace chainlit with actual FastAPI deps
- Add .dockerignore, add eval_results/ to .gitignore

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-21 03:17:11 +00:00
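The hardening measures in this commit (API key check, ticker validation, max-3 concurrency, 10-minute timeout) can be sketched with the standard library alone. The names below mirror the commit message (AGENTS_API_KEY) but the actual endpoint code in the fork may be structured differently:

```python
# Hedged sketch of the hardening described above; stdlib only, names assumed.
import asyncio
import hmac
import os


def check_api_key(presented: str) -> bool:
    """Constant-time comparison against the AGENTS_API_KEY env var."""
    expected = os.environ.get("AGENTS_API_KEY", "")
    return bool(expected) and hmac.compare_digest(presented, expected)


def valid_ticker(ticker: str) -> bool:
    """Reject anything that is not a short alphanumeric symbol."""
    return 1 <= len(ticker) <= 10 and ticker.isalnum()


# At most 3 analyses run concurrently; each one is capped at 10 minutes.
ANALYSIS_SEMAPHORE = asyncio.Semaphore(3)
ANALYSIS_TIMEOUT_S = 600


async def run_analysis(ticker: str, analyze) -> str:
    """Gatekeeper around an analysis coroutine: validate, bound, and time-limit."""
    if not valid_ticker(ticker):
        raise ValueError(f"invalid ticker: {ticker!r}")
    async with ANALYSIS_SEMAPHORE:
        return await asyncio.wait_for(analyze(ticker), timeout=ANALYSIS_TIMEOUT_S)
```

`hmac.compare_digest` avoids timing side channels that a plain `==` on the key would leak, and the semaphore plus `wait_for` together bound both concurrency and per-request lifetime, which is what makes the 30-minute memory TTL tractable.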
Yijia Xiao 26c5ba5a78 Revert "Docker support and Ollama support (#47)" (#57)
This reverts commit 78ea029a0b.
2025-06-26 00:07:58 -04:00
Geeta Chauhan 78ea029a0b Docker support and Ollama support (#47)
- Added support for running the CLI and an Ollama server via Docker
- Introduced tests for local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
2025-06-25 23:57:05 -04:00
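The conditional server launch keyed on LLM_PROVIDER could look roughly like the sketch below; the entrypoint structure and function names are assumptions, since the reverted PR's actual script is not shown here:

```python
# Hypothetical entrypoint logic for the conditional Ollama launch; names assumed.
import os
import subprocess
from typing import Optional


def should_start_ollama(provider: Optional[str]) -> bool:
    """Only launch a local Ollama server when it is the selected provider."""
    return (provider or "").strip().lower() == "ollama"


def maybe_start_ollama() -> bool:
    """Called once from the container entrypoint before the CLI starts."""
    if should_start_ollama(os.environ.get("LLM_PROVIDER")):
        subprocess.Popen(["ollama", "serve"])  # background server in the container
        return True
    return False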