Commit Graph

8 Commits

Author SHA1 Message Date
Clayton Brown b6c99e1dde feat(034): add Polymarket prediction market signals
Fetch crowd-sourced probability estimates from Polymarket's public
Gamma API and inject them as context into fundamentals, sentiment,
and news analyst prompts.

- New tradingagents/signals/polymarket.py module
- Filter: >=40% probability, >=100K USD volume, financially relevant keywords
- Categorize: Fed/Rates, Economy, Trade, Regulation, Corporate, Crypto, Energy, Tech, Macro
- Config: polymarket_enabled (default True) in default_config.py
- CLI: --polymarket/--no-polymarket flags on analyze command
- State: polymarket_context field in AgentState
- 30 unit tests and integration tests
- Polymarket section in README
2026-04-21 16:38:36 +10:00
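The filtering rules the commit lists (>=40% probability, >=100K USD volume, financially relevant keywords) can be sketched roughly as below. This is an illustrative guess, not the actual `tradingagents/signals/polymarket.py` code: the function name, field names, and keyword set are assumptions.

```python
# Hypothetical sketch of the Polymarket market filter described in the
# commit message. Field names ("question", "probability", "volume_usd")
# and the keyword list are assumptions, not the real module's schema.

MIN_PROBABILITY = 0.40    # keep markets at >= 40% probability
MIN_VOLUME_USD = 100_000  # keep markets with >= 100K USD volume
KEYWORDS = {"fed", "rates", "inflation", "tariff", "recession", "crypto"}

def is_relevant(market: dict) -> bool:
    """Keep markets that clear both thresholds and mention a finance keyword."""
    text = market.get("question", "").lower()
    return (
        market.get("probability", 0.0) >= MIN_PROBABILITY
        and market.get("volume_usd", 0.0) >= MIN_VOLUME_USD
        and any(kw in text for kw in KEYWORDS)
    )
```

Markets passing such a filter would then be grouped into the categories the commit lists (Fed/Rates, Economy, Trade, etc.) before being injected into the analyst prompts.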
Yijia Xiao c61242a28c Merge pull request #464 from CadeYu/sync-validator-models
sync model validation with cli catalog
2026-03-29 11:07:51 -07:00
CadeYu 8793336dad sync model validation with cli catalog 2026-03-25 21:23:02 +08:00
javierdejesusda 047b38971c refactor: simplify api_key mapping and consolidate tests
Apply review suggestions: use concise `or` pattern for API key
resolution, consolidate tests into parameterized subTest, move
import to module level per PEP 8.
2026-03-24 14:52:51 +01:00
javierdejesusda f5026009f9 fix(llm_clients): standardize Google API key to unified api_key param
GoogleClient now accepts the unified `api_key` parameter used by
OpenAI and Anthropic clients, mapping it to the provider-specific
`google_api_key` that ChatGoogleGenerativeAI expects. Legacy
`google_api_key` still works for backward compatibility.

Resolves TODO.md item #2 (inconsistent parameter handling).
2026-03-24 14:35:02 +01:00
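The parameter mapping this commit describes, a unified `api_key` translated to the `google_api_key` keyword that `ChatGoogleGenerativeAI` expects, with the legacy name still honored, can be sketched as follows. The helper is illustrative only; the real `GoogleClient` code is not shown here.

```python
# Illustrative sketch (not the actual GoogleClient implementation) of
# mapping the unified `api_key` parameter to the provider-specific
# `google_api_key` keyword, while keeping the legacy name working.
def build_google_kwargs(api_key=None, google_api_key=None, **extra):
    """Return keyword args using the name ChatGoogleGenerativeAI expects."""
    key = api_key or google_api_key  # unified param wins when both are given
    kwargs = dict(extra)
    if key:
        kwargs["google_api_key"] = key
    return kwargs
```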
CadeYu 08bfe70a69 fix: preserve exchange-qualified tickers across agent prompts 2026-03-21 21:10:13 +08:00
Yijia Xiao 26c5ba5a78 Revert "Docker support and Ollama support (#47)" (#57)
This reverts commit 78ea029a0b.
2025-06-26 00:07:58 -04:00
Geeta Chauhan 78ea029a0b Docker support and Ollama support (#47)
- Added support for running CLI and Ollama server via Docker
- Introduced tests for local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
2025-06-25 23:57:05 -04:00