- **Standalone HTML Reports**: Refactored report generation to perform server-side Markdown-to-HTML rendering using Python.

    - Removed dependency on client-side `marked.js` and CDNs.
    - Reports are now fully offline-capable.
    - Cleaned up JSON keys to remove `.md` extensions for cleaner data structure.
- **Google News Adapter**: Implemented `get_google_global_news` adapter in `google.py` to match the standard `(curr_date, look_back_days)` interface, adhering to the Adapter Pattern and fixing signature mismatches.
- **Robust Demo Script**: Created `run_agent.py` (replacing demo scripts) with:
    - Automatic `.env` loading.
    - `backend_url` handling (clearing OpenAI defaults when using Anthropic).
    - Hardened configuration for "Deep Analysis" (Debate Rounds=2).
    - Pre-configured Google News vendor to bypass AlphaVantage rate limits.

### Fixed
- **Rate Limit Crash**: Fixed `AlphaVantageRateLimitError` by switching default news vendor to Google in `run_agent.py`.
swj.premkumar 2026-01-14 05:58:33 -06:00
parent 05ce55125f
commit 24edac65c4
15 changed files with 371 additions and 47 deletions

.gitignore (vendored, 1 line changed)

@@ -11,3 +11,4 @@ eval_data/
 .env
 venv_torture_test
 *.log
+data_cache


@@ -2,6 +2,37 @@
All notable changes to the **TradingAgents** project will be documented in this file.
## [Unreleased] - 2026-01-14
### Added
- **Standalone HTML Reports**: Refactored report generation to perform server-side Markdown-to-HTML rendering using Python.
    - Removed dependency on client-side `marked.js` and CDNs.
    - Reports are now fully offline-capable.
    - Cleaned up JSON keys to remove `.md` extensions for cleaner data structure.
- **Google News Adapter**: Implemented `get_google_global_news` adapter in `google.py` to match the standard `(curr_date, look_back_days)` interface, adhering to the Adapter Pattern and fixing signature mismatches.
- **Robust Demo Script**: Created `run_agent.py` (replacing demo scripts) with:
    - Automatic `.env` loading.
    - `backend_url` handling (clearing OpenAI defaults when using Anthropic).
    - Hardened configuration for "Deep Analysis" (Debate Rounds=2).
    - Pre-configured Google News vendor to bypass AlphaVantage rate limits.
### Fixed
- **Rate Limit Crash**: Fixed `AlphaVantageRateLimitError` by switching default news vendor to Google in `run_agent.py`.
- **Interface Mismatch**: Fixed `TypeError` in `get_global_news` where string dates were passed to integer arguments.
- **Logic Crash**: Fixed `TypeError` in `TradingAgentsGraph.apply_trend_override` caused by duplicate arguments in the method call.
- **Broken Entry Point**: Updated `startAgent.sh` to point to the correct `run_agent.py` script instead of a non-existent file.
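The interface fix above comes down to converting the `(curr_date, look_back_days)` pair the vendor router passes into the explicit date range the Google scraper expects. A minimal sketch of that conversion (the helper name here is illustrative; the real adapter is `get_google_global_news` in `google.py`):

```python
from datetime import datetime, timedelta

def to_date_range(curr_date: str, look_back_days: int) -> tuple[str, str]:
    """Convert a (curr_date, look_back_days) pair into (start_date, end_date) strings."""
    end = datetime.strptime(curr_date, "%Y-%m-%d")
    start = end - timedelta(days=look_back_days)
    return start.strftime("%Y-%m-%d"), end.strftime("%Y-%m-%d")
```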
## [Released] - 2026-01-13
### Added
- **Dynamic Parameter Tuning (The Learning Loop)**: Implemented full self-reflection cycle. The Reflector agent now parses its own advice into JSON (`rsi_period`, `stop_loss_pct`), persists it to `data_cache/runtime_config.json`, and the Market Analyst loads it to tune the Regime Detector in real-time.
- **Audit Archival**: Every tuning event is now archived to `results/{TICKER}/{DATE}/runtime_config.json` for historical auditing, ensuring we can reproduce why parameters changed on any given day.
- **Atomic Persistence**: Implemented `agent_utils.write_json_atomic` to prevent race conditions during config saves.
- **Centralized Config**: Moved hardcoded paths to `default_config.py` (DRY principle).
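The atomic-persistence idea behind `agent_utils.write_json_atomic` is the standard write-to-temp-then-rename pattern; a minimal sketch (an illustration of the technique, not the repo's exact implementation):

```python
import json
import os
import tempfile

def write_json_atomic(path: str, data: dict) -> None:
    """Write JSON to a temp file, then atomically replace the target.

    os.replace is atomic on POSIX, so concurrent readers never observe
    a half-written config file.
    """
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".tmp")
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            json.dump(data, f, indent=2)
        os.replace(tmp_path, path)  # atomic swap into place
    except BaseException:
        os.unlink(tmp_path)  # clean up the temp file on failure
        raise
```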
### Fixed
- **Reflector Logic Gap**: The Reflector was previously "shouting into the void"—making suggestions but having no mechanism to apply them. This circuit is now closed.
## [Unreleased] - 2026-01-11
### Added


@@ -185,6 +185,18 @@ cp .env.example .env
**Note:** We are happy to partner with Alpha Vantage to provide robust API support for TradingAgents. You can get a free Alpha Vantage API key [here](https://www.alphavantage.co/support/#api-key); TradingAgents-sourced requests also get increased rate limits of 60 requests per minute with no daily limit. This quota is typically sufficient for performing complex tasks with TradingAgents, thanks to Alpha Vantage's open-source support program. If you prefer to use OpenAI for these data sources instead, you can modify the data vendor settings in `tradingagents/default_config.py`.
### Quick Start (Recommended)
To run a deep analysis on a specific ticker and automatically open the result:
```bash
./startAgent.sh [TICKER] [DATE]
# Example:
./startAgent.sh NVDA 2024-05-10
```
This script executes the robust `run_agent.py` logic (configured for Deep Analysis) and generates a **Standalone HTML Report** that is fully offline-capable (no external JS dependencies).
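The server-side rendering is essentially one call to the Python `markdown` package (the same call used in `scripts/generate_report_html.py`); a minimal sketch:

```python
import markdown  # pip install markdown

def render_report_html(md_text: str) -> str:
    """Convert a Markdown report to an HTML fragment server-side,
    removing the need for client-side marked.js."""
    return markdown.markdown(md_text, extensions=["tables", "fenced_code"])
```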
### CLI Usage
You can also try out the CLI directly by running:

run_agent.py (new file, 140 lines)

@@ -0,0 +1,140 @@
import sys
import os
import argparse
from dotenv import load_dotenv
from datetime import datetime

from tradingagents.graph.trading_graph import TradingAgentsGraph
from tradingagents.default_config import DEFAULT_CONFIG


def main():
    parser = argparse.ArgumentParser(description="Run Trading Agent with Deep Analysis and Claude Sonnet 4.5 Thinking")
    parser.add_argument("ticker", type=str, help="Stock Ticker Symbol (e.g., AAPL)")
    parser.add_argument("--date", type=str, default=datetime.now().strftime("%Y-%m-%d"), help="Trade Date (YYYY-MM-DD)")
    args = parser.parse_args()

    # Load environment variables
    load_dotenv()

    # 1. Configuration Setup
    config = DEFAULT_CONFIG.copy()

    # User Request: "anthropic claude sonnet 4.5 thinking"
    config["llm_provider"] = "anthropic"
    config["deep_think_llm"] = "claude-sonnet-4-5-thinking"

    # FIX: Clear backend_url so it doesn't default to OpenAI's endpoint,
    # unless specified in the environment (e.g. for a proxy)
    config["backend_url"] = os.getenv("BACKEND_URL")

    # Also set quick_think to a high-quality model to support the deep analysis;
    # usually this is a lighter model, but the emphasis here is on "thinking".
    config["quick_think_llm"] = "claude-sonnet-4-5-thinking"

    # User Request: "Deep Analysis"
    # Enable debate rounds to trigger the deep thinking loops
    config["max_debate_rounds"] = 2
    config["max_risk_discuss_rounds"] = 2

    # 2. Tool Configuration (Data Vendors)
    # FIX: Use Google for news to avoid AlphaVantage rate limits (and handle fallback better)
    config["tool_vendors"] = {
        "get_news": "google",
        "get_global_news": "google",
    }

    print(f"🚀 Initializing Trading Agent for {args.ticker} on {args.date}")
    print(f"🧠 Model: {config['deep_think_llm']} (Provider: {config['llm_provider']})")
    print(f"🔍 Deep Analysis: ENABLED (Debate Rounds: {config['max_debate_rounds']})")
    print("📰 News Vendor: Google (Rate Limit Bypass)")

    # 3. Initialize Graph
    # User Request: "Fundamental Analysis" (explicitly included)
    analysts = ["market", "fundamentals", "news", "social"]

    try:
        agent_graph = TradingAgentsGraph(
            selected_analysts=analysts,
            config=config,
            debug=True,  # Enable debug to see the "Thinking" process in logs
        )

        # 4. Run Propagation
        final_state, signal = agent_graph.propagate(args.ticker, args.date)

        # 5. Output Summary
        print("\n" + "=" * 50)
        print(f"🏁 FINAL DECISION for {args.ticker}")
        print("=" * 50)
        decision = final_state.get("final_trade_decision", "NO DECISION")
        if isinstance(decision, dict):
            print(f"ACTION: {decision.get('action')}")
            print(f"QUANTITY: {decision.get('quantity')}")
            print(f"REASONING: {decision.get('reasoning')}")
        else:
            print(f"DECISION: {decision}")
        print("\n✅ Run Complete. Check 'eval_results' for detailed logs and reports.")

        # 6. Generate HTML Report
        print("\n📊 Generating Standalone HTML Report...")

        # 6.1 Identify the reports directory
        base_dir = os.path.dirname(os.path.abspath(__file__))
        results_dir = os.path.join(base_dir, "results", args.ticker, args.date)
        reports_dir = os.path.join(results_dir, "reports")
        os.makedirs(reports_dir, exist_ok=True)

        # 6.2 Write Markdown files from state
        # Map state keys to friendly filenames
        report_map = {
            "market_report": "market_analyst.md",
            "news_report": "news_analyst.md",
            "fundamentals_report": "fundamentals_analyst.md",
            "sentiment_report": "sentiment_analyst.md",
            "investment_plan": "investment_plan.md",
            "trader_investment_plan": "trader_decision.md",  # Optional/Internal
        }
        for key, filename in report_map.items():
            content = final_state.get(key)
            if content:
                file_path = os.path.join(reports_dir, filename)
                with open(file_path, "w", encoding="utf-8") as f:
                    f.write(str(content))

        # 6.3 Call the generator with the reports path
        import subprocess
        generator_script = os.path.join(base_dir, "scripts", "generate_report_html.py")
        try:
            # The generator expects a single <report_dir> argument
            cmd = [sys.executable, generator_script, reports_dir]
            subprocess.run(cmd, check=True)
            print(f"✅ Report Generated Successfully: {reports_dir}/index.html")
        except subprocess.CalledProcessError as e:
            print(f"⚠️ Report Generation Failed: {e}")
        except Exception as e:
            print(f"⚠️ Error running report generator: {e}")

    except Exception as e:
        print(f"\n❌ ERROR: Agents failed to run: {e}")
        import traceback
        traceback.print_exc()


if __name__ == "__main__":
    main()


@@ -1,5 +1,6 @@
 import sys
 import json
+import markdown
 from pathlib import Path

 # Add project root to sys.path to allow importing tradingagents
@@ -18,7 +19,6 @@ TEMPLATE = """<!DOCTYPE html>
 <meta charset="UTF-8">
 <meta name="viewport" content="width=device-width, initial-scale=1.0">
 <title>Trading Agent Report - {ticker} - {date}</title>
-<script src="https://cdn.jsdelivr.net/npm/marked/marked.min.js"></script>
 <style>
 :root {
     --bg-color: #0d1117;
@@ -184,7 +184,7 @@ TEMPLATE = """<!DOCTYPE html>
     document.getElementById(`nav-${key}`).classList.add('active');
     // Render Content
-    contentDiv.innerHTML = marked.parse(reportData[key]);
+    contentDiv.innerHTML = reportData[key];
     window.scrollTo(0, 0);
 }
@@ -194,7 +194,7 @@ TEMPLATE = """<!DOCTYPE html>
     const navItem = document.createElement('div');
     navItem.className = 'nav-item';
     navItem.id = `nav-${key}`;
-    navItem.innerText = key.replace(/_/g, ' ').replace('.md', '').toUpperCase();
+    navItem.innerText = key.replace(/_/g, ' ').toUpperCase();
     navItem.onclick = () => renderReport(key);
     navContainer.appendChild(navItem);
 });
@@ -243,8 +243,10 @@ def generate_report(report_dir):
         # Deanonymize content if possible
         if anonymizer:
             content = anonymizer.deanonymize_text(content)
-        reports[file.name] = content
+        # Convert Markdown to HTML server-side
+        html_content = markdown.markdown(content, extensions=['tables', 'fenced_code'])
+        reports[file.stem] = html_content
     except Exception as e:
         print(f"Failed to read {file}: {e}")


@@ -71,7 +71,7 @@ fi
 echo "🚀 Starting Trading Agents..."
 # Note: Debug print() statements will appear in the terminal
 # Rich library's Live display handles the animated UI
-python3 -m cli.main
+python3 run_agent.py "$1" --date "$2"

 # 4. Open Reports
 echo "📊 Searching for latest generated reports..."

startAgentInteractive.sh (new executable file, 101 lines)

@@ -0,0 +1,101 @@
#!/bin/bash

# 0. Check & Start Claude Proxy
# Check if port 10909 is open (proxy running) using a pure-bash TCP check
if ! (echo > /dev/tcp/localhost/10909) 2>/dev/null; then
    echo "🔌 Claude Proxy not detected on port 10909"
    echo "Select Proxy Provider:"
    echo "1) gemini (default)"
    echo "2) anthropic"
    read -p "Choice [1]: " choice
    case $choice in
        2) PROXY_TYPE="anthropic" ;;
        *) PROXY_TYPE="gemini" ;;
    esac
    echo "🔌 Starting Claude Proxy ($PROXY_TYPE)..."
    /home/prem/git/antigravity-claude-proxy/startProxy.sh "$PROXY_TYPE" &
    # Wait a moment for it to initialize, with a progress bar
    echo -n "⏳ Initializing proxy: ["
    for i in {1..20}; do
        echo -n "■"
        sleep 0.1
    done
    echo "] 100% Ready!"
else
    echo "✅ Claude Proxy already running on port 10909"
fi

./startEmbedding.sh

# 1. Activate Virtual Environment
if [ -d ".venv" ]; then
    source .venv/bin/activate
    echo "✅ Virtual Environment (.venv) Activated"
else
    echo "❌ Virtual Environment not found! Run 'uv venv --python 3.13' first."
    exit 1
fi

# 2. Export API Keys
# These can be loaded from a .env file if preferred
if [ -f ".env" ]; then
    export $(grep -v '^#' .env | xargs)
    echo "✅ Loaded keys from .env"
else
    echo "⚠️ No .env file found. Using default/exported keys."
fi

# Check if keys are set
if [ -z "$OPENAI_API_KEY" ]; then
    echo "⚠️ OPENAI_API_KEY is missing! Set it if using OpenAI."
fi
if [ -z "$GOOGLE_API_KEY" ]; then
    echo "⚠️ GOOGLE_API_KEY is missing! Set it if using Gemini."
fi

# Ensure the embedding URL is set (default to local TEI port 11434)
if [ -z "$EMBEDDING_API_URL" ]; then
    echo "   Setting default EMBEDDING_API_URL to http://localhost:11434/v1"
    export EMBEDDING_API_URL="http://localhost:11434/v1"
    export EMBEDDING_MODEL="all-MiniLM-L6-v2"
fi
if [ -z "$EMBEDDING_TRUNCATION_LIMIT" ]; then
    export EMBEDDING_TRUNCATION_LIMIT=800
fi

# 3. Start the Trading Agents
echo "🚀 Starting Trading Agents..."
# Note: debug print() statements will appear in the terminal;
# the Rich library's Live display handles the animated UI
python3 -m cli.main

# 4. Open Reports
echo "📊 Searching for latest generated reports..."
# Find the latest "reports" directory by modification time:
# print timestamp (%T@) and path (%p), sort numerically, pick the last, clean the output
LATEST_REPORT_DIR=$(find results -type d -name "reports" -printf '%T@ %p\n' | sort -n | tail -1 | cut -f2- -d" ")

if [ -n "$LATEST_REPORT_DIR" ]; then
    echo "✅ Found reports in: $LATEST_REPORT_DIR"

    # Generate the HTML dashboard
    echo "🎨 Generating Report Dashboard..."
    python3 scripts/generate_report_html.py "$LATEST_REPORT_DIR"
    REPORT_HTML="$LATEST_REPORT_DIR/index.html"

    # Check if xdg-open exists (Linux)
    if [ -f "$REPORT_HTML" ] && command -v xdg-open &> /dev/null; then
        echo "🌐 Opening dashboard in browser..."
        xdg-open "$REPORT_HTML" &> /dev/null &
    else
        echo "   Dashboard generated at:"
        echo "   file://$(pwd)/$REPORT_HTML"
    fi
else
    echo "⚠️ No reports found to open."
fi


@@ -29,10 +29,16 @@ def get_stock_data(
     if not real_ticker:
         real_ticker = symbol  # Fallback if not anonymized
-    # 2. Get Data using Real Ticker
-    raw_data = route_to_vendor("get_stock_data", real_ticker, start_date, end_date, format=format)
-    # 3. Anonymize Output (AAPL -> ASSET_XXX)
-    anonymized_data = anonymizer.anonymize_text(raw_data, real_ticker)
-    return anonymized_data
+    try:
+        # 2. Get Data using Real Ticker
+        raw_data = route_to_vendor("get_stock_data", real_ticker, start_date, end_date, format=format)
+        # 3. Anonymize Output (AAPL -> ASSET_XXX)
+        anonymized_data = anonymizer.anonymize_text(raw_data, real_ticker)
+        return anonymized_data
+    except Exception as e:
+        return f"Error executing tool get_stock_data: {str(e)}"


@@ -13,11 +13,16 @@ def _process_vendor_call(func_name, ticker, *args):
     if not real_ticker:
         real_ticker = ticker
-    # 2. Get Data
-    raw_data = route_to_vendor(func_name, real_ticker, *args)
-    # 3. Anonymize Output
-    return anonymizer.anonymize_text(raw_data, real_ticker)
+    try:
+        # 2. Get Data
+        raw_data = route_to_vendor(func_name, real_ticker, *args)
+        # 3. Anonymize Output
+        return anonymizer.anonymize_text(raw_data, real_ticker)
+    except Exception as e:
+        # RETURN a string error instead of raising.
+        # This ensures ToolNode generates a ToolMessage result, preventing a "dangling tool use" error.
+        return f"Error executing tool {func_name}: {str(e)}"
@tool
def get_fundamentals(

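The try/except shape used across these tool diffs reduces to a single wrapper idea: convert any vendor exception into an error string, so the tool always returns a result and the ToolNode always emits a ToolMessage. A generic sketch, with `vendor_call` standing in for the repo's `route_to_vendor`:

```python
def safe_vendor_call(func_name, vendor_call, *args):
    """Run a vendor call, converting exceptions into error strings.

    Returning a string instead of raising guarantees the tool produces a
    result message, preventing a dangling tool-use turn in the agent loop.
    """
    try:
        return vendor_call(*args)
    except Exception as e:
        return f"Error executing tool {func_name}: {str(e)}"
```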

@@ -15,18 +15,17 @@ def _process_vendor_call(func_name, ticker=None, *args):
     if not real_ticker:
         real_ticker = ticker
-    # 2. Get Data
-    # Handle optional ticker for global_news
-    call_args = [real_ticker] + list(args) if ticker else list(args)
-    raw_data = route_to_vendor(func_name, *call_args)
-    # 3. Anonymize Output
-    # For global news, passing ticker=None to anonymize_text might skip ticker-specific masking,
-    # but still mask known mapped tickers if logic supports it.
-    # Current anonymize_text requires ticker context for "Company X".
-    # For global news, we might need a generic pass or skip specific company names if unknown.
-    # However, for now we pass real_ticker if available.
-    return anonymizer.anonymize_text(raw_data, real_ticker) if real_ticker else raw_data
+    try:
+        # 2. Get Data
+        # Handle optional ticker for global_news
+        call_args = [real_ticker] + list(args) if ticker else list(args)
+        raw_data = route_to_vendor(func_name, *call_args)
+        # 3. Anonymize Output
+        return anonymizer.anonymize_text(raw_data, real_ticker) if real_ticker else raw_data
+    except Exception as e:
+        return f"Error executing tool {func_name}: {str(e)}"

 @tool
 def get_news(
@@ -74,7 +73,12 @@ def get_global_news(
     # Ideally, get_global_news should probably stay raw or be masked for the 'current company of interest',
     # but tools don't know the agent's context unless passed.
     # Leaving global news RAW for now as it provides macro context.
-    return route_to_vendor("get_global_news", curr_date, look_back_days, limit)
+    try:
+        return route_to_vendor("get_global_news", curr_date, look_back_days, limit)
+    except Exception as e:
+        return f"Error executing tool get_global_news: {str(e)}"

 @tool
 def get_insider_sentiment(
@tool
def get_insider_sentiment(


@@ -29,10 +29,16 @@ def get_indicators(
     if not real_ticker:
         real_ticker = symbol
-    # 2. Get Data
-    raw_data = route_to_vendor("get_indicators", real_ticker, indicator, curr_date, look_back_days)
-    # 3. Anonymize Output
-    anonymized_data = anonymizer.anonymize_text(raw_data, real_ticker)
-    return anonymized_data
+    try:
+        # 2. Get Data
+        raw_data = route_to_vendor("get_indicators", real_ticker, indicator, curr_date, look_back_days)
+        # 3. Anonymize Output
+        anonymized_data = anonymizer.anonymize_text(raw_data, real_ticker)
+        return anonymized_data
+    except Exception as e:
+        return f"Error executing tool get_indicators: {str(e)}"


@@ -6,16 +6,13 @@ from .googlenews_utils import getNewsData
 def get_google_news(
     query: Annotated[str, "Query to search with"],
-    curr_date: Annotated[str, "Curr date in yyyy-mm-dd format"],
-    look_back_days: Annotated[int, "how many days to look back"],
+    start_date: Annotated[str, "Start date in yyyy-mm-dd format"],
+    end_date: Annotated[str, "End date in yyyy-mm-dd format"],
 ) -> str:
     query = query.replace(" ", "+")
-    start_date = datetime.strptime(curr_date, "%Y-%m-%d")
-    before = start_date - relativedelta(days=look_back_days)
-    before = before.strftime("%Y-%m-%d")
-    news_results = getNewsData(query, before, curr_date)
+    # Direct pass-through since getNewsData handles the dates
+    news_results = getNewsData(query, start_date, end_date)

     news_str = ""
@@ -28,4 +25,24 @@ def get_google_news(
     if len(news_results) == 0:
         return ""
-    return f"## {query} Google News, from {before} to {curr_date}:\n\n{news_str}"
+    return f"## {query} Google News, from {start_date} to {end_date}:\n\n{news_str}"
+
+
+def get_google_global_news(
+    curr_date: Annotated[str, "Current date in yyyy-mm-dd format"],
+    look_back_days: Annotated[int, "Number of days to look back"] = 7,
+    limit: Annotated[int, "Maximum number of articles to return"] = 5,
+) -> str:
+    """
+    Retrieve global news data using the Google News scraper.
+    Adapts the signature (curr_date, look_back_days) to (start_date, end_date).
+    """
+    # Calculate the start date
+    end_date_dt = datetime.strptime(curr_date, "%Y-%m-%d")
+    start_date_dt = end_date_dt - relativedelta(days=look_back_days)
+    start_date = start_date_dt.strftime("%Y-%m-%d")
+    # Use a generic global market query
+    query = "Global Financial Markets"
+    return get_google_news(query, start_date, curr_date)


@@ -20,8 +20,8 @@ def is_rate_limited(response):
 @retry(
     retry=(retry_if_result(is_rate_limited)),
-    wait=wait_exponential(multiplier=1, min=4, max=60),
-    stop=stop_after_attempt(5),
+    wait=wait_exponential(multiplier=1, min=2, max=10),
+    stop=stop_after_attempt(3),
 )
 def make_request(url, headers):
     """Make a request with retry logic for rate limiting"""

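The tightened policy above (3 attempts, exponential wait clamped between 2 and 10 seconds) behaves roughly like the plain loop below; the actual code uses tenacity's `@retry` decorator, so this is only an illustration of the backoff behavior:

```python
import time

def request_with_retry(make_request, is_rate_limited,
                       attempts=3, multiplier=1, min_wait=2, max_wait=10):
    """Retry a request while it is rate limited, with capped exponential backoff."""
    response = None
    for attempt in range(1, attempts + 1):
        response = make_request()
        if not is_rate_limited(response):
            return response
        if attempt < attempts:
            # Exponential wait, clamped to [min_wait, max_wait]
            time.sleep(min(max(multiplier * 2 ** attempt, min_wait), max_wait))
    return response  # still rate limited after the final attempt
```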

@@ -3,7 +3,7 @@ from typing import Annotated
 # Import from vendor-specific modules
 from .local import get_YFin_data, get_finnhub_news, get_finnhub_company_insider_sentiment, get_finnhub_company_insider_transactions, get_simfin_balance_sheet, get_simfin_cashflow, get_simfin_income_statements, get_reddit_global_news, get_reddit_company_news
 from .y_finance import get_YFin_data_online, get_stock_stats_indicators_window, get_balance_sheet as get_yfinance_balance_sheet, get_cashflow as get_yfinance_cashflow, get_income_statement as get_yfinance_income_statement, get_insider_transactions as get_yfinance_insider_transactions, get_fundamentals as get_fundamentals_yfinance
-from .google import get_google_news
+from .google import get_google_news, get_google_global_news
 from .openai import get_stock_news_openai, get_global_news_openai, get_fundamentals_openai
 from .alpha_vantage import (
     get_stock as get_alpha_vantage_stock,
@@ -109,7 +109,7 @@ VENDOR_METHODS = {
     },
     "get_global_news": {
         "alpha_vantage": get_alpha_vantage_global_news,
-        "google": get_google_news,
+        "google": get_google_global_news,
         "local": get_reddit_global_news
     },
     "get_insider_sentiment": {
"get_insider_sentiment": {


@@ -254,7 +254,6 @@ class TradingAgentsGraph:
             raw_llm_decision,
             self.hard_data,
             regime_val,
-            regime_val,
             final_state.get("net_insider_flow", 0.0),
             final_state.get("portfolio", {})
         )