feat: daily digest consolidation and Google NotebookLM sync (#23)

* feat: daily digest consolidation and NotebookLM sync

- Add tradingagents/daily_digest.py: appends timestamped entries from
  analyze and scan runs into a single reports/daily/{date}/daily_digest.md
- Add tradingagents/notebook_sync.py: uploads digest to Google NotebookLM
  via nlm CLI, deleting the previous version before uploading (opt-in,
  skips silently if NOTEBOOK_ID is not set)
- Add get_digest_path() helper to report_paths.py
- Hook both analyze and scan CLI commands to append + sync after each run
- Add NOTEBOOK_ID to .env.example

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* docs: update agent memory for daily digest + NotebookLM sync

Update CURRENT_STATE, ARCHITECTURE, and COMPONENTS context files to
reflect the feat/daily-digest-notebooklm implementation.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: correct nlm CLI commands and env var name for NotebookLM sync

- Use nlm note list/create/update instead of source list/add/delete
- Parse notes from {"notes": [...]} response structure
- Rename NOTEBOOK_ID -> NOTEBOOKLM_ID in both code and .env.example
- Auto-discover nlm at ~/.local/bin/nlm when not in PATH
- Tested: create on first run, update on subsequent runs

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
ahmet guzererler, 2026-03-19 12:21:03 +01:00, committed by GitHub
parent 97ab49bb99, commit d4fefc804e
No known key found for this signature in database (GPG Key ID: B5690EEEBB952194)
8 changed files with 221 additions and 15 deletions

File: .env.example

@@ -6,6 +6,8 @@ ANTHROPIC_API_KEY=
XAI_API_KEY=
OPENROUTER_API_KEY=
NOTEBOOKLM_ID=e8fd4391-9cb2-43ff-b893-1316a52857b6
# ── Data Provider API Keys ───────────────────────────────────────────
ALPHA_VANTAGE_API_KEY=
# Free at https://finnhub.io — required for earnings/economic calendars and insider transactions
@@ -62,6 +64,10 @@ TRADINGAGENTS_MAX_DEBATE_ROUNDS=2
# TRADINGAGENTS_MAX_RISK_DISCUSS_ROUNDS=1 # risk analyst discussion rounds (15)
# TRADINGAGENTS_MAX_RECUR_LIMIT=100 # LangGraph recursion limit
# ── Google NotebookLM sync (optional) ────────────────────────────────
# Notebook ID for daily digest upload via the nlm CLI tool
# NOTEBOOKLM_ID=
# ── Data vendor routing ──────────────────────────────────────────────
# Category-level vendor selection (yfinance | alpha_vantage | finnhub)
# TRADINGAGENTS_VENDOR_CORE_STOCK_APIS=yfinance

File: cli/main.py

@@ -29,6 +29,8 @@ from rich.rule import Rule
from tradingagents.graph.trading_graph import TradingAgentsGraph
from tradingagents.report_paths import get_daily_dir, get_market_dir, get_ticker_dir
from tradingagents.daily_digest import append_to_digest
from tradingagents.notebook_sync import sync_to_notebooklm
from tradingagents.default_config import DEFAULT_CONFIG
from cli.models import AnalystType
from cli.utils import *
@@ -1173,6 +1175,14 @@ def run_analysis():
except Exception as e:
console.print(f"[red]Error saving report: {e}[/red]")
# Append to daily digest and sync to NotebookLM
digest_content = message_buffer.final_report or ""
if digest_content:
digest_path = append_to_digest(
selections["analysis_date"], "analyze", selections["ticker"], digest_content
)
sync_to_notebooklm(digest_path)
# Write observability log
log_dir = get_ticker_dir(selections["analysis_date"], selections["ticker"])
log_dir.mkdir(parents=True, exist_ok=True)
@@ -1269,6 +1279,12 @@ def run_scan(date: Optional[str] = None):
)
set_run_logger(None)
# Append to daily digest and sync to NotebookLM
macro_content = result.get("macro_scan_summary", "")
if macro_content:
digest_path = append_to_digest(scan_date, "scan", "Market Scan", macro_content)
sync_to_notebooklm(digest_path)
console.print(f"\n[green]Results saved to {save_dir}[/green]")

File: docs/agent/context/CURRENT_STATE.md

@@ -1,27 +1,23 @@
# Current Milestone
Report path unification complete. Observability logging (data sources, LLM calls, tool calls, token counts) is the active task. Next: `pipeline` CLI command.
Daily digest consolidation and Google NotebookLM sync shipped (PR open: `feat/daily-digest-notebooklm`). All analyses now append to a single `daily_digest.md` per day and auto-upload to NotebookLM via `nlm` CLI. Next: PR review and merge.
# Recent Progress
- **PR #22 merged**: Unified report paths, structured observability logging, memory system update
- **feat/daily-digest-notebooklm** (open PR): Daily digest consolidation + NotebookLM sync
- `tradingagents/daily_digest.py` — `append_to_digest()` appends timestamped entries to `reports/daily/{date}/daily_digest.md`
- `tradingagents/notebook_sync.py` — `sync_to_notebooklm()` creates or updates a `Daily Trading Digest` note via the `nlm` CLI (opt-in via `NOTEBOOKLM_ID` env var)
- `tradingagents/report_paths.py` — added `get_digest_path(date)`
- `cli/main.py` — `analyze` and `scan` commands both call digest + sync after each run
- `.env.example` — `NOTEBOOKLM_ID` added
- **PR #21 merged**: Memory system v2 — builder/reader skills, 5 context files, post-commit hook
- **PR #18 merged**: Opt-in vendor fallback — fail-fast by default, `FALLBACK_ALLOWED` whitelist for fungible data only (ADR 011)
- **PR #19 merged**: Merge conflict resolution after PR #18
- **Report path unification** (`80e174c`): All reports now written under `reports/daily/{date}/{ticker}/` for per-ticker analysis and `reports/daily/{date}/market/` for scanner output
- `pipeline` CLI command implemented — scan JSON → filter by conviction → per-ticker deep dive via `MacroBridge`
- `extract_json()` utility in `agents/utils/json_utils.py` handles DeepSeek R1 `<think>` blocks and markdown fences
- Memory builder and reader skills created in `.claude/skills/`
- Structured context files generated under `docs/agent/context/` (ARCHITECTURE, CONVENTIONS, COMPONENTS, TECH_STACK, GLOSSARY)
- **PR #18 merged**: Opt-in vendor fallback — fail-fast by default (ADR 011)
- 220+ offline tests passing
- 12 pre-existing test failures fixed across 5 files
# In Progress
- **Observability logging**: Structured logging for data source calls (vendor, endpoint, success/failure), LLM requests (model name, agent, token counts), and tool invocations (tool name, duration). Goal: understand what's being called, by whom, and at what cost per run.
# Planned Next
- Report path unification tests (verify new paths in integration tests)
- Awaiting `NOTEBOOKLM_ID` from user to enable end-to-end NotebookLM test
# Active Blockers

File: docs/agent/context/ARCHITECTURE.md

@@ -87,6 +87,7 @@ All generated artifacts live under `reports/daily/{YYYY-MM-DD}/`:
```
reports/
└── daily/{YYYY-MM-DD}/
├── daily_digest.md # consolidated daily report (all runs appended)
├── market/ # scan results (geopolitical_report.md, etc.)
├── {TICKER}/ # per-ticker analysis / pipeline
│ ├── 1_analysts/
@@ -95,10 +96,20 @@ reports/
└── summary.md # pipeline combined summary
```
Helper functions: `get_daily_dir()`, `get_market_dir()`, `get_ticker_dir()`, `get_eval_dir()`.
Helper functions: `get_daily_dir()`, `get_market_dir()`, `get_ticker_dir()`, `get_eval_dir()`, `get_digest_path()`.
Source: `tradingagents/report_paths.py`
## Daily Digest & NotebookLM Sync
After every `analyze` or `scan` run, the CLI:
1. Calls `append_to_digest(date, entry_type, label, content)` → appends a timestamped section to `reports/daily/{date}/daily_digest.md` (creates the file on first run)
2. Calls `sync_to_notebooklm(digest_path)` → creates a `Daily Trading Digest` note in the configured NotebookLM notebook, or updates it in place if one already exists, via the `nlm` CLI tool
`NOTEBOOKLM_ID` env var controls the target notebook. If unset, the sync step is silently skipped (opt-in).
Source: `tradingagents/daily_digest.py`, `tradingagents/notebook_sync.py`
## Observability
`RunLogger` accumulates structured events (JSON-lines) for a single run. Four event kinds: `llm` (model, agent, tokens in/out, latency), `tool` (tool name, args, success, latency), `vendor` (method, vendor, success, latency), `report` (path). Thread-safe via `_lock`.
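The event-accumulation pattern described above can be sketched as follows. This is an illustrative stand-in, not the real `RunLogger` in `tradingagents/observability.py`; class and field names here are assumptions:

```python
import json
import threading

class JsonlRunLogger:
    """Minimal thread-safe JSON-lines event accumulator (sketch)."""

    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._events: list[dict] = []

    def log(self, kind: str, **fields) -> None:
        # kind is one of "llm", "tool", "vendor", "report".
        with self._lock:
            self._events.append({"kind": kind, **fields})

    def dump(self) -> str:
        # One JSON object per line (JSON-lines).
        with self._lock:
            return "\n".join(json.dumps(e) for e in self._events)

logger = JsonlRunLogger()
logger.log("llm", model="deepseek-r1", agent="bull", tokens_in=512, tokens_out=256)
logger.log("tool", name="get_news", success=True, latency_ms=840)
lines = logger.dump().splitlines()
```

The lock makes concurrent `log()` calls from different agent threads safe, matching the `_lock` mentioned above.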

File: docs/agent/context/COMPONENTS.md

@@ -9,6 +9,8 @@ tradingagents/
├── __init__.py
├── default_config.py # All config keys, defaults, env var overrides
├── report_paths.py # Unified report path helpers (reports/daily/{date}/)
├── daily_digest.py # append_to_digest() — consolidates runs into daily_digest.md
├── notebook_sync.py # sync_to_notebooklm() — uploads digest to NotebookLM via nlm CLI
├── observability.py # RunLogger, _LLMCallbackHandler, structured event logging
├── agents/
│ ├── __init__.py

File: tradingagents/daily_digest.py

@@ -0,0 +1,46 @@
"""Daily digest consolidation.
Appends individual report entries (analyze or scan) into a single
``daily_digest.md`` file under ``reports/daily/{date}/``.
"""
from __future__ import annotations
from datetime import datetime
from pathlib import Path
from tradingagents.report_paths import get_digest_path
def append_to_digest(date: str, entry_type: str, label: str, content: str) -> Path:
"""Append a timestamped section to the daily digest file.
Parameters
----------
date:
Date string (YYYY-MM-DD) used to locate the digest file.
entry_type:
Category of the entry, e.g. ``"analyze"`` or ``"scan"``.
label:
Human-readable label, e.g. ticker symbol or ``"Market Scan"``.
content:
The report content to append.
Returns
-------
Path
The path to the digest file.
"""
digest_path = get_digest_path(date)
digest_path.parent.mkdir(parents=True, exist_ok=True)
existing = digest_path.read_text() if digest_path.exists() else ""
if not existing:
existing = f"# Daily Trading Report — {date}\n\n"
timestamp = datetime.now().strftime("%H:%M")
section = f"---\n### {timestamp}{label} ({entry_type})\n\n{content}\n\n"
digest_path.write_text(existing + section)
return digest_path
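The first-write/append behavior can be exercised standalone. This sketch mirrors the logic of `append_to_digest` above, with a temporary directory and the hypothetical helper name `append_entry` standing in for `get_digest_path` and the real function:

```python
from datetime import datetime
from pathlib import Path
import tempfile

def append_entry(digest_path: Path, date: str, entry_type: str,
                 label: str, content: str) -> None:
    # Create the file with a header on first write, then append a
    # timestamped section, as append_to_digest() does.
    existing = digest_path.read_text() if digest_path.exists() else ""
    if not existing:
        existing = f"# Daily Trading Report — {date}\n\n"
    stamp = datetime.now().strftime("%H:%M")
    existing += f"---\n### {stamp} — {label} ({entry_type})\n\n{content}\n\n"
    digest_path.write_text(existing)

with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "daily_digest.md"
    append_entry(path, "2026-03-19", "scan", "Market Scan", "Macro summary.")
    append_entry(path, "2026-03-19", "analyze", "NVDA", "Deep dive.")
    text = path.read_text()
```

The header is written exactly once; every subsequent run only appends a section, so a day's digest accumulates all runs in order.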

File: tradingagents/notebook_sync.py

@@ -0,0 +1,124 @@
"""Google NotebookLM sync via the ``nlm`` CLI tool (jacob-bd/notebooklm-mcp-cli).
Uploads the daily digest as a note to a NotebookLM notebook, updating the
existing note if one with the same title already exists. Entirely opt-in:
if no ``NOTEBOOK_ID`` is configured the function is a silent no-op.
"""
from __future__ import annotations
import json
import os
import shutil
import subprocess
from pathlib import Path
from rich.console import Console
console = Console()
_NOTE_TITLE = "Daily Trading Digest"
# Common install locations outside of PATH (e.g. pip install --user)
_FALLBACK_PATHS = [
Path.home() / ".local" / "bin" / "nlm",
Path("/usr/local/bin/nlm"),
]
def _find_nlm() -> str | None:
"""Return the path to the nlm binary, or None if not found."""
found = shutil.which("nlm")
if found:
return found
for p in _FALLBACK_PATHS:
if p.exists():
return str(p)
return None
def sync_to_notebooklm(digest_path: Path, notebook_id: str | None = None) -> None:
"""Upload *digest_path* content to Google NotebookLM as a note.
If a note titled ``Daily Trading Digest`` already exists it is updated
in-place; otherwise a new note is created.
Parameters
----------
digest_path:
Path to the digest markdown file to upload.
notebook_id:
NotebookLM notebook ID. Falls back to the ``NOTEBOOK_ID``
environment variable when *None*.
"""
if notebook_id is None:
notebook_id = os.environ.get("NOTEBOOKLM_ID")
if not notebook_id:
return # opt-in — silently skip when not configured
nlm = _find_nlm()
if not nlm:
console.print("[yellow]Warning: nlm CLI not found — skipping NotebookLM sync[/yellow]")
return
content = digest_path.read_text()
# Check for an existing note with the same title
existing_note_id = _find_note(nlm, notebook_id)
if existing_note_id:
_update_note(nlm, notebook_id, existing_note_id, content)
else:
_create_note(nlm, notebook_id, content)
def _find_note(nlm: str, notebook_id: str) -> str | None:
"""Return the note ID for the daily digest note, or None if not found."""
try:
result = subprocess.run(
[nlm, "note", "list", notebook_id, "--json"],
capture_output=True,
text=True,
)
if result.returncode != 0:
return None
data = json.loads(result.stdout)
notes = data.get("notes", data) if isinstance(data, dict) else data
for note in notes:
if isinstance(note, dict) and note.get("title") == _NOTE_TITLE:
return note.get("id") or note.get("noteId")
except (ValueError, KeyError, OSError):
pass
return None
def _create_note(nlm: str, notebook_id: str, content: str) -> None:
"""Create a new note in the notebook."""
try:
result = subprocess.run(
[nlm, "note", "create", notebook_id, "--title", _NOTE_TITLE, "--content", content],
capture_output=True,
text=True,
)
if result.returncode == 0:
console.print(f"[green]✓ Created NotebookLM note: {_NOTE_TITLE}[/green]")
else:
console.print(f"[yellow]Warning: nlm note create failed: {result.stderr.strip()}[/yellow]")
except OSError as exc:
console.print(f"[yellow]Warning: could not create NotebookLM note: {exc}[/yellow]")
def _update_note(nlm: str, notebook_id: str, note_id: str, content: str) -> None:
"""Update an existing note's content."""
try:
result = subprocess.run(
[nlm, "note", "update", notebook_id, note_id, "--content", content],
capture_output=True,
text=True,
)
if result.returncode == 0:
console.print(f"[green]✓ Updated NotebookLM note: {_NOTE_TITLE}[/green]")
else:
console.print(f"[yellow]Warning: nlm note update failed: {result.stderr.strip()}[/yellow]")
except OSError as exc:
console.print(f"[yellow]Warning: could not update NotebookLM note: {exc}[/yellow]")

File: tradingagents/report_paths.py

@@ -42,3 +42,8 @@ def get_ticker_dir(date: str, ticker: str) -> Path:

def get_eval_dir(date: str, ticker: str) -> Path:
    """``reports/daily/{date}/{TICKER}/eval/``"""
    return get_ticker_dir(date, ticker) / "eval"


def get_digest_path(date: str) -> Path:
    """``reports/daily/{date}/daily_digest.md``"""
    return get_daily_dir(date) / "daily_digest.md"