Environment Configuration Localization:
- Translate all Chinese comments to English
- Maintain same configuration structure and values
- Update project references from TradingAgents-CN to TradingAgents
- Add enhanced usage instructions and a quick start guide
Key Changes:
- Header: TradingAgents Environment Variables Configuration Example
- API Keys: Clear English descriptions for all providers (see the sketch after this list)
- Database: English explanations for MongoDB and Redis configuration
- Reddit API: English comments for social media integration
- Usage Instructions: Comprehensive English setup guide
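For illustration, a sketch of what the translated header and primary key entries might look like. The comment wording and placeholder values are illustrative; only DASHSCOPE_API_KEY and FINNHUB_API_KEY are variable names confirmed in this change:

```bash
# TradingAgents Environment Variables Configuration Example
# Copy this file to .env and fill in your own values.

# DashScope API key - primary Chinese LLM provider
DASHSCOPE_API_KEY=your_dashscope_api_key_here

# Finnhub API key - financial market data
FINNHUB_API_KEY=your_finnhub_api_key_here
```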
Enhanced Documentation:
- Added Quick Start Guide section
- Separate instructions for minimal vs. full-feature setup
- Clear step-by-step configuration process
- Docker commands for database setup (example after this list)
- Dependency installation instructions
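As an example, the database setup steps in the guide could use the standard Docker images; the container names below are illustrative, and the ports are the MongoDB and Redis defaults:

```bash
# Start MongoDB for persistent storage (optional)
docker run -d --name tradingagents-mongo -p 27017:27017 mongo:latest

# Start Redis for caching (optional)
docker run -d --name tradingagents-redis -p 6379:6379 redis:latest
```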
Now .env.example provides clear English guidance for:
- International users and developers
- API key configuration and sources
- Database setup (optional)
- Quick start vs. full-feature deployment
- Testing and validation steps
This makes the project more accessible to global users
while maintaining all functionality and configuration options.
- Add .env.example with comprehensive configuration template
- Add .env with sanitized default values (no real API keys)
- Based on the Chinese version's configuration, with English comments
- Includes all necessary API configurations (sketched after this list):
* DashScope (Chinese LLM)
* Finnhub (Financial data)
* OpenAI, Google AI, Anthropic (Optional LLMs)
* Reddit API (Social sentiment)
* MongoDB and Redis (Database/Cache)
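For reference, a hedged sketch of the remaining entries; all variable names here are assumptions based on common conventions and may differ from the actual template:

```bash
# Optional LLM providers
OPENAI_API_KEY=
GOOGLE_API_KEY=
ANTHROPIC_API_KEY=

# Reddit API (social sentiment)
REDDIT_CLIENT_ID=
REDDIT_CLIENT_SECRET=
REDDIT_USER_AGENT=

# MongoDB and Redis (database/cache)
MONGODB_HOST=localhost
MONGODB_PORT=27017
REDIS_HOST=localhost
REDIS_PORT=6379
```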
Configuration features:
- Clear instructions for each API key
- Database enable/disable switches (illustrated below)
- Comprehensive comments and usage guide
- Security warnings about sensitive data
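As an illustration of the enable/disable switches, the template might expose boolean flags like these (the exact variable names are assumptions, not confirmed from the file):

```bash
# Toggle optional database features off by default
MONGODB_ENABLED=false
REDIS_ENABLED=false
```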
Usage:
1. Copy .env.example to .env
2. Fill in actual API keys
3. Configure at minimum: DASHSCOPE_API_KEY and FINNHUB_API_KEY
4. Run: python main.py
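In shell terms, the four steps above amount to roughly the following (the two keys named in step 3 are the required minimum):

```bash
cp .env.example .env
# Edit .env and set at minimum:
#   DASHSCOPE_API_KEY=...
#   FINNHUB_API_KEY=...
python main.py
```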
- Added support for running CLI and Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
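A minimal sketch of the conditional launch, assuming a shell entrypoint checks LLM_PROVIDER and uses the public ollama/ollama image on its default port; the actual compose/entrypoint wiring may differ:

```bash
# Start the Ollama server only when a local provider is selected
if [ "$LLM_PROVIDER" = "ollama" ]; then
    docker run -d --name ollama -p 11434:11434 ollama/ollama
fi
```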