Fixed Issues:
- Resolve 404 error when using DashScope LLM with OpenAI embeddings
- Add missing DashScope model options in CLI selection
- Improve embedding provider fallback logic
Memory System Updates:
- Add DashScope embedding support (text-embedding-v3)
- Implement intelligent fallback: DashScope → OpenAI → Error
- Add proper API key validation and error messages
- Support multiple embedding providers based on LLM selection
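The fallback chain above (DashScope → OpenAI → Error) can be sketched as a small provider-selection helper. This is an illustrative sketch, not the actual implementation: the function name, the key-based detection, and the OpenAI model name `text-embedding-3-small` are assumptions; only `text-embedding-v3` for DashScope comes from the notes above.

```python
import os

def get_embedding_model(llm_provider: str) -> tuple[str, str]:
    """Pick an embedding backend based on the selected LLM provider.

    Hypothetical sketch of the fallback: prefer DashScope when it was
    chosen as the LLM and its key is set, otherwise fall back to OpenAI,
    otherwise raise with a clear message.
    """
    if llm_provider == "dashscope" and os.getenv("DASHSCOPE_API_KEY"):
        return "dashscope", "text-embedding-v3"
    if os.getenv("OPENAI_API_KEY"):
        # OpenAI model name is an assumption for illustration
        return "openai", "text-embedding-3-small"
    raise RuntimeError(
        "No embedding provider available: "
        "set DASHSCOPE_API_KEY or OPENAI_API_KEY"
    )
```

Validating keys up front like this is what turns a cryptic 404 from a mismatched provider/endpoint pair into an actionable error message.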
CLI Enhancements:
- Add DashScope models to SHALLOW_AGENT_OPTIONS and DEEP_AGENT_OPTIONS
- Support qwen-turbo, qwen-plus, qwen-max, qwen-max-longcontext
- Fix KeyError when selecting DashScope as LLM provider
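The KeyError fix likely comes down to registering DashScope entries in both option tables and failing gracefully on unknown providers. A minimal sketch, assuming the tables map provider name to model list (the dict shape, the lookup helper, and the non-DashScope model names are assumptions; the DashScope model names are from the list above):

```python
# Hypothetical shape of the CLI model tables; only the DashScope
# entries are taken from the changelog, the rest is illustrative.
SHALLOW_AGENT_OPTIONS = {
    "openai": ["gpt-4o-mini"],
    "dashscope": ["qwen-turbo", "qwen-plus"],
}
DEEP_AGENT_OPTIONS = {
    "openai": ["o4-mini"],
    "dashscope": ["qwen-max", "qwen-max-longcontext"],
}

def models_for(provider: str, deep: bool = False) -> list[str]:
    """Return selectable models, raising a clear error instead of a bare KeyError."""
    table = DEEP_AGENT_OPTIONS if deep else SHALLOW_AGENT_OPTIONS
    try:
        return table[provider]
    except KeyError:
        raise ValueError(f"Unknown LLM provider: {provider}") from None
```

Converting the bare `KeyError` into a `ValueError` with the provider name gives the CLI a message it can surface directly to the user.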
✅ Test Results:
- All .env configuration tests pass
- DashScope embeddings working correctly
- Proper fallback to OpenAI when DashScope unavailable
- Memory initialization successful with real API keys
Now users can successfully:
1. Select DashScope in the CLI without errors
2. Use DashScope embeddings for the memory system
3. Fall back automatically if DashScope is unavailable
4. Run the complete end-to-end DashScope workflow
- Add 18 new feature files from Chinese version
- Support for Chinese market data (A-shares)
- Database integration with MongoDB
- Advanced caching system with adaptive strategies
- LLM adapters for DashScope and other providers
- API services and real-time data utilities
- Enhanced configuration management
- Comprehensive English documentation
New features:
- Chinese finance data aggregation
- TDX (TongDaXin) API integration
- Optimized China stock data provider
- Adaptive and integrated caching
- Database cache management
- Stock data services
- Real-time news utilities
Breaking changes: None (all new features are additive)
Dependencies: Added pymongo, beautifulsoup4, dashscope (optional)
For detailed information, see MERGE_SUMMARY.md
- Added support for running CLI and Ollama server via Docker
- Introduced tests for local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
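The conditional Ollama launch could live in the container entrypoint, gated on `LLM_PROVIDER`. A sketch under stated assumptions (the function name and the echo messages are illustrative; the real entrypoint would background `ollama serve` and then exec the CLI):

```shell
#!/bin/sh
# Hypothetical entrypoint fragment: only start a local Ollama server
# when the user selected "ollama" as the LLM provider.
maybe_start_ollama() {
    if [ "$LLM_PROVIDER" = "ollama" ]; then
        echo "starting ollama server"
        # ollama serve &   # real launch, backgrounded before the CLI starts
    else
        echo "skipping ollama server"
    fi
}
```

Gating on an environment variable keeps a single image usable for both cloud-LLM and local-model setups without paying Ollama's startup cost when it is not needed.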