- Updated .env.example to include API keys and configuration for various LLM providers, including LM Studio.
- Modified cli/utils.py to implement a fetch_lmstudio_models function for dynamic model retrieval.
- Updated cli/main.py and cli/utils.py to support LM Studio as a provider option.
- Enhanced tradingagents/llm_clients to accommodate LM Studio in client creation and API key handling.
- Updated .gitignore to exclude results and reports directories.
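The dynamic model retrieval mentioned above can be sketched against LM Studio's OpenAI-compatible REST API. The `fetch_lmstudio_models` name comes from the commit; the default base URL, the `parse_model_ids` helper, and the empty-list fallback on connection failure are assumptions about how the CLI might degrade gracefully when no local server is running:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

# LM Studio's default local endpoint (assumption; configurable in the app).
LMSTUDIO_BASE_URL = "http://localhost:1234/v1"

def parse_model_ids(payload: dict) -> list[str]:
    """Extract model ids from an OpenAI-style /v1/models response."""
    return [m["id"] for m in payload.get("data", [])]

def fetch_lmstudio_models(base_url: str = LMSTUDIO_BASE_URL) -> list[str]:
    """Query a running LM Studio server for its available models.

    Returns an empty list when the server is unreachable, so the CLI
    can fall back to manual model entry instead of crashing.
    """
    try:
        with urlopen(f"{base_url}/models", timeout=5) as resp:
            return parse_model_ids(json.load(resp))
    except (URLError, OSError, ValueError):
        return []
```

Keeping the JSON parsing separate from the network call makes the function easy to unit-test without a live LM Studio instance.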
- Added .env.example file with API key placeholders
- Updated README.md with .env file setup instructions
- Added dotenv loading in main.py for environment variables
🤖 Generated with [Claude Code](https://claude.ai/code)
Co-Authored-By: Claude <noreply@anthropic.com>
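The dotenv loading in main.py most likely uses python-dotenv's `load_dotenv()`. As a self-contained illustration of what that call does, a minimal stdlib-only equivalent might look like this (the function name and quote-stripping behavior are assumptions, not the project's actual code):

```python
import os
from pathlib import Path

def load_env_file(path: str = ".env") -> None:
    """Minimal stand-in for python-dotenv's load_dotenv():
    read KEY=VALUE lines and export keys not already set,
    skipping blank lines and # comments.
    """
    env_path = Path(path)
    if not env_path.is_file():
        return
    for line in env_path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Real environment variables take precedence over the file.
        os.environ.setdefault(key.strip(), value.strip().strip('"').strip("'"))
```

In the actual entry point, `from dotenv import load_dotenv; load_dotenv()` at the top of main.py achieves the same effect with edge cases handled by the library.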
- Added support for running CLI and Ollama server via Docker
- Introduced tests for the local embeddings model and the standalone Docker setup
- Enabled conditional Ollama server launch based on the LLM_PROVIDER environment variable
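The conditional launch above can be sketched as a small guard in the startup path. The `LLM_PROVIDER` variable comes from the commit; the `maybe_start_ollama` name, the provider value `"ollama"`, and the plain `ollama serve` invocation are assumptions about the container's entrypoint:

```python
import os
import shutil
import subprocess
from typing import Optional

def maybe_start_ollama() -> Optional[subprocess.Popen]:
    """Start a local Ollama server only when LLM_PROVIDER selects it.

    Returns the server process handle, or None when another provider
    (e.g. a remote API) is configured and no local server is needed.
    """
    if os.environ.get("LLM_PROVIDER", "").lower() != "ollama":
        return None
    if shutil.which("ollama") is None:
        raise RuntimeError("LLM_PROVIDER=ollama but the ollama binary is not on PATH")
    # Run the server in the background; the caller owns the handle.
    return subprocess.Popen(["ollama", "serve"])
```

Gating the launch this way keeps the same Docker image usable for both local (Ollama/LM Studio) and hosted providers without paying the server's startup cost when it is unused.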