Commit Graph

3 Commits

Godnight1006 aafc3fbe31
Fix #275: Make openai dataflow compatible with Gemini and OpenRouter
- Replace OpenAI-specific responses.create() API with standard chat.completions.create()
- Update get_stock_news_openai, get_global_news_openai, and get_fundamentals_openai
- Add comprehensive tests for OpenAI, Gemini, and OpenRouter compatibility
- All functions now use standard OpenAI-compatible chat completion API
- Fixes RuntimeError: All vendor implementations failed for method 'get_global_news'
- Fixes RuntimeError: All vendor implementations failed for method 'get_indicators'

The issue was that the openai vendor functions used OpenAI-specific API features
(responses.create with web_search_preview tools) that are not supported by
Gemini or OpenRouter. By switching to the standard chat completions API,
these functions now work with any OpenAI-compatible provider.
2025-11-14 11:38:56 +08:00
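The switch described in this commit can be illustrated with a minimal sketch. The base URLs, model name, and helper below are illustrative assumptions, not taken from the repository: the point is that a plain chat-completions payload is the lowest common denominator that OpenAI, Gemini's OpenAI-compatible endpoint, and OpenRouter all accept, whereas `responses.create()` with `web_search_preview` tools is OpenAI-only.

```python
# Hypothetical OpenAI-compatible endpoints; verify each against the
# provider's documentation before relying on them.
PROVIDER_BASE_URLS = {
    "openai": "https://api.openai.com/v1",
    "gemini": "https://generativelanguage.googleapis.com/v1beta/openai/",
    "openrouter": "https://openrouter.ai/api/v1",
}

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a payload for POST {base_url}/chat/completions, the
    standard chat-completions API shared by all three providers."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# With the official SDK, the same call then works for any provider
# simply by swapping base_url, e.g.:
#   client = OpenAI(base_url=PROVIDER_BASE_URLS["openrouter"], api_key=...)
#   client.chat.completions.create(**build_chat_request(model, prompt))
```

Because the payload carries no vendor-specific tool definitions, the same request shape avoids the "All vendor implementations failed" errors the commit mentions when a non-OpenAI provider is configured.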
Yijia Xiao 26c5ba5a78
Revert "Docker support and Ollama support (#47)" (#57)
This reverts commit 78ea029a0b.
2025-06-26 00:07:58 -04:00
Geeta Chauhan 78ea029a0b
Docker support and Ollama support (#47)
- Added support for running CLI and Ollama server via Docker
- Introduced tests for local embeddings model and standalone Docker setup
- Enabled conditional Ollama server launch via LLM_PROVIDER
2025-06-25 23:57:05 -04:00
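The "conditional Ollama server launch via LLM_PROVIDER" bullet can be sketched as a small startup check. The function name and exact semantics are assumptions for illustration; the commit's actual entrypoint logic (later reverted) may differ.

```python
import os

def ollama_serve_command(environ=os.environ):
    """Return the command to start a local Ollama server, or None
    when LLM_PROVIDER selects a remote provider. Hypothetical helper
    illustrating the conditional-launch idea from the commit."""
    if environ.get("LLM_PROVIDER", "").lower() == "ollama":
        return ["ollama", "serve"]
    return None

# In a Docker entrypoint one might then run:
#   cmd = ollama_serve_command()
#   if cmd:
#       subprocess.Popen(cmd)  # start Ollama before launching the CLI
```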