fix: correct Ollama URL normalization and add Azure OpenAI support
- Fix bug where a URL with a trailing slash (e.g., http://my-ollama/v1/) would incorrectly become http://my-ollama/v1/v1
- Add Azure OpenAI as a first-class LLM provider
- Use a DRY pattern for environment-variable fallbacks in AzureClient
- Address Gemini Code Assist review feedback
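The trailing-slash fix can be sketched as a small normalization helper. This is a minimal illustration of the behavior the message describes, not the project's actual code; the name `normalize_base_url` and the `"/v1"` suffix parameter are assumptions.

```python
def normalize_base_url(url: str, suffix: str = "/v1") -> str:
    """Strip trailing slashes and only append the API suffix if missing.

    Hypothetical helper: avoids turning "http://my-ollama/v1/" into
    "http://my-ollama/v1/v1" by checking for the suffix after stripping.
    """
    url = url.rstrip("/")
    if url.endswith(suffix):
        return url
    return url + suffix
```

With this shape, both `http://my-ollama` and `http://my-ollama/v1/` normalize to the same base URL.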
This commit is contained in:
parent
6bdea25402
commit
33600b8e6d
@@ -5,6 +5,7 @@ from .openai_client import OpenAIClient
 from .anthropic_client import AnthropicClient
 from .google_client import GoogleClient
+from .azure_client import AzureClient


 def create_llm_client(