feat: support OpenWebUI as Ollama gateway
- Connect via OpenWebUI API at https://llm.reifonas.cloud/api
- Use /api/v1/chat/completions format for OpenWebUI
- Keep native Ollama format as fallback
- Auto-detect models from both endpoints
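The message above describes two wire formats: the OpenAI-compatible chat-completions format for OpenWebUI, with the native Ollama chat format kept as a fallback. A minimal sketch of how a client might route between the two — all names (`Backend`, `buildChatRequest`) are illustrative, not taken from the repository:

```typescript
// Hypothetical sketch of per-backend request routing. The OpenWebUI path
// (/api/v1/chat/completions) is the one named in the commit message; the
// fallback uses Ollama's native /api/chat endpoint.

type Backend = "openwebui" | "ollama";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  url: string;
  body: Record<string, unknown>;
}

function buildChatRequest(
  baseUrl: string,
  backend: Backend,
  model: string,
  messages: ChatMessage[],
): ChatRequest {
  const base = baseUrl.replace(/\/+$/, ""); // normalize trailing slashes
  if (backend === "openwebui") {
    // OpenWebUI gateway: OpenAI-compatible chat-completions format.
    return {
      url: `${base}/api/v1/chat/completions`,
      body: { model, messages, stream: false },
    };
  }
  // Fallback: native Ollama chat endpoint.
  return {
    url: `${base}/api/chat`,
    body: { model, messages, stream: false },
  };
}
```

Both payloads happen to share the `{ model, messages, stream }` shape here; in practice the two backends differ mainly in URL path and response schema, so a real client would also branch when parsing the reply.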
@@ -7,6 +7,7 @@ export const OLLAMA_AUTO_DETECT_URLS = [
   'http://127.0.0.1:11434',
   'http://192.168.1.100:11434',
   'http://10.0.0.1:11434',
   'http://10.0.1.1:11434',
+  'https://llm.reifonas.cloud',
   'http://ollama:11434',
   'http://host.docker.internal:11434',
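The auto-detect pass implied by `OLLAMA_AUTO_DETECT_URLS` can be sketched as probing each candidate base URL until one answers. A hedged sketch: Ollama's native model-list endpoint is `/api/tags`; the OpenWebUI probe path here (`/api/v1/models`) is an assumption inferred from the commit message, not confirmed from the repository.

```typescript
// Illustrative auto-detect loop: return the first candidate base URL
// that answers on either the native Ollama tags endpoint or the
// (assumed) OpenWebUI models endpoint. Requires Node 18+ for global
// fetch and AbortSignal.timeout.
async function detectBase(candidates: string[]): Promise<string | null> {
  for (const base of candidates) {
    for (const path of ["/api/tags", "/api/v1/models"]) {
      try {
        const res = await fetch(`${base}${path}`, {
          signal: AbortSignal.timeout(2000), // don't hang on dead hosts
        });
        if (res.ok) return base;
      } catch {
        // candidate unreachable or timed out; keep probing
      }
    }
  }
  return null; // nothing answered
}
```

A caller would pass `OLLAMA_AUTO_DETECT_URLS` and fall back to a configured default when `null` comes back.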