feat: support OpenWebUI as Ollama gateway

- Connect via OpenWebUI API at https://llm.reifonas.cloud/api
- Use /api/v1/chat/completions format for OpenWebUI
- Keep native Ollama format as fallback
- Auto-detect models from both endpoints
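
The dual-format behavior described above can be sketched roughly as follows. This is a minimal illustration, not the code from this commit: `buildChatRequest` and the `Backend` type are hypothetical names, and only the two endpoint paths (`/api/v1/chat/completions` for OpenWebUI, `/api/chat` for native Ollama) are taken from the commit message and public API docs.

```typescript
// Hypothetical sketch: pick the request URL and payload shape per backend.
type Backend = "openwebui" | "ollama";

interface ChatRequest {
  url: string;
  body: Record<string, unknown>;
}

function buildChatRequest(
  baseUrl: string,
  backend: Backend,
  model: string,
  prompt: string,
): ChatRequest {
  const messages = [{ role: "user", content: prompt }];
  if (backend === "openwebui") {
    // OpenWebUI exposes an OpenAI-compatible chat completions route.
    return {
      url: `${baseUrl}/api/v1/chat/completions`,
      body: { model, messages },
    };
  }
  // Native Ollama chat endpoint kept as the fallback path.
  return {
    url: `${baseUrl}/api/chat`,
    body: { model, messages, stream: false },
  };
}
```

Keeping both shapes behind one builder means the caller only switches the `backend` flag when the gateway is detected.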
2026-04-04 20:22:27 +00:00
parent 075f6ae0bc
commit f41c5eccd2
3 changed files with 25 additions and 8 deletions


@@ -7,6 +7,7 @@ export const OLLAMA_AUTO_DETECT_URLS = [
   'http://127.0.0.1:11434',
   'http://192.168.1.100:11434',
   'http://10.0.0.1:11434',
   'http://10.0.1.1:11434',
+  'https://llm.reifonas.cloud',
   'http://ollama:11434',
   'http://host.docker.internal:11434',
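
Auto-detection over a URL list like the one above might look like the sketch below. This is an assumption about the approach, not this repo's implementation: `detectBaseUrl` and the injected `probe` callback are illustrative names, with `probe` injected so the candidate-scanning logic stays testable without a live server.

```typescript
// Hypothetical sketch: return the first candidate URL whose probe succeeds.
async function detectBaseUrl(
  candidates: string[],
  probe: (url: string) => Promise<boolean>,
): Promise<string | null> {
  for (const url of candidates) {
    try {
      if (await probe(url)) return url;
    } catch {
      // Unreachable host: fall through to the next candidate.
    }
  }
  return null;
}
```

In practice `probe` would issue a short-timeout request against each base URL (e.g. a model-list endpoint) and report whether it answered.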