4 Commits

f41c5eccd2 (2026-04-04 20:22:27 +00:00)
feat: support OpenWebUI as Ollama gateway
- Connect via OpenWebUI API at https://llm.reifonas.cloud/api
- Use /api/v1/chat/completions format for OpenWebUI
- Keep native Ollama format as fallback
- Auto-detect models from both endpoints
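The gateway routing this commit describes (prefer the OpenWebUI endpoint, keep native Ollama as fallback) can be sketched as a pure request builder. This is an illustrative sketch, not the repository's code: the function and type names are invented here, and only the two endpoint paths come from the commit message.

```typescript
// Sketch of the routing from the commit: prefer the OpenWebUI
// OpenAI-compatible endpoint, fall back to native Ollama /api/chat.
// All identifiers (buildChatRequest, ChatTarget) are hypothetical.

type ChatTarget = "openwebui" | "ollama";

interface ChatMessage {
  role: string;
  content: string;
}

interface ChatRequest {
  url: string;
  body: { model: string; messages: ChatMessage[]; stream?: boolean };
}

function buildChatRequest(
  baseUrl: string,
  target: ChatTarget,
  model: string,
  messages: ChatMessage[],
): ChatRequest {
  if (target === "openwebui") {
    // Path taken from the commit message (OpenWebUI format).
    return { url: `${baseUrl}/api/v1/chat/completions`, body: { model, messages } };
  }
  // Native Ollama chat endpoint; stream: false asks for one JSON reply.
  return { url: `${baseUrl}/api/chat`, body: { model, messages, stream: false } };
}

const req = buildChatRequest(
  "https://llm.reifonas.cloud",
  "openwebui",
  "llama3.2-vision",
  [{ role: "user", content: "Olá" }],
);
console.log(req.url); // https://llm.reifonas.cloud/api/v1/chat/completions
```

Keeping the builder free of `fetch` makes the fallback logic trivially unit-testable: the caller tries the OpenWebUI request first and, on failure, rebuilds with `target: "ollama"`.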
075f6ae0bc (2026-04-04 19:51:34 +00:00)
feat: Ollama auto-detect without manual input
- Auto-detect Ollama endpoint from predefined URLs
- Try multiple common addresses (localhost, VPS IPs, cloud domain)
- One-click connect to Ollama without manual endpoint entry
- Visual feedback during detection
- Support for https://llm.reifonas.cloud
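The "try a list of known addresses" detection above can be sketched as a first-responder probe loop. The candidate list and probe mechanics are assumptions (the real app may probe other URLs or use a different health check); only `localhost` and the cloud domain are named in the commit.

```typescript
// Sketch of endpoint auto-detection: walk a candidate list and return
// the first URL whose probe succeeds. Candidates are illustrative.
const CANDIDATE_ENDPOINTS = [
  "http://localhost:11434", // Ollama's default local port
  "http://127.0.0.1:11434",
  "https://llm.reifonas.cloud", // cloud domain from the commit message
];

// The probe is injected so the loop can be tested without a live server.
type Probe = (url: string) => Promise<boolean>;

async function detectEndpoint(
  candidates: string[],
  probe: Probe,
): Promise<string | null> {
  for (const url of candidates) {
    try {
      if (await probe(url)) return url; // first responder wins
    } catch {
      // unreachable host: fall through to the next candidate
    }
  }
  return null; // nothing answered; UI can show "not detected"
}
```

A real probe might `GET ${url}/api/tags` (Ollama's model-list endpoint) with a short timeout and treat any 200 response as success; the injected-probe shape also makes the "visual feedback during detection" easy, since the UI can report each candidate as it is tried.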
a395f0d696 (2026-04-04 19:46:14 +00:00)
feat: add Ollama local provider support
- Added Ollama (local) as an AI provider option
- Configure VPS endpoint for Ollama connection
- Auto-detect available models from Ollama server
- Support for vision-capable models (llama3.2-vision, etc.)
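Detecting which of the server's models are vision-capable, as this commit describes, likely needs a name heuristic, since Ollama's model list does not directly flag multimodal support. The sketch below is an assumption about how that filter could work; the hint list and function names are invented here.

```typescript
// Sketch of splitting an Ollama model list into vision-capable and
// text-only groups. The name-based heuristic is an assumption.
interface OllamaModel {
  name: string; // e.g. "llama3.2-vision:11b" as returned by the server
}

// Substrings associated with known multimodal model families (assumed).
const VISION_HINTS = ["vision", "llava", "moondream", "bakllava"];

function isVisionModel(model: OllamaModel): boolean {
  const name = model.name.toLowerCase();
  return VISION_HINTS.some((hint) => name.includes(hint));
}

function splitByCapability(models: OllamaModel[]) {
  return {
    vision: models.filter(isVisionModel),
    textOnly: models.filter((m) => !isVisionModel(m)),
  };
}
```

With a list fetched from the server, `splitByCapability(models).vision` would drive the "vision-capable" section of the provider's model picker.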
97eb42c243 (2026-04-04 19:32:00 +00:00)
feat: multi-provider AI support with auto-detection
- Added support for Google Gemini, OpenAI, Anthropic, and Azure OpenAI
- Implemented API key validation with auto model detection
- Added Error Boundary for better error handling
- Migrated PDF generation to native jsPDF (better quality)
- Added PWA support with offline capabilities
- Implemented tests with Vitest
- Fixed language consistency (PT-BR)
- Improved accessibility (ARIA)
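"API key validation with auto model detection" typically means calling each provider's model-list endpoint: a 200 response both proves the key works and returns the available models. A minimal sketch of the per-provider request, assuming the public API shapes (Azure in particular needs a per-resource base URL and `api-version`, and the exact paths should be checked against each provider's docs):

```typescript
// Sketch: build the model-list request used to validate a key per provider.
// URLs and headers reflect the public APIs but are assumptions here, not
// the repository's actual implementation.
type Provider = "gemini" | "openai" | "anthropic" | "azure";

interface ListModelsCall {
  url: string;
  headers: Record<string, string>;
}

function modelListRequest(
  provider: Provider,
  apiKey: string,
  azureBase?: string, // e.g. https://my-resource.openai.azure.com (Azure only)
): ListModelsCall {
  switch (provider) {
    case "gemini":
      // Gemini accepts the key as a query parameter.
      return {
        url: `https://generativelanguage.googleapis.com/v1beta/models?key=${apiKey}`,
        headers: {},
      };
    case "openai":
      return {
        url: "https://api.openai.com/v1/models",
        headers: { Authorization: `Bearer ${apiKey}` },
      };
    case "anthropic":
      return {
        url: "https://api.anthropic.com/v1/models",
        headers: { "x-api-key": apiKey, "anthropic-version": "2023-06-01" },
      };
    case "azure":
      return {
        url: `${azureBase}/openai/models?api-version=2024-02-01`,
        headers: { "api-key": apiKey },
      };
  }
}
```

Issuing the request with `fetch` and treating a non-2xx status as "invalid key" gives both the validation and the model list in one round trip, which is presumably what the auto-detection in this commit does.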