# Run and deploy your AI Studio app
This contains everything you need to run your app locally.
View your app in AI Studio: https://ai.studio/apps/drive/1QJyTnh0ssUreUbbVXIVSPagOQEsRajVV
## Run Locally
**Prerequisites:** Node.js

1. Install dependencies:
   `npm install`
2. Set the `GEMINI_API_KEY` in `.env.local` to your Gemini API key
3. Run the app:
   `npm run dev`
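Step 2 refers to a local environment file at the project root. A minimal `.env.local` sketch is shown below — the value is a placeholder, not a real key, and this file should never be committed:

```shell
# .env.local — read by the dev server at startup; keep out of version control
GEMINI_API_KEY=your-gemini-api-key-here
```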