# Run and deploy your AI Studio app
This contains everything you need to run your app locally.
View your app in AI Studio: https://ai.studio/apps/drive/1QJyTnh0ssUreUbbVXIVSPagOQEsRajVV
## Run Locally
**Prerequisites:** Node.js
1. Install dependencies:
   `npm install`
2. Set the `GEMINI_API_KEY` in `.env.local` to your Gemini API key
3. Run the app:
   `npm run dev`
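
The three steps above can be run as a single shell sequence (a sketch; `your-api-key-here` is a placeholder you must replace with a real Gemini API key):

```shell
# Install project dependencies
npm install

# Create .env.local with your Gemini API key (placeholder shown)
echo 'GEMINI_API_KEY=your-api-key-here' > .env.local

# Start the local dev server
npm run dev
```

`.env.local` is typically gitignored, which keeps the API key out of version control.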