a395f0d696 feat: add Ollama local provider support (admtracksteel, 2026-04-04 19:46:14 +00:00)
- Added Ollama (local) as an AI provider option
- Configured a VPS endpoint for the Ollama connection
- Auto-detect available models from the Ollama server
- Support for vision-capable models (llama3.2-vision, etc.)
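The model auto-detection mentioned in the commit notes could be sketched as follows. This is an illustrative sketch, not the app's actual code: it assumes Ollama's standard `GET /api/tags` model-listing endpoint, and the name-based vision check (`detectModels`, `isVisionModel`, and the default `http://localhost:11434` base URL are illustrative choices, with the base URL replaceable by your VPS endpoint).

```typescript
// Sketch: detect available models from an Ollama server.
// Ollama's /api/tags endpoint returns { models: [{ name: string, ... }] }.

interface OllamaTagsResponse {
  models: { name: string }[];
}

// Heuristic (an assumption, not necessarily the app's logic): treat models
// whose name contains "vision" as vision-capable, e.g. "llama3.2-vision".
function isVisionModel(name: string): boolean {
  return name.toLowerCase().includes("vision");
}

// Extract plain model names from the /api/tags payload.
function parseModelNames(payload: OllamaTagsResponse): string[] {
  return payload.models.map((m) => m.name);
}

// Query the server; the default base URL is Ollama's standard local port,
// and would be swapped for the configured VPS endpoint.
async function detectModels(
  base = "http://localhost:11434"
): Promise<string[]> {
  const res = await fetch(`${base}/api/tags`);
  if (!res.ok) throw new Error(`Ollama server returned ${res.status}`);
  return parseModelNames((await res.json()) as OllamaTagsResponse);
}
```

A caller could then filter `detectModels()` results through `isVisionModel` to decide which models to offer for image inputs.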


Run and deploy your AI Studio app

This repository contains everything you need to run the app locally.

View your app in AI Studio: https://ai.studio/apps/drive/1QJyTnh0ssUreUbbVXIVSPagOQEsRajVV

Run Locally

Prerequisites: Node.js

  1. Install dependencies: npm install
  2. Set the GEMINI_API_KEY in .env.local to your Gemini API key
  3. Run the app: npm run dev
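For step 2, a minimal .env.local might look like this (the value shown is a placeholder; substitute your own key):

```
# .env.local (project root)
GEMINI_API_KEY=your-gemini-api-key
```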
Languages
TypeScript 94.7%
HTML 2.2%
Python 1.3%
CSS 0.8%
JavaScript 0.6%
Other 0.4%