Open WebUI
Why Open WebUI?
Quick Deploy on CLORE.AI
Image: ghcr.io/open-webui/open-webui:cuda
Ports: 22/tcp, 8080/http

Accessing Your Service
Verify It's Working
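Once the container is running, a quick check from any machine is to hit the web UI's health route. This is a minimal sketch: the host and port placeholders stand for the address and the mapped 8080/http port shown in your CLORE.AI rental details, and the exact JSON returned may vary by version.

```bash
# Replace <host> and <port> with the address and mapped 8080/http port
# from your CLORE.AI rental details.
curl -s "http://<host>:<port>/health"
# A healthy instance typically answers with a small JSON status object,
# e.g. {"status":true}
```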
Installation
With Ollama (Recommended)
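If Ollama already runs on the same host, the container only needs network access to it. The command below follows the upstream Open WebUI quick-start; the volume name, container name, and published port are just examples you can change.

```bash
# Run Open WebUI against an Ollama instance on the Docker host.
# host.docker.internal lets the container reach the host's Ollama on port 11434.
docker run -d \
  -p 3000:8080 \
  --gpus all \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:cuda
```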
All-in-One (Bundled Ollama)
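The :ollama image bundles Ollama inside the same container, so a single docker run gives you both the model backend and the UI. A sketch based on the upstream quick-start, with GPU passthrough enabled:

```bash
# Single container with Open WebUI + bundled Ollama, using all GPUs.
docker run -d \
  -p 3000:8080 \
  --gpus=all \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:ollama
```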
First Setup
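After creating the first admin account in the browser, you need at least one model before you can chat. With the bundled :ollama image you can pull one straight through the container; the container name and model tag below are only examples carried over from the command above.

```bash
# Pull a model into the bundled Ollama instance.
docker exec -it open-webui ollama pull llama3.1:8b

# Confirm it is available locally.
docker exec -it open-webui ollama list
```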
Features
Chat Interface
Model Management
RAG (Document Chat)
User Management
Configuration
Environment Variables
Key Settings
| Variable | Description | Default |
| --- | --- | --- |
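Settings are passed to the container with -e flags at start time (or through the CLORE.AI template's environment fields). A hedged sketch showing a few commonly used Open WebUI variables; check the upstream documentation for the full list and exact defaults.

```bash
# OLLAMA_BASE_URL – where Open WebUI reaches the Ollama API
# ENABLE_SIGNUP   – set to false to block self-registration once the admin exists
# WEBUI_NAME      – branding string shown in the UI
docker run -d \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -e ENABLE_SIGNUP=false \
  -e WEBUI_NAME="My CLORE WebUI" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:cuda
```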
Connect to Remote Ollama
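If Ollama runs on a different machine (for example another CLORE rental), point OLLAMA_BASE_URL at that address. The IP below is a documentation placeholder; confirm the remote API answers before recreating the container.

```bash
# First confirm the remote Ollama API is reachable from this machine.
curl -s http://203.0.113.10:11434/api/tags

# Then start (or recreate) Open WebUI pointing at it.
docker run -d \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://203.0.113.10:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:cuda
```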
Docker Compose
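For a reproducible two-service setup, both Ollama and Open WebUI can be declared in a compose file. This is only a sketch: the service names, volume names, and GPU reservation are assumptions, and writing the file from a heredoc keeps the example copy-pasteable.

```bash
# Write a minimal docker-compose.yml for Ollama + Open WebUI, then start the stack.
cat > docker-compose.yml <<'EOF'
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
  open-webui:
    image: ghcr.io/open-webui/open-webui:cuda
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
volumes:
  ollama:
  open-webui:
EOF

docker compose up -d
```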
API Reference
| Endpoint | Method | Description |
| --- | --- | --- |
| /health | GET | Check Health |
| /api/version | GET | Get Version |
| /ollama/api/tags | GET | List Models (via Ollama proxy) |
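The same endpoints can be exercised with curl. The paths follow the upstream Open WebUI API; the Ollama proxy route requires an API key, which you can generate in the UI under your account settings (the key below is a placeholder).

```bash
BASE="http://<host>:<port>"

# Check Health – small JSON status object.
curl -s "$BASE/health"

# Get Version – reports the running Open WebUI version.
curl -s "$BASE/api/version"

# List Models via the Ollama proxy – authenticated route.
curl -s -H "Authorization: Bearer <your-api-key>" "$BASE/ollama/api/tags"
```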
Tips
Faster Responses
Better Quality
Save Resources
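In practice these tips mostly come down to which models you keep around: smaller quantized models respond faster and fit lower-VRAM GPUs, larger ones trade latency and memory for quality, and unused models can be removed to free disk space. The model tags are examples only, run through the bundled :ollama container from the earlier examples.

```bash
# Faster responses: a small quantized model that fits in a few GB of VRAM.
docker exec -it open-webui ollama pull llama3.2:3b

# Better quality: a larger model, at the cost of VRAM and latency.
docker exec -it open-webui ollama pull llama3.1:70b

# Save resources: see what is loaded, and remove models you no longer use.
docker exec -it open-webui ollama ps
docker exec -it open-webui ollama rm llama3.1:70b
```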
GPU Requirements
Troubleshooting
Can't connect to Ollama
Models not showing
Slow performance
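A few quick checks narrow most of these issues down: container logs for connection errors, a direct request to the Ollama API the UI is configured against, and GPU visibility for performance problems. These commands assume the bundled :ollama image and the container name used in the examples above.

```bash
# Can't connect to Ollama: check the UI logs for connection errors...
docker logs --tail 100 open-webui
# ...and confirm Ollama itself answers on the Docker host.
curl -s http://localhost:11434/api/tags

# Models not showing: make sure at least one model has actually been pulled.
docker exec -it open-webui ollama list

# Slow performance: confirm the container can actually see the GPU.
docker exec -it open-webui nvidia-smi
```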
Cost Estimate
| Setup | GPU | Hourly |
| --- | --- | --- |
Next Steps