# Open WebUI — webui.corone.monster

Self-hosted LLM chat frontend. Independent of the ELC AI Agent — used for direct API access to Anthropic and OpenAI when ELC is overkill.
## Endpoint

- HTTPS: `https://webui.corone.monster`
- Local: `127.0.0.1:3010` (Docker port mapping)
- nginx: `/etc/nginx/sites-enabled/webui.corone.monster`
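A minimal sketch of what the vhost file likely contains — a reverse proxy from the HTTPS name to the loopback port above. The certificate paths and WebSocket headers are assumptions (typical certbot layout), not a copy of the real file:

```nginx
# /etc/nginx/sites-enabled/webui.corone.monster — illustrative sketch only
server {
    listen 443 ssl;
    server_name webui.corone.monster;

    # Assumed certbot paths — verify against the real vhost
    ssl_certificate     /etc/letsencrypt/live/webui.corone.monster/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/webui.corone.monster/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:3010;
        proxy_set_header Host $host;
        # WebSocket upgrade so streamed chat responses work through the proxy
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```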
## Deployment

Docker container, NOT systemd-only:

- Image: `ghcr.io/open-webui/open-webui:main`
- Container: `open-webui` (in `docker ps`)
- Compose file: `/var/www/openwebui/docker-compose.yml`
- Volume: `open-webui-data` → `/app/backend/data` (persistent)
- systemd wrapper: `openwebui.service` (manages the docker-compose lifecycle)
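A hedged sketch of what a docker-compose wrapper unit like `openwebui.service` typically looks like — the real unit may differ in paths and options:

```ini
# /etc/systemd/system/openwebui.service — illustrative sketch, not the real unit
[Unit]
Description=Open WebUI (docker-compose wrapper)
Requires=docker.service
After=docker.service

[Service]
Type=oneshot
RemainAfterExit=yes
WorkingDirectory=/var/www/openwebui
ExecStart=/usr/bin/docker compose up -d
ExecStop=/usr/bin/docker compose down

[Install]
WantedBy=multi-user.target
```

`Type=oneshot` with `RemainAfterExit=yes` is the usual pattern here: `docker compose up -d` returns immediately, and systemd treats the unit as active until `ExecStop` runs.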
## Port mapping

```yaml
ports:
  - "127.0.0.1:3010:8080"  # internal 8080 → host 3010
```

## Environment

- `ANTHROPIC_API_KEY` — Claude API access
- `OPENAI_API_KEY` — ChatGPT API access
- `ENABLE_OLLAMA_API=false` — no local GPU
- `WEBUI_AUTH=true` — first signup = admin
- `WEBUI_NAME=WebUI`
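Assembled from the port mapping and environment variables above, a sketch of what `/var/www/openwebui/docker-compose.yml` plausibly contains. Secrets are shown as `${...}` placeholders and the `restart` policy is an assumption:

```yaml
# /var/www/openwebui/docker-compose.yml — sketch from documented facts, not a copy
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "127.0.0.1:3010:8080"   # internal 8080 → host 3010, loopback only
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}   # placeholder, not the real value
      - OPENAI_API_KEY=${OPENAI_API_KEY}         # placeholder, not the real value
      - ENABLE_OLLAMA_API=false
      - WEBUI_AUTH=true
      - WEBUI_NAME=WebUI
    volumes:
      - open-webui-data:/app/backend/data
    restart: unless-stopped   # assumed policy

volumes:
  open-webui-data:
```

Binding to `127.0.0.1` keeps the container unreachable from outside the host; only nginx (terminating TLS for `webui.corone.monster`) can reach it.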
## Installed

2026-04-28.
## Use case vs ELC

| | ELC AI Agent | Open WebUI |
|---|---|---|
| Tools | full (file system, MCP, etc.) | none (chat only) |
| Sessions | persistent + workspace | conversation history |
| Models | Anthropic via Claude Code | Anthropic + OpenAI raw API |
| Skills | 58 installed | n/a |
| Use when | building / agentic work | quick chat, document Q&A |
## Cross-refs

- codex-mcp-integration — alternative GPT bridge (via Codex MCP, in-ELC)
- nginx-vhost-map — vhost routing