Ollama + Open WebUI

A complete local LLM runtime (Ollama) paired with a ChatGPT-like web UI (Open WebUI). Supports dozens of open-source models (Llama 3, Mistral, Phi, Gemma, etc.) running entirely on your own hardware — no API keys, no cloud dependency. Ideal for privacy-conscious users and developers who want a self-hosted AI chat experience.

docker-compose.yml

```yaml
services:
  ollama:
    image: ollama/ollama:${OLLAMA_DOCKER_TAG:-latest}
    container_name: ollama
    tty: true
    restart: unless-stopped
    volumes:
      - ollama_data:/root/.ollama
  open-webui:
    image: ghcr.io/open-webui/open-webui:${WEBUI_DOCKER_TAG:-main}
    container_name: open-webui
    volumes:
      - open_webui_data:/app/backend/data
    depends_on:
      - ollama
    ports:
      - "${OPEN_WEBUI_PORT:-3000}:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - WEBUI_SECRET_KEY=${WEBUI_SECRET_KEY}
      - CORS_ALLOW_ORIGIN=${CORS_ALLOW_ORIGIN:-*}
      - OPENAI_API_KEY=${OPENAI_API_KEY:-}
      - OPENAI_API_BASE_URL=${OPENAI_API_BASE_URL:-}
      - ANONYMIZED_TELEMETRY=false
      - DO_NOT_TRACK=true
      - SCARF_NO_ANALYTICS=true
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
volumes:
  ollama_data:
    driver: local
  open_webui_data:
    driver: local
```
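Models are not downloaded at build time; once the stack is up, they are pulled through the `ollama` container (or from Open WebUI's admin settings). A minimal sketch — the model name `llama3` is only an example; any tag from the Ollama library works:

```shell
# Pull a model inside the running ollama container
docker exec -it ollama ollama pull llama3

# List the models now available to Open WebUI
docker exec -it ollama ollama list
```

Because Open WebUI reaches Ollama over the internal Compose network (`http://ollama:11434`), no extra port needs to be published for the API itself.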

.env example

```env
# Image tags
OLLAMA_DOCKER_TAG=latest
WEBUI_DOCKER_TAG=main

# Host port for the web UI
OPEN_WEBUI_PORT=3000

# Session signing secret (replace with a random value)
WEBUI_SECRET_KEY=change-this-to-a-random-secret-key
CORS_ALLOW_ORIGIN=*

# Optional: point the OpenAI-compatible client at Ollama's /v1 endpoint
OPENAI_API_KEY=sk-your-dummy-api-key-here
OPENAI_API_BASE_URL=http://ollama:11434/v1
```
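`WEBUI_SECRET_KEY` signs Open WebUI session tokens, so the placeholder should be replaced before first start. One way to generate a value, assuming `openssl` is installed (`python3 -c "import secrets; print(secrets.token_hex(32))"` works too):

```shell
# 32 random bytes, hex-encoded (64 characters)
SECRET=$(openssl rand -hex 32)
echo "WEBUI_SECRET_KEY=${SECRET}"
```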

Deployment

Quick Start

  1. Create a working directory named after the service.
  2. Copy the compose file and a `.env` file into that directory.
  3. Review the variables and replace placeholders with real values.
  4. Run `docker compose up -d`.
```shell
mkdir ollama-open-webui
cd ollama-open-webui
# create docker-compose.yml
# create .env
docker compose up -d
```
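The steps above can be collapsed into one script. This is a sketch, not the canonical setup: it writes a minimal `.env` with a freshly generated secret and leaves the compose invocation commented out, so it can be dry-run on a machine without Docker; the variable values are the same defaults shown earlier.

```shell
set -e
mkdir -p ollama-open-webui
cd ollama-open-webui

# docker-compose.yml is expected to be copied in from this README

# Write a minimal .env with a freshly generated secret
{
  echo "OLLAMA_DOCKER_TAG=latest"
  echo "WEBUI_DOCKER_TAG=main"
  echo "OPEN_WEBUI_PORT=3000"
  echo "WEBUI_SECRET_KEY=$(openssl rand -hex 32)"
} > .env

# docker compose up -d   # uncomment once docker-compose.yml is in place
```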