Local Docker Compose

This project now includes a clean local Docker deployment layout under deploy/.

What is included

  • Dockerfile — local/runtime-oriented image build
  • deploy/ohmycaptcha/compose.yml — application service
  • deploy/ohmycaptcha/.env.example — app environment template
  • deploy/caddy/compose.yml — reverse proxy service
  • deploy/caddy/Caddyfile — HTTPS reverse proxy
  • deploy/caddy/.env.example — Caddy environment template

Use two separate Compose projects:

  • ohmycaptcha for the application
  • caddy for TLS termination and public ingress

They communicate through a shared external Docker network.
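In Compose terms, each project attaches its service to that network by declaring it external. A minimal sketch of the wiring (the service name, build context, and the `EDGE_NETWORK` default are assumptions; check the actual compose.yml files):

```yaml
# Sketch of the network wiring in deploy/ohmycaptcha/compose.yml
services:
  ohmycaptcha:
    build: ../..            # assumed build context; see the real file
    networks:
      - edge

networks:
  edge:
    name: ${EDGE_NETWORK:-caddy_public}
    external: true          # created out-of-band with `docker network create`
```

Because the network is `external`, Compose never creates or tears it down; both projects can come and go independently while the network persists.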

1. Create the shared network

docker network create caddy_public

2. Start Caddy

cd deploy/caddy
cp .env.example .env

Set DOMAIN and EMAIL in .env before starting. Override EDGE_NETWORK if you do not want to use caddy_public. Then:

docker compose up -d
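The shipped Caddyfile will differ in detail, but a reverse proxy driven by DOMAIN and EMAIL typically looks something like the sketch below (the upstream service name and port are assumptions):

```
{
    email {$EMAIL}   # used for ACME / Let's Encrypt registration
}

{$DOMAIN} {
    reverse_proxy ohmycaptcha:8000
}
```

Caddy resolves `{$VAR}` placeholders from its process environment, which is why the values only need to live in deploy/caddy/.env.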

3. Start OhMyCaptcha

cd deploy/ohmycaptcha
cp .env.example .env

Before starting, set at least:

  • CLIENT_KEY
  • LOCAL_BASE_URL
  • LOCAL_API_KEY
  • LOCAL_MODEL
  • CLOUD_BASE_URL
  • CLOUD_API_KEY
  • CLOUD_MODEL

Then build and start:

docker compose up -d --build

4. Verify the service

Directly against the app:

curl http://127.0.0.1:8000/api/v1/health

Through Caddy, once DNS points at the host:

curl https://your-domain.example/api/v1/health
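Right after docker compose up the app may still be booting, so a one-shot curl can fail spuriously. A small retry helper makes the check reliable; this is an illustrative sketch, not part of the repo:

```shell
#!/bin/sh
# wait_for TRIES DELAY CMD...: run CMD until it succeeds, retrying up to
# TRIES times with DELAY seconds between attempts; fail if it never does.
wait_for() {
  tries=$1; delay=$2; shift 2
  n=0
  until "$@"; do
    n=$((n + 1))
    [ "$n" -ge "$tries" ] && return 1
    sleep "$delay"
  done
}

# Usage against the health endpoint (commented out so the sketch is inert):
#   wait_for 30 2 curl -fsS http://127.0.0.1:8000/api/v1/health
```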

Remote-model setup

You do not need to run a large model on the same host. Both model backends can point at remote OpenAI-compatible services:

LOCAL_BASE_URL=https://your-openai-compatible-endpoint/v1
LOCAL_API_KEY=your-api-key
LOCAL_MODEL=gpt-5.4

CLOUD_BASE_URL=https://your-openai-compatible-endpoint/v1
CLOUD_API_KEY=your-api-key
CLOUD_MODEL=gpt-5.4

Important limitations

  • Task state is in-memory only, so this deployment should stay single-instance unless task storage is redesigned.
  • Browser-based tasks still depend on Playwright, Chromium, runtime memory, IP reputation, and target-site behavior.
  • In the current codebase, the reCAPTCHA v2 audio fallback path uses the CLOUD_* backend in a provider-specific request shape. Some OpenAI-compatible providers may not accept it as-is.

Notes for small VPS hosts

  • Keep one app container only.
  • Keep Chromium headless.
  • Leave shm_size: "512m" in place unless you have measured a lower safe value.
  • Prefer remote multimodal models over local inference on 2 GB RAM systems.
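For reference, the Chromium-related knob above lives in the app's Compose file; a sketch (service name assumed, mirror the real deploy/ohmycaptcha/compose.yml):

```yaml
services:
  ohmycaptcha:
    shm_size: "512m"   # Chromium needs a sizeable /dev/shm; lower only after measuring
```

Headless Chromium uses /dev/shm heavily for renderer IPC, and the Docker default of 64 MB is a common cause of tab crashes, which is why this value should not be reduced casually.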