Self-hosted fleet manager for OpenClaw AI agents

OpenClaw is a powerful autonomous AI agent — but it's built for one operator, one instance, one CLI. ClawFarm gives you a web dashboard to deploy, isolate, and manage a fleet of them on your own hardware.

[Screenshot: ClawFarm dashboard showing three running AI agents with status, token usage, and management controls]

Quick Start

Three commands. No TLS certificates to generate, no Docker GID to look up.

```shell
# Clone & configure
$ git clone https://github.com/clawfarm/clawfarm && cd clawfarm
$ cp .env.example .env   # add your LLM provider details

# Launch
$ docker compose up --build -d

# Open https://<your-ip>:8443 in your browser.
# The admin password is printed on first start:
$ docker compose logs dashboard | head -20
```

What you get

Everything you need to run a fleet of AI agents. The AI capabilities are 100% OpenClaw — ClawFarm handles the operational infrastructure.

Container Isolation

Each agent runs in its own Docker container and network. Agents can't see each other or reach your LAN.
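Conceptually, the isolation looks like the following Compose sketch (illustrative only; the actual service and network names in ClawFarm may differ). Each bot gets its own bridge network, so there is no route between agents or to other hosts on your LAN:

```yaml
services:
  bot-agent1:                  # hypothetical agent container
    image: openclaw:latest
    networks: [agent1-net]     # the only network this bot can see

  bot-agent2:
    image: openclaw:latest
    networks: [agent2-net]     # separate network: no cross-agent traffic

networks:
  agent1-net: {}               # one isolated bridge per agent
  agent2-net: {}
```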

Multi-User RBAC

Per-agent access control enforced at the reverse proxy layer. Users only see and manage their own agents.
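The enforcement point can be sketched as a Caddyfile fragment (paths, ports, and the auth endpoint here are assumptions, not ClawFarm's actual config). Caddy's `forward_auth` directive asks the dashboard whether the logged-in user owns the agent before proxying the request:

```
handle /claw/agent1/* {
    forward_auth dashboard:8000 {
        uri /api/auth/check     # hypothetical endpoint: dashboard verifies ownership
        copy_headers X-User
    }
    reverse_proxy bot-agent1:3000
}
```

A request for another user's agent gets rejected at the proxy, before it ever reaches the bot container.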

Backup & Rollback

Scheduled hourly backups with configurable retention. One-click restore to any previous state.
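The backup scheme can be illustrated with a minimal shell sketch (this is the general pattern, not ClawFarm's actual implementation): archive an agent's data directory with a timestamp, then prune archives older than the retention window.

```shell
AGENT_DATA=$(mktemp -d)     # stand-in for a bot's data volume
BACKUP_DIR=$(mktemp -d)
RETENTION_DAYS=7            # hypothetical retention setting

echo '{"memory": []}' > "$AGENT_DATA/state.json"

# One backup cycle: timestamped tarball of the data dir
STAMP=$(date +%Y%m%d%H%M%S)
tar -czf "$BACKUP_DIR/agent1-$STAMP.tar.gz" -C "$AGENT_DATA" .

# Prune: drop archives past the retention window
find "$BACKUP_DIR" -name 'agent1-*.tar.gz' -mtime +"$RETENTION_DAYS" -delete
```

A rollback is then just extracting the chosen tarball back over the agent's data directory while the container is stopped.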

Zero-Config HTTPS

Caddy handles TLS termination with path-based routing under a single port. Four modes from self-signed to Let's Encrypt.

Monitoring

CPU, memory, storage, and token usage per agent. Fleet-wide stats at a glance from the dashboard.

Templates

Define reusable agent configs with environment variable substitution. Create new agents in seconds.
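A template might look like this (field names are illustrative, not ClawFarm's actual schema); `${VAR}` placeholders are filled in from the environment when an agent is created:

```yaml
# researcher.yaml — hypothetical agent template
name: researcher-${AGENT_ID}
model: ${MODEL}
system_prompt: |
  You are a research assistant. Cite sources for every claim.
```

Stamp out a new agent by supplying `AGENT_ID` and `MODEL` at creation time; everything else comes from the template.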

Deployment Modes

Four TLS modes to fit your setup. Caddy handles all of it — you just set one env var.

Internal (default)

Auto-generated self-signed cert. Zero config — just deploy.

ACME

Automatic Let's Encrypt certificates. Set your domain and go.

Custom

Bring your own certs from an existing PKI or corporate CA.

Off

Plain HTTP. For when ClawFarm sits behind nginx, Traefik, or Cloudflare.
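The four modes above map to a single setting in `.env`. The variable name and values below are illustrative — check `.env.example` for the real ones:

```shell
# Pick one:
TLS_MODE=internal    # self-signed cert (default)
# TLS_MODE=acme      # Let's Encrypt; also set your domain
# TLS_MODE=custom    # bring your own cert and key
# TLS_MODE=off       # plain HTTP behind an external proxy
```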

Works with any LLM

Any OpenAI-compatible API. Cloud providers, local models, or a mix.

Cloud APIs

OpenAI, Anthropic, OpenRouter — set three env vars and go.

Local Models

vLLM, Ollama, LiteLLM — point to your local endpoint.
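Since Ollama, vLLM, and LiteLLM all expose OpenAI-compatible endpoints, switching to a local model is mostly a base-URL swap. The variable names below are illustrative (see `.env.example` for the real ones), and `host.docker.internal` assumes Ollama runs on the Docker host:

```shell
OPENAI_BASE_URL=http://host.docker.internal:11434/v1   # Ollama's OpenAI-compatible API
OPENAI_API_KEY=ollama                                  # any non-empty value works locally
OPENAI_MODEL=llama3.1
```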

Templates

Different configs for different use cases. One template per provider, model, or personality.

Architecture

All Docker, all the way down. Dashboard, frontend, and reverse proxy run as containers. Each agent is another container with its own isolated network.

Browser ──> Caddy (TLS + auth) ──> Dashboard (FastAPI) ──> Bot containers (OpenClaw)
                                        │
                                   Docker socket
  • Caddy — TLS termination, path-based routing (/claw/{name}/*), forward_auth
  • Dashboard — FastAPI backend managing bot lifecycle via Docker API
  • Frontend — Next.js dashboard UI
  • Bot containers — independent OpenClaw instances, one per agent, isolated networks
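The core stack can be pictured as a Compose sketch (service names, ports, and build paths here are assumptions for illustration). The dashboard mounts the Docker socket so it can create and destroy bot containers:

```yaml
services:
  caddy:
    image: caddy:2
    ports: ["8443:8443"]        # the single public port: TLS + routing + auth

  dashboard:
    build: ./dashboard          # FastAPI backend
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock   # bot lifecycle via Docker API

  frontend:
    build: ./frontend           # Next.js UI, served behind Caddy
```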

Ready to deploy your fleet?

Three commands to a running fleet. All agent data stays on your hardware.