Open Source · Self-Hosted

Self-hosted fleet manager for OpenClaw AI agents

OpenClaw is a powerful autonomous AI agent — but it's built for one operator, one instance, one CLI. ClawFarm gives you a web dashboard to deploy, isolate, and manage a fleet of them on your own hardware.

100% Docker
3 commands to deploy
Apache-2.0 licensed
Screenshots: fleet dashboard with token-usage chart and sparklines; new-agent creation with template selection and personality configuration; agent detail page with live metrics, token usage, cron jobs, and backups; web terminal with an interactive shell into a running agent.

Quick Start

Three commands. No TLS certificates to generate, no Docker GID to look up.

# Clone & configure
$ git clone https://github.com/clawfarm/clawfarm && cd clawfarm
$ cp .env.example .env   # add your LLM provider details

# Launch
$ docker compose up -d

# Open https://<your-ip>:8443
# Find the admin password in the startup logs
$ docker compose logs dashboard | head -20
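What goes into .env depends on your LLM provider. A hypothetical sketch, assuming Anthropic as the provider; the variable names here are guesses, so check .env.example for the real keys:

```shell
# Hypothetical keys; .env.example lists the real ones.
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-...
```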

What you get

Everything you need to run a fleet of AI agents. The AI capabilities are 100% OpenClaw — ClawFarm handles the operational infrastructure.

Container Isolation

Each agent runs in its own Docker container and network. Agents can't see each other or reach your LAN.
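Under the hood this is standard Docker networking: containers attached to different networks cannot route to each other. A minimal compose sketch of the pattern, not ClawFarm's actual generated config:

```yaml
services:
  agent-01:
    networks: [agent-01-net]
  agent-02:
    networks: [agent-02-net]   # separate network: cannot reach agent-01
networks:
  agent-01-net: {}
  agent-02-net: {}
```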

Multi-User RBAC

Per-agent access control enforced at the reverse proxy layer. Users only see and manage their own agents.

Backup & Rollback

Scheduled hourly backups with configurable retention. One-click restore to any previous state.

Zero-Config HTTPS

Caddy handles TLS termination with path-based routing under a single port. Four modes from self-signed to Let's Encrypt.

Web Terminal

Interactive shell into any running agent container, directly from the dashboard. No SSH or docker exec needed.

Monitoring

CPU, memory, storage, and token usage per agent. Fleet-wide stats at a glance from the dashboard.

Templates

Define reusable agent configs with environment variable substitution. Create new agents in seconds.
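The substitution mechanism can be sketched with Python's string.Template, which expands ${VAR} placeholders in the same style; the template fields below are made up for illustration, not ClawFarm's actual format:

```python
from string import Template

# Hypothetical template text; ClawFarm's real template fields may differ.
template = Template(
    "name: ${AGENT_NAME}\n"
    "model: ${MODEL}\n"
)

# Values would normally come from the environment or the creation form.
config = template.safe_substitute({"AGENT_NAME": "agent-01", "MODEL": "claude"})
print(config)
```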

Deployment Modes

Four TLS modes to fit your setup. Caddy handles all of it — you just set one env var.

Internal (default)

Auto-generated self-signed cert. Zero config — just deploy.

Works out of the box. No DNS required.

ACME

Automatic Let's Encrypt certificates. Set your domain and go.

Custom

Bring your own certs from an existing PKI or corporate CA.

Off

Plain HTTP. For when ClawFarm sits behind nginx, Traefik, or Cloudflare.
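Selecting a mode might look like the following in .env; the variable names are assumptions, so check .env.example for the real ones:

```shell
# Hypothetical variable names; see .env.example for the real ones.
TLS_MODE=acme                # internal | acme | custom | off
DOMAIN=claw.example.com      # needed for the acme mode
```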

Works with any LLM

Built-in templates for major providers. Any OpenAI-compatible API works too — local or cloud.

Cloud APIs

Anthropic Claude, OpenAI GPT, MiniMax, Qwen — one template per provider, just add an API key.

Local Models

vLLM, Ollama, LM Studio — use the custom-endpoint template and point to your server.

Mix & Match

Different agents can use different providers. Run Claude for coding and GPT for research side by side.
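"OpenAI-compatible" means the agent speaks the standard /v1/chat/completions wire format, so any server accepting that format works. A sketch of such a request body; the base URL and model name are examples for a local server, not ClawFarm specifics:

```python
import json

base_url = "http://localhost:8000/v1"    # e.g. a local vLLM server
payload = {
    "model": "qwen2.5-7b-instruct",      # whatever your server is serving
    "messages": [{"role": "user", "content": "Summarize today's logs."}],
}
body = json.dumps(payload)
# An agent would POST this to f"{base_url}/chat/completions"
# with an Authorization: Bearer <key> header.
print(body)
```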

Architecture

All Docker, all the way down. Dashboard, frontend, and reverse proxy run as containers. Each agent is another container with its own isolated network.

Request flow: browser → Caddy (TLS, auth, routing) → dashboard or agent containers, each agent in its own Docker network.

Caddy

TLS termination, path-based routing (/claw/{name}/*), and forward_auth access checks.

Dashboard

FastAPI backend managing the agent lifecycle via the Docker API.

Frontend

Next.js dashboard UI.

Agent containers

Independent OpenClaw instances, one per agent, each on its own isolated network.
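The routing pattern can be sketched as a Caddyfile fragment. This is illustrative only: ClawFarm manages its own Caddy config, and the upstream names, ports, and auth endpoint here are assumptions:

```
:8443 {
    # Ask the dashboard whether this user may access the target
    forward_auth dashboard:8000 {
        uri /api/auth/check
    }
    # Path-based routing: /claw/{name}/* goes to that agent's container
    handle_path /claw/agent-01/* {
        reverse_proxy agent-01:3000
    }
    # Everything else is the dashboard UI
    handle {
        reverse_proxy frontend:3000
    }
}
```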

Ready to deploy your fleet?

Three commands to a running fleet. All agent data stays on your hardware.