CLAWFGRID

THE OFF-GRID AUTONOMOUS AGENT

Clawfgrid is a local-first AI runtime powered by on-device models.

No API credits. No subscriptions. No rate limits.
Just agents that run.

Capabilities

Built for Local Autonomy

Powered by Ollama

Run any open model locally — Llama, Mistral, Gemma, Phi, and more. No cloud dependencies.
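Ollama serves every locally pulled model over an HTTP API on localhost (port 11434 by default). As a rough sketch of what "no cloud dependencies" looks like in practice, here is how a runtime could send a prompt to a local model via Ollama's documented `/api/generate` endpoint — this is illustrative plumbing, not Clawfgrid's actual internals:

```typescript
// Sketch: prompting a locally running Ollama server.
// Assumes Ollama is installed and a model has been pulled (e.g. `ollama pull llama3`).

interface GenerateRequest {
  model: string;   // any locally pulled model: "llama3", "mistral", "gemma", ...
  prompt: string;
  stream: boolean; // false => one JSON response instead of a token stream
}

function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  return { model, prompt, stream: false };
}

async function generate(req: GenerateRequest): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  const data = await res.json();
  return data.response; // Ollama returns the completion text in `response`
}
```

Everything stays on the loopback interface: the model weights, the prompt, and the completion never touch a remote server.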

🔒

Fully Local

Your data never leaves your machine. Private by default, secure by design.

🤖

Autonomous Agents

Agents that plan, reason, and execute — continuously, without human babysitting.

💰

Zero-Cost Runtime

No API credits, no subscriptions, no rate limits. Run as many agents as your hardware allows.

🧩

Plugin Architecture

Extend agent capabilities with a modular plugin system. Tools, memory, and integrations.
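This page doesn't spell out the plugin interface, but the idea is straightforward: a tool is a named unit of capability that the runtime registers and dispatches by name. A hypothetical sketch — the `ToolPlugin` and `PluginRegistry` shapes below are illustrative, not Clawfgrid's actual API:

```typescript
// Hypothetical plugin shape — illustrative only, not Clawfgrid's real API.
interface ToolPlugin {
  name: string;
  description: string;
  execute(input: string): Promise<string>;
}

class PluginRegistry {
  private plugins = new Map<string, ToolPlugin>();

  register(plugin: ToolPlugin): void {
    this.plugins.set(plugin.name, plugin);
  }

  // The agent loop would call this with a tool name chosen by the model.
  async run(name: string, input: string): Promise<string> {
    const plugin = this.plugins.get(name);
    if (!plugin) throw new Error(`unknown tool: ${name}`);
    return plugin.execute(input);
  }
}

// Demo tool standing in for a real integration like web_search or file_write.
const shout: ToolPlugin = {
  name: "shout",
  description: "Uppercases its input (demo tool)",
  async execute(input) {
    return input.toUpperCase();
  },
};

const registry = new PluginRegistry();
registry.register(shout);
```

Because tools, memory backends, and integrations all plug in through one registry, swapping a capability means registering a different object, not forking the runtime.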

🖥️

Desktop Native

A beautiful desktop experience built for developers and power users. Cross-platform.

Community

What People Are Saying

Sarah Chen

@sarahc_dev

Clawfgrid is what local AI should feel like.

Marcus Webb

@mwebb_ai

The first agent runtime that doesn't die when the credits run out.

Priya Sharma

@priyabuilds

Running fully autonomous agents locally is insane. This changes everything.

Alex Turner

@alex_t

Finally, an agent framework that respects my privacy and my wallet.

Jordan Lee

@jordanml

Ollama + Clawfgrid is the most slept-on stack in AI right now.

Nina Petrov

@nina_codes

I replaced three paid AI services with a single Clawfgrid agent running on my laptop.

Get Started

Quick Start

From zero to autonomous agents in four commands.

# Install Ollama

$ curl -fsSL https://ollama.com/install.sh | sh

# Pull a model

$ ollama pull llama3

# Install Clawfgrid

$ npm install -g clawfgrid

# Launch an agent

$ clawfgrid run --agent researcher --model llama3

Or configure everything declaratively in clawfgrid.config.json:

clawfgrid.config.json
{
  "runtime": "ollama",
  "model": "llama3",
  "agents": [
    {
      "name": "researcher",
      "type": "autonomous",
      "tools": ["web_search", "file_write", "code_exec"],
      "memory": "local",
      "max_iterations": 100
    }
  ],
  "settings": {
    "offline_mode": true,
    "telemetry": false
  }
}
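Catching a malformed config before the agent loop starts saves a wasted run. The field names below are taken from the example config above; the validation logic itself is an illustrative sketch, not Clawfgrid's built-in loader:

```typescript
// Illustrative sanity check for a clawfgrid.config.json-style document.
interface AgentConfig {
  name: string;
  type: string;
  tools: string[];
  memory: string;
  max_iterations: number;
}

interface ClawfgridConfig {
  runtime: string;
  model: string;
  agents: AgentConfig[];
  settings: { offline_mode: boolean; telemetry: boolean };
}

function parseConfig(json: string): ClawfgridConfig {
  const cfg = JSON.parse(json) as ClawfgridConfig;
  if (!cfg.runtime || !cfg.model) {
    throw new Error("config must set runtime and model");
  }
  if (!Array.isArray(cfg.agents) || cfg.agents.length === 0) {
    throw new Error("config must define at least one agent");
  }
  for (const agent of cfg.agents) {
    if (!agent.name || !Array.isArray(agent.tools)) {
      throw new Error("every agent needs a name and a tools array");
    }
  }
  return cfg;
}
```

With `offline_mode` on and `telemetry` off, the config mirrors the promise above: nothing leaves the machine.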