Goose: Block's Open-Source Local-First AI Agent Hits 35K

Marcus Rivera
Apr 20, 2026

While the AI coding-agent discourse stays locked on Cursor, Claude Code, and the next Copilot update, a quieter project out of Block — yes, the fintech company behind Square and Cash App — has been assembling the plumbing for the entire agentic-AI movement. Goose just hit 35,300 GitHub stars, shipped its 126th release (v1.29.1 on April 3, 2026), and is now one of three founding projects of the Linux Foundation's new Agentic AI Foundation (AAIF). It is, quite possibly, the most important open-source AI agent you've never tried.

What goose actually is

Goose is a local-first, open-source AI agent that runs on your machine, talks to any LLM you want, and connects to external tools through Anthropic's Model Context Protocol (MCP). It's built primarily in Rust (58.3% of the codebase) with a TypeScript desktop UI (34.1%), licensed Apache-2.0, and maintained by a community of 438 contributors.

The pitch is simple: give an agent that runs entirely on your hardware the same capabilities you'd get from a hosted service. It can:

  • Write and execute code
  • Build projects from scratch
  • Debug failures autonomously
  • Orchestrate multi-step workflows
  • Interact with external APIs
  • Edit files and run tests

There's a CLI for terminal die-hards and a desktop app (.dmg for macOS, .deb/.rpm/AppImage for Linux, .exe for Windows) for everyone else.

Install in under a minute

The CLI installs with a single command on macOS and Linux:

curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | bash

On Windows, download the .exe installer from the releases page and run it. The desktop app has platform installers on the official install page.

Once installed, point goose at whatever provider you want. The one that matters most here is Ollama.
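On the CLI, first contact looks roughly like this. The command names (`goose configure`, `goose session`, `goose run`) come from the goose documentation; treat the exact flags as illustrative for your installed version:

```shell
# Interactive provider/model setup (writes goose's config file)
goose configure

# Start an interactive agent session in the current project
goose session

# Or hand it a one-shot task non-interactively
goose run -t "add unit tests for the parser module"
```

`goose configure` walks you through picking a provider and model, which is where the Ollama option below comes in.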

The local-first story

Goose works with 15+ LLM providers including OpenAI, Anthropic, Google, Groq, Databricks, and — critically — Ollama for fully local inference. That combination matters. Running goose with Ollama means inputs, outputs, and artifacts never leave your machine. No telemetry, no API bill, no compliance review.

Setting it up:

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model
ollama pull qwen3:14b

# In goose: Settings → Configure Provider → Ollama
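For the CLI, the same provider choice can be made with environment variables instead of the desktop settings screen. `GOOSE_PROVIDER` and `GOOSE_MODEL` are the variables described in the goose docs; `OLLAMA_HOST` only needs setting if Ollama isn't on its default local port:

```shell
# Point the goose CLI at the local Ollama instance via environment variables
export GOOSE_PROVIDER=ollama
export GOOSE_MODEL=qwen3:14b
export OLLAMA_HOST=http://127.0.0.1:11434

# Sanity check: confirm what the agent will use
echo "$GOOSE_PROVIDER/$GOOSE_MODEL"
```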

From there you're driving an autonomous coding agent that could be running on a laptop in airplane mode. For sensitive codebases — regulated industries, private research, air-gapped environments — that's a different product category entirely from Cursor or Claude Code.

Why the Linux Foundation bet on it

On December 9, 2025, the Linux Foundation announced the Agentic AI Foundation (AAIF). The three founding project contributions:

Project                         Contributed by
Model Context Protocol (MCP)    Anthropic
goose                           Block
AGENTS.md                       OpenAI

Platinum members include Amazon Web Services, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft, and OpenAI. That's essentially every major model provider and every major cloud, sitting at the same table, agreeing that agentic AI needs neutral infrastructure.

Goose is the reference implementation of how those pieces fit together. MCP provides the tool-use protocol. AGENTS.md is the project instruction format. Goose is the agent that consumes both and actually does the work. The fact that a fintech side project got chosen over any of the Big Lab frameworks says something about how mature it already is.

What makes it different from Claude Code and OpenAI Codex

The Claude Code / Codex / Cursor trio assumes you're running the agent against a hosted frontier model. Goose assumes you might not want to.

Three things set goose apart:

Multi-model configuration. You can route different sub-tasks to different models in a single run. Use a cheap local model for file navigation, a frontier hosted model for hard reasoning, and a vision-capable model when you need to read a screenshot. This is a config file, not a feature request.
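As a sketch, that split lives in goose's config file. The lead/worker key names below follow the pattern in the goose docs, but verify them against your version; the model names are illustrative:

```yaml
# goose config file (sketch; key names per the docs' lead/worker pattern)
GOOSE_PROVIDER: ollama             # worker: cheap local model for routine steps
GOOSE_MODEL: qwen3:14b
GOOSE_LEAD_PROVIDER: anthropic     # lead: hosted frontier model for hard reasoning
GOOSE_LEAD_MODEL: claude-sonnet-4  # illustrative model name
```

The agent then routes planning-heavy turns to the lead model and execution turns to the worker, without any change to how you invoke it.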

Recipes. Reusable workflow definitions live in YAML. A recipe like release_risk_check is a first-class artifact in the repo — you can see one in the main branch today. Share a recipe, share the whole workflow.
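A minimal recipe might look like the sketch below. This is a hypothetical example, not the `release_risk_check` recipe from the repo; the field names follow the recipe reference in the goose docs, trimmed to the basics:

```yaml
# lint-summary.yaml — hypothetical goose recipe (field names per the docs)
version: 1.0.0
title: Lint and summarize
description: Run the project linter and summarize any failures
parameters:
  - key: target_dir
    input_type: string
    requirement: required
    description: Directory to lint
instructions: |
  Run the project's linter against the given directory, then summarize
  failures grouped by file, worst offenders first.
prompt: Lint {{ target_dir }} and report.
```

Because the whole workflow is one YAML file, sharing it is a copy-paste or a commit, which is the point of the feature.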

Custom distributions. Enterprises can fork goose, ship it with their own preconfigured providers, extensions, and branding. The repo includes a CUSTOM_DISTROS.md guide. If you want to build your internal "company coding agent," this is the path.

The governance question

Block governs the project through a formal GOVERNANCE.md, and with AAIF membership that governance is migrating to vendor-neutral oversight. That's non-trivial. It means goose won't get quietly shelved if Block's priorities shift — the same structural protection that keeps MCP from being yanked if Anthropic's strategy changes.

For anyone betting infrastructure on an agent framework, that's the single most important feature.

The Bottom Line

Goose is what you get when you build an AI agent framework without the assumption that you're selling API tokens at the other end. It's fully local, genuinely open (Apache-2.0, 438 contributors), and now sits at the center of the Linux Foundation's push to standardize agentic AI. With 35,300 stars and 126 shipped releases, it's no longer a curiosity — it's infrastructure. If you've been using Claude Code or Cursor and wondering what the vendor-neutral, on-device version looks like, you just found it.