Goose is an open-source AI agent by Block (Apache 2.0, Linux Foundation) with 38k+ GitHub stars. It runs locally as a desktop app, CLI, and API, and supports 15+ LLM providers, including Ollama for offline use.
Goose is an open-source, general-purpose AI agent developed by Block (the company behind Square and Cash App). It runs natively on your machine — available as a desktop app, CLI, and API — and connects to 15+ AI model providers while supporting 70+ MCP extensions for tool integrations. As a GitHub Copilot alternative, it is best suited for developers and power users who want a fully open, locally-running AI agent that goes beyond code assistance to handle research, automation, data analysis, and complex multi-step workflows.
| Feature | Goose | GitHub Copilot |
|---|---|---|
| Type | CLI Agent + Desktop App + API (open source) | IDE Extension |
| IDEs Supported | Works as ACP server with Zed, JetBrains, VS Code; primarily CLI/desktop | VS Code, JetBrains, Visual Studio, Neovim, and more |
| Pricing | Free (open source, Apache 2.0); you provide your own LLM API keys or existing subscriptions | Free tier; Pro $10/mo; Business $19/mo; Enterprise $39/mo |
| AI Models | 15+ providers: Anthropic, OpenAI, Google, Ollama, OpenRouter, Azure, Bedrock, and more | GPT-4o, Claude 3.5 Sonnet, Gemini |
| Privacy / Hosting | Runs locally on your machine; you control which LLM provider is used | GitHub/Azure cloud |
| Open Source | Yes — Apache 2.0 license, Linux Foundation (AAIF) | No |
| Offline / Local Models | Yes — supports Ollama for local model execution | No |
| MCP / Tool Integrations | 70+ documented MCP extensions | Limited (GitHub context only) |
Goose is best for developers, power users, and engineering teams who want a fully open, locally-running AI agent with maximum flexibility in model choice and tool integrations. It is particularly well-suited for those who already have Claude, ChatGPT, or Gemini subscriptions and want to leverage them in a powerful agentic framework without additional costs. The Recipes and subagent features make it compelling for teams wanting to automate complex, repeatable workflows in CI/CD pipelines and development processes. Developers in security-sensitive environments will appreciate the local execution model and granular permission controls.
Goose itself is completely free and open source (Apache 2.0). You bring your own LLM provider — either with API keys (Anthropic, OpenAI, Google, etc.) or by routing existing subscriptions through ACP providers. The cost of using Goose depends entirely on the model provider and usage volume you choose. For the most current documentation and getting started instructions, visit goose-docs.ai.
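As a sketch of what bring-your-own-provider looks like in practice, provider selection typically lives in a local config file. The path and key names below (`~/.config/goose/config.yaml`, `GOOSE_PROVIDER`, `GOOSE_MODEL`) are illustrative assumptions — check the official documentation for the authoritative schema:

```yaml
# Illustrative ~/.config/goose/config.yaml — key and model names are
# assumptions; verify against the current Goose docs before use.
GOOSE_PROVIDER: anthropic          # which of the 15+ providers to route through
GOOSE_MODEL: claude-3-5-sonnet-latest   # billed against your own API key
# The provider's API key itself is supplied separately, e.g. via an
# environment variable such as ANTHROPIC_API_KEY or the system keyring.
```

Switching providers is then a matter of changing these two values rather than changing tools.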
Goose is built in Rust for performance and cross-platform portability, running natively on macOS, Linux, and Windows. It implements the Model Context Protocol (MCP) for tool integrations, with 70+ documented extensions available in the official extension registry. Goose also implements the Agent Client Protocol (ACP), allowing it to act as an ACP server connectable from Zed, JetBrains, and VS Code, and to use ACP-compatible agents like Claude Code and Codex as providers. The project is governed by the Agentic AI Foundation (AAIF) at the Linux Foundation, ensuring vendor-neutral, community-driven development. The GitHub repository has 38,000+ stars and 400+ contributors.
Goose represents a fundamentally different philosophy from GitHub Copilot: instead of a managed cloud service with inline autocomplete, it is a fully open-source, locally-running AI agent with unmatched flexibility in model choice, tool integration, and workflow automation. With 38,000+ GitHub stars, Linux Foundation governance, support for 70+ MCP extensions, and the ability to use existing AI subscriptions, Goose has built a strong community of developers who want more than autocomplete. For teams comfortable with CLI-first workflows and wanting to push the boundaries of what an AI coding agent can do, Goose is one of the most powerful and open options available.
Yes — Goose is completely free and open source under the Apache 2.0 license. You provide your own LLM API keys or use existing subscriptions. The cost of usage depends on the model provider and volume you choose. There is no Goose subscription fee.
Goose is primarily a desktop app and CLI agent, not an inline IDE autocomplete plugin. However, it implements the Agent Client Protocol (ACP) and can act as an ACP server, making it connectable from editors like Zed, JetBrains, and VS Code. It provides a different interaction model — agent-driven task execution rather than inline autocomplete.
Yes — Goose supports Ollama as a provider, enabling fully local model execution without any internet connection or API calls. This makes it suitable for air-gapped environments, developers who prioritize maximum privacy, or those who want to experiment with open-source models locally.
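A minimal sketch of a fully local setup, assuming the same illustrative config-file shape as above (field names are assumptions, not the documented schema):

```yaml
# Illustrative offline configuration — no cloud API calls involved.
GOOSE_PROVIDER: ollama
GOOSE_MODEL: qwen2.5                  # any model already pulled with `ollama pull`
OLLAMA_HOST: http://localhost:11434   # Ollama's default local endpoint
```

With this in place, both inference and tool execution stay on the machine, which is what makes the air-gapped scenario workable.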
Goose was created by Block (the company behind Square and Cash App) and is now governed by the Agentic AI Foundation (AAIF) at the Linux Foundation, ensuring vendor-neutral, community-driven development. As of April 2026, the project has 38,000+ GitHub stars and 400+ contributors, indicating an active and growing community.
Recipes are portable YAML configuration files that capture complex multi-step workflows as reusable automations. They can include instructions, extensions, parameters, and subrecipes. Teams can share Recipes, run them in CI/CD pipelines, or use them to standardize common development workflows across the organization.
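To make the shape concrete, here is a hypothetical Recipe sketch. The field names (`title`, `parameters`, `extensions`, `instructions`) follow the structure described above, but the exact schema, extension names, and templating syntax are assumptions — consult the Recipe reference before relying on them:

```yaml
# Hypothetical Recipe sketch — field names and values are illustrative.
version: 1.0.0
title: dependency-audit
description: Audit a repository's dependencies and draft an upgrade plan
parameters:
  - key: repo_path
    input_type: string
    requirement: required
    description: Path to the repository to audit
extensions:
  - type: builtin
    name: developer        # assumed built-in extension for file/shell access
instructions: |
  Scan {{ repo_path }} for outdated or vulnerable dependencies,
  summarize the findings, and draft an upgrade plan.
```

Because a Recipe like this is a single portable file, it can be checked into a repository and invoked the same way on a teammate's machine or in a CI/CD job.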