Void

Open-source AI code editor with direct LLM connections and full data control.

Void - GitHub Copilot alternative

Void is a VS Code fork that integrates AI-powered coding features. It connects directly to any LLM provider without routing data through proprietary backends, giving developers complete control over model selection, hosting options, and data privacy. The editor supports both local open-source models and cloud-based providers.

Strengths

  • Complete data sovereignty — Messages go directly to providers without data retention or middleman servers.
  • Unlimited model flexibility — Works with local open-source models (DeepSeek, Llama, Qwen, Mistral) and frontier APIs (Claude, OpenAI, Gemini, Grok).
  • Zero vendor lock-in — One-click transfer of existing VS Code themes, keybindings, and settings.
  • Advanced change management — Checkpoint system visualizes LLM-generated modifications before applying them.
  • Native agent capabilities — Agent mode provides file system operations, terminal access, and MCP tool integration.
  • Cost elimination potential — Host open-source models locally to eliminate recurring API charges.

Weaknesses

  • Active development paused — Work is temporarily paused while the team experiments with novel AI coding features.
  • Beta stability — First beta released in January 2025, with some experimental features still under development.
  • Smaller ecosystem — Community and extension library smaller than established alternatives.
  • Self-hosting overhead — Local model deployment requires hardware resources and configuration knowledge.

Best for

Developers who prioritize data privacy, want unlimited model choice, need offline AI coding, or seek to eliminate subscription costs through self-hosted solutions.

Pricing plans

  • Free — $0/forever — Unlimited usage. Open-source software with no subscription fees, seat limits, or API credit restrictions.

Tech details

  • Type: Desktop application (VS Code fork)
  • IDEs: Standalone editor based on Visual Studio Code architecture
  • Key features: Tab autocomplete, inline quick edits, chat with Agent/Gather modes, checkpoint visualization, lint error detection, fast apply on large files, MCP integration
  • Privacy / hosting: Fully local or direct API connections with no intermediary data collection. Supports self-hosted open-source models.
  • Models / context window: Supports Gemini 2.5, Claude 3.7, Grok 3, GPT-4, Qwen 3, and OpenAI-compatible endpoints. Context windows vary by selected model.
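As one illustration of the MCP integration listed above, MCP clients are typically pointed at tool servers through a small JSON config. The sketch below follows the common MCP client format (the `@modelcontextprotocol/server-filesystem` server is a real reference implementation); Void's exact settings file and schema are not documented here, so treat the key names and location as assumptions:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/me/projects"]
    }
  }
}
```

With an entry like this, the agent gains the server's file-system tools alongside its built-in terminal and editing capabilities.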

When to choose this over GitHub Copilot

  • You require guaranteed data privacy without third-party data routing or telemetry collection.
  • You want flexibility to use any LLM (local or cloud) instead of being locked to a single provider.
  • You need to eliminate recurring subscription costs by hosting open-source models on your hardware.

When GitHub Copilot may be a better fit

  • You need production-stable tooling with enterprise support and established reliability guarantees.
  • You prefer turnkey solutions without managing model deployment or API configurations.
  • You work in large teams that benefit from GitHub's integrated workflows and permissions.

Conclusion

Void is a Y Combinator-backed open-source project positioning itself as the privacy-first alternative to proprietary AI coding tools. Its VS Code foundation provides familiar workflows while adding AI capabilities with complete data control. As a GitHub Copilot alternative, it appeals to developers who value transparency and model flexibility. Development is currently paused for feature experimentation, but the beta remains downloadable and extensible.

FAQ

Is Void completely free to use?

Yes. Void is open-source software with no subscription fees, seat limits, or usage caps. You pay only for API credits if using cloud LLM providers, or run local models at zero recurring cost.

Can Void work offline without internet access?

Yes. When configured with locally hosted models through Ollama, LM Studio, or vLLM, Void functions entirely offline. Only cloud-based LLM connections require internet connectivity.
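One way to check that an offline setup really stays local is to build a request against the local server yourself before wiring it into the editor. A minimal sketch in Python, assuming Ollama's default OpenAI-compatible endpoint on port 11434 (LM Studio and vLLM expose the same route on their own ports); the model name `qwen3` is a placeholder for whatever model you have pulled:

```python
import json
import urllib.request

# Ollama's default OpenAI-compatible endpoint; LM Studio and vLLM
# serve the same /v1/chat/completions route on their own ports.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def local_chat_request(prompt: str, model: str = "qwen3") -> urllib.request.Request:
    """Build a ready-to-send chat request for a locally hosted model."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode()
    return urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = local_chat_request("Summarize this diff.")
# Sending it (urllib.request.urlopen(req)) only ever talks to localhost,
# which is the property that makes fully offline use possible.
```

Because the target is localhost, no code or prompt ever leaves the machine; a cloud provider would differ only in the URL and an API-key header.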

Does Void send my code to external servers?

No. Void connects directly to your chosen LLM provider without routing data through proprietary backends. For local models, all processing happens on your machine.

How does Void compare to Cursor's feature set?

Void offers similar core features: tab autocomplete, inline editing, and chat with agent capabilities. The key difference is architectural — Void provides direct LLM connections and full open-source transparency.

Can I import my existing VS Code configuration?

Yes. Void supports one-click transfer of all VS Code themes, keybindings, and settings. As a VS Code fork, it maintains compatibility with most extensions and workflows.

Why is development currently paused?

The Void team is temporarily experimenting with novel AI coding features. The existing beta remains functional and users can download, use, and extend it with custom models while awaiting future updates.
