Security & Privacy
Your source code is your most valuable asset. Ratchet is designed from the ground up so that code stays on your machine — and only your machine.
Ratchet is a CLI tool that runs entirely within your local environment. When you invoke ratchet scan or ratchet fix, the tool reads files from your filesystem, constructs prompts, and sends those prompts directly to the AI provider of your choice — no Ratchet servers in the middle.
There is no cloud processing step, no file upload endpoint, and no intermediate proxy. The process is identical whether you're on a corporate laptop, an air-gapped CI runner, or a personal machine.
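The flow described above can be sketched roughly as follows. The function names, prompt shape, and endpoint path are illustrative assumptions, not Ratchet's actual internals:

```typescript
// Illustrative sketch of a local-only scan flow (not Ratchet's real code).
import { readFileSync } from "node:fs";

// Files are read from the local filesystem; nothing is uploaded at this stage.
function buildPrompt(paths: string[]): string {
  const sources = paths.map((p) => `// ${p}\n${readFileSync(p, "utf8")}`);
  return `Review the following files for issues:\n\n${sources.join("\n\n")}`;
}

// One HTTPS request, straight from this machine to the configured provider.
async function scan(paths: string[], baseUrl: string, apiKey: string, model: string) {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: { "content-type": "application/json", authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: buildPrompt(paths) }],
    }),
  });
  return res.json();
}
```

The key property is that the prompt is assembled and dispatched in one process on your machine; there is no intermediate hop where a third party could observe the code.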
- No code upload — source files are never transmitted to Ratchet servers
- No auth server — your license key is a signed token verified locally against a public key, so validation works offline
- No cloud storage — scan results are written to ratchet-report.json locally
- No dependency on Ratchet uptime — tool works even if ratchetcli.com is down
- Filesystem isolation — Ratchet only reads files within the project directory you specify
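As a concrete illustration of the offline license check: verifying a signed token needs only a public key and local CPU. The token format and key handling below are assumptions for the sketch, not Ratchet's actual scheme:

```typescript
// Sketch: offline validation of a signed license token (illustrative format).
import { generateKeyPairSync, sign, verify } from "node:crypto";

// The real tool would ship a fixed public key; a fresh pair keeps this runnable.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// Issue a token: base64url payload plus an Ed25519 signature over it.
function issueLicense(payload: object): string {
  const body = Buffer.from(JSON.stringify(payload)).toString("base64url");
  const sig = sign(null, Buffer.from(body), privateKey).toString("base64url");
  return `${body}.${sig}`;
}

// Validate with nothing but the public key: no network, no license server.
function validateOffline(token: string): boolean {
  const [body, sig] = token.split(".");
  if (!body || !sig) return false;
  return verify(null, Buffer.from(body), publicKey, Buffer.from(sig, "base64url"));
}
```

A tampered payload fails the signature check locally, which is why validation keeps working even when ratchetcli.com is unreachable.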
Ratchet never handles your API keys on its servers. You configure them locally via environment variables or a .ratchet.env file. Prompts go from your machine directly to OpenAI, Anthropic, or any compatible endpoint — governed by your provider's privacy policy, not ours.
This means you can use your company's enterprise agreement with Anthropic (which includes zero data retention) simply by setting your key in the environment. Ratchet inherits those guarantees automatically.
```shell
# Anthropic (default)
ANTHROPIC_API_KEY=sk-ant-xxxxxxxxxxxxxxxxxxxxxxxx

# OpenAI
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx

# Groq (fast local-equivalent)
GROQ_API_KEY=gsk_xxxxxxxxxxxxxxxxxxxxxxxx

# Custom OpenAI-compatible endpoint (Ollama, LM Studio, etc.)
RATCHET_LLM_BASE_URL=http://localhost:11434/v1
RATCHET_LLM_MODEL=llama3.2:latest
```
- Keys are read from environment at runtime — never cached or stored by Ratchet
- Use --model flag to switch providers per-run without changing config
- Fully supports Ollama and local models — zero external calls in that configuration
- Enterprise Anthropic keys with zero-retention policy are fully compatible
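Provider selection from local environment variables can be sketched like this; the precedence order and the hard-coded endpoint URLs are assumptions for illustration:

```typescript
// Sketch: pick an LLM endpoint purely from local env vars (illustrative logic).
type Provider = { name: string; baseUrl: string; apiKey?: string };

function resolveProvider(env: Record<string, string | undefined>): Provider {
  // An explicit base URL (Ollama, LM Studio, ...) wins: fully local, usually keyless.
  if (env.RATCHET_LLM_BASE_URL) {
    return { name: "custom", baseUrl: env.RATCHET_LLM_BASE_URL };
  }
  if (env.ANTHROPIC_API_KEY) {
    return { name: "anthropic", baseUrl: "https://api.anthropic.com", apiKey: env.ANTHROPIC_API_KEY };
  }
  if (env.OPENAI_API_KEY) {
    return { name: "openai", baseUrl: "https://api.openai.com/v1", apiKey: env.OPENAI_API_KEY };
  }
  if (env.GROQ_API_KEY) {
    return { name: "groq", baseUrl: "https://api.groq.com/openai/v1", apiKey: env.GROQ_API_KEY };
  }
  // Keys are read at call time and never persisted anywhere by this code.
  throw new Error("no provider configured; set an API key or RATCHET_LLM_BASE_URL");
}
```

In this sketch an explicit RATCHET_LLM_BASE_URL takes precedence, so a fully local Ollama setup makes zero external calls.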
Ratchet makes at most two network connections during a typical run: one to validate your license key (a signed JWT checked against a public key; in offline mode this check is purely local, with no server roundtrip), and one to your configured LLM API endpoint.
To verify, capture traffic during a scan — for example, sudo tcpdump -i any 'tcp port 443' — and confirm the only remote hosts contacted are ratchetcli.com (license check) and your configured LLM endpoint.
Many developer tools collect anonymous usage metrics to "improve the product." Ratchet collects nothing. No command invocations, no error reports, no feature flags, no A/B tests, no session IDs.
If you find a bug, we ask you to file a GitHub issue. That is the only feedback mechanism. This is intentional — it keeps our data surface at zero and means we can make strong privacy guarantees to security-sensitive teams without carve-outs.
- No analytics SDK — PostHog, Segment, Mixpanel are not in the dependency tree
- No crash reporting — Sentry, Bugsnag, Rollbar are not installed
- No feature flags — LaunchDarkly, Flagsmith not used; all features are local
- No update pings — version checks happen via npm registry, like any npm install
- Verify by auditing node_modules/.package-lock.json — no telemetry packages present
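The lockfile audit in the last bullet can be automated. A minimal sketch, assuming the npm v2/v3 lockfile format and an illustrative (not exhaustive) package list:

```typescript
// Sketch: scan an npm lockfile for well-known telemetry/analytics SDKs.
// The package list below is illustrative, not exhaustive.
const TELEMETRY_PACKAGES = [
  "posthog-node", "@segment/analytics-node", "mixpanel",
  "@sentry/node", "@bugsnag/js", "rollbar",
  "launchdarkly-node-server-sdk", "flagsmith-nodejs",
];

function findTelemetryPackages(lockfileJson: string): string[] {
  // npm v2/v3 lockfiles key installed packages as "node_modules/<name>",
  // with nested deps prefixed by their parent's path.
  const lock = JSON.parse(lockfileJson) as { packages?: Record<string, unknown> };
  const installed = Object.keys(lock.packages ?? {}).map((p) =>
    p.replace(/^.*node_modules\//, "")
  );
  return TELEMETRY_PACKAGES.filter((pkg) => installed.includes(pkg));
}
```

Run it over the contents of node_modules/.package-lock.json (read with fs.readFileSync); for Ratchet it should return an empty array.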
Enterprise tier is in private beta. The following controls are in development for security-sensitive organizations. Contact us to get early access.