Rust-native privacy middleware that lets you use any LLM without leaking sensitive data. Consistent pseudonymization + streaming rehydration.
CloakPipe is a high-performance Rust middleware that sits between your application and any LLM provider (OpenAI, Claude, Grok, or local models).
It automatically detects and pseudonymizes sensitive data before requests leave your system, then restores the original values in responses — so you can safely use LLMs in regulated environments.
Many teams in fintech, healthtech, legal, and enterprise face a trade-off:
• block ChatGPT/Claude entirely (and lose the productivity)
• or send sensitive data externally and risk GDPR / HIPAA exposure
CloakPipe acts as a privacy layer in front of LLM calls.
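The round trip works roughly like this: sensitive values are swapped for stable placeholders before the request leaves your system, and swapped back when the response comes in. A minimal Rust sketch of that idea (the `Vault` type and `[PERSON_n]` placeholder format here are illustrative, not CloakPipe's actual API):

```rust
use std::collections::HashMap;

/// Illustrative vault: the same entity always gets the same placeholder,
/// and the reverse map restores originals in LLM responses.
struct Vault {
    forward: HashMap<String, String>, // original -> placeholder
    reverse: HashMap<String, String>, // placeholder -> original
    counter: usize,
}

impl Vault {
    fn new() -> Self {
        Vault { forward: HashMap::new(), reverse: HashMap::new(), counter: 0 }
    }

    /// Return a stable placeholder for `entity`, minting one on first sight.
    fn pseudonymize(&mut self, entity: &str, kind: &str) -> String {
        if let Some(p) = self.forward.get(entity) {
            return p.clone();
        }
        self.counter += 1;
        let placeholder = format!("[{}_{}]", kind, self.counter);
        self.forward.insert(entity.to_string(), placeholder.clone());
        self.reverse.insert(placeholder.clone(), entity.to_string());
        placeholder
    }

    /// Swap placeholders in an LLM response back to the original values.
    fn rehydrate(&self, text: &str) -> String {
        self.reverse
            .iter()
            .fold(text.to_string(), |acc, (ph, orig)| acc.replace(ph.as_str(), orig))
    }
}

fn main() {
    let mut vault = Vault::new();
    let p = vault.pseudonymize("Alice Jones", "PERSON");
    // Consistency: the same entity maps to the same placeholder every time.
    assert_eq!(p, vault.pseudonymize("Alice Jones", "PERSON"));

    let reply = format!("{} signed the contract.", p);
    // Rehydration restores the original value in the model's response.
    assert_eq!(vault.rehydrate(&reply), "Alice Jones signed the contract.");
    println!("restored: {}", vault.rehydrate(&reply));
}
```

Consistency matters because the LLM needs to track "the same person" across a conversation even though it never sees the real name.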
• Rust proxy (<5ms overhead)
• consistent pseudonymization with encrypted vault
• streaming-safe rehydration (works with GPT-4o / Claude streaming)
• regex + ONNX NER detection (people, orgs, locations, amounts, etc.)
• local-first encrypted SQLite vault + audit logs
• minimal private chat UI (CloakChat) for instant local usage
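Streaming is the tricky part of that feature list: a placeholder like `[PERSON_1]` can arrive split across two SSE chunks, so the proxy has to hold back any trailing text that might be a partial placeholder until the next chunk resolves it. A hedged sketch of one way to do this (`StreamRehydrator` is a hypothetical name for illustration, not CloakPipe's API):

```rust
use std::collections::HashMap;

/// Illustrative streaming rehydrator: placeholders like "[PERSON_1]" may be
/// split across stream chunks, so any trailing text that could be the start
/// of a placeholder is buffered until a later chunk resolves it.
struct StreamRehydrator {
    reverse: HashMap<String, String>, // placeholder -> original
    pending: String,                  // buffered tail, possibly a partial placeholder
}

impl StreamRehydrator {
    fn new(reverse: HashMap<String, String>) -> Self {
        StreamRehydrator { reverse, pending: String::new() }
    }

    fn restore(&self, text: &str) -> String {
        self.reverse
            .iter()
            .fold(text.to_string(), |acc, (ph, orig)| acc.replace(ph.as_str(), orig))
    }

    /// Feed one chunk; returns only the text that is safe to emit now.
    fn feed(&mut self, chunk: &str) -> String {
        self.pending.push_str(chunk);
        // The last '[' with no ']' after it starts a possible partial
        // placeholder; everything before it can be rewritten and emitted.
        let safe_end = match self.pending.rfind('[') {
            Some(i) if !self.pending[i..].contains(']') => i,
            _ => self.pending.len(),
        };
        let emit = self.restore(&self.pending[..safe_end]);
        self.pending = self.pending.split_off(safe_end);
        emit
    }

    /// At end of stream, flush whatever is still buffered.
    fn finish(&mut self) -> String {
        let tail = std::mem::take(&mut self.pending);
        self.restore(&tail)
    }
}

fn main() {
    let mut reverse = HashMap::new();
    reverse.insert("[PERSON_1]".to_string(), "Alice".to_string());
    let mut r = StreamRehydrator::new(reverse);

    // The placeholder arrives split across two chunks.
    let mut out = r.feed("Hello [PER");
    out.push_str(&r.feed("SON_1], welcome back"));
    out.push_str(&r.finish());
    assert_eq!(out, "Hello Alice, welcome back");
    println!("{}", out);
}
```

The design trade-off: holding back a possible partial placeholder adds at most one chunk of latency, while forwarding it eagerly would leak placeholder fragments (or worse, emit text that can never be rehydrated).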
Shipped so far:
• full ONNX NER integration (local entity detection)
• migrated vault + audit logs from flat files → encrypted SQLite
• built CloakChat (simple local web interface using CloakPipe)
• optional offline-first sync support via PowerSync
• complete README + install guide + 3-minute demo video
CloakPipe works like Cloudflare for LLM privacy — a protective layer in front of model calls.
• No proprietary APIs
• No cloud dependency
• Runs fully local
• Designed for regulated environments and privacy-conscious developers
Feedback welcome 🙌
Especially from folks working on privacy infrastructure, agent tooling, or self-hosted AI stacks.