CloakPipe — Privacy-first Rust LLM proxy (automatic PII protection, runs fully local)

Rust-native privacy middleware that lets you use any LLM without leaking sensitive data: consistent pseudonymization on the way out, streaming rehydration on the way back.

Description

CloakPipe is a high-performance Rust middleware that sits between your application and any LLM provider (OpenAI, Claude, Grok, or local models).

It automatically detects and pseudonymizes sensitive data before requests leave your system, then restores the original values in responses — so you can safely use LLMs in regulated environments.
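The detect/pseudonymize/restore flow can be sketched as below. This is a minimal illustration, not CloakPipe's actual API: `Cloak`, `mask`, and `unmask` are hypothetical names, a plain `HashMap` stands in for the encrypted vault, and entity detection (regex / NER) is assumed to have already happened.

```rust
use std::collections::HashMap;

/// Stand-in for CloakPipe's vault: maps real values to stable
/// placeholders so the same entity always gets the same pseudonym.
struct Cloak {
    forward: HashMap<String, String>, // real value -> placeholder
    reverse: HashMap<String, String>, // placeholder -> real value
    counter: usize,
}

impl Cloak {
    fn new() -> Self {
        Cloak { forward: HashMap::new(), reverse: HashMap::new(), counter: 0 }
    }

    /// Replace a detected entity with a consistent placeholder.
    fn mask(&mut self, entity: &str, kind: &str) -> String {
        if let Some(p) = self.forward.get(entity) {
            return p.clone(); // seen before: reuse the same pseudonym
        }
        self.counter += 1;
        let placeholder = format!("[{}_{}]", kind, self.counter);
        self.forward.insert(entity.to_string(), placeholder.clone());
        self.reverse.insert(placeholder.clone(), entity.to_string());
        placeholder
    }

    /// Restore original values in the LLM response.
    fn unmask(&self, text: &str) -> String {
        let mut out = text.to_string();
        for (placeholder, real) in &self.reverse {
            out = out.replace(placeholder, real);
        }
        out
    }
}

fn main() {
    let mut cloak = Cloak::new();
    // Assume the detector found "Alice Rivera" (illustrative name).
    let p = cloak.mask("Alice Rivera", "PERSON");
    let prompt = format!("Summarize the contract signed by {}.", p);
    assert!(!prompt.contains("Alice Rivera")); // nothing sensitive leaves

    // Same entity later in the session -> same placeholder (consistency).
    assert_eq!(cloak.mask("Alice Rivera", "PERSON"), p);

    // The LLM echoes the placeholder back; the proxy restores it.
    let response = format!("{} agreed to the terms.", p);
    assert_eq!(cloak.unmask(&response), "Alice Rivera agreed to the terms.");
}
```

Consistency matters because the model can only reason coherently about "[PERSON_1]" across a conversation if the same person always maps to the same token.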

Problem it solves

Many teams in fintech, healthtech, legal, and enterprise either:

• block ChatGPT/Claude entirely (losing productivity)
• or risk GDPR / HIPAA exposure by sending sensitive data externally

CloakPipe acts as a privacy layer in front of LLM calls.

Core features (MIT licensed, fully FOSS)

• Rust proxy (<5ms overhead)
• consistent pseudonymization with encrypted vault
• streaming-safe rehydration (works with GPT-4o / Claude streaming)
• regex + ONNX NER detection (people, orgs, locations, amounts, etc.)
• local-first encrypted SQLite vault + audit logs
• minimal private chat UI (CloakChat) for instant local usage
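The streaming-safe rehydration above has one subtle requirement: a placeholder like "[PERSON_1]" can arrive split across two stream chunks, so the proxy must hold back a trailing partial match until the next chunk. A minimal sketch of that buffering logic (hypothetical, not CloakPipe's real implementation; `Rehydrator` and the bracket-delimited placeholder format are assumptions):

```rust
/// Buffers stream chunks so placeholders split across chunk boundaries
/// are only rehydrated once they are complete.
struct Rehydrator {
    pending: String, // tail that might be the start of a placeholder
}

impl Rehydrator {
    fn new() -> Self {
        Rehydrator { pending: String::new() }
    }

    /// Feed one stream chunk; returns text that is safe to emit now.
    /// `restore` stands in for the vault lookup.
    fn feed(&mut self, chunk: &str, restore: &dyn Fn(&str) -> String) -> String {
        self.pending.push_str(chunk);
        // The last '[' with no matching ']' may be a placeholder still
        // being streamed: keep that suffix buffered.
        let split = match self.pending.rfind('[') {
            Some(i) if !self.pending[i..].contains(']') => i,
            _ => self.pending.len(),
        };
        let ready: String = self.pending.drain(..split).collect();
        restore(&ready)
    }

    /// Flush whatever is left at end of stream.
    fn finish(&mut self, restore: &dyn Fn(&str) -> String) -> String {
        let rest = std::mem::take(&mut self.pending);
        restore(&rest)
    }
}

fn main() {
    // Stand-in for the vault lookup.
    let restore = |s: &str| s.replace("[PERSON_1]", "Alice Rivera");

    let mut r = Rehydrator::new();
    let mut out = String::new();
    // The placeholder arrives split across two chunks.
    out.push_str(&r.feed("Hello [PER", &restore));
    out.push_str(&r.feed("SON_1], welcome back.", &restore));
    out.push_str(&r.finish(&restore));
    assert_eq!(out, "Hello Alice Rivera, welcome back.");
}
```

The trade-off is a small amount of added latency only when a chunk ends mid-placeholder; everything else is forwarded immediately.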

What I built during FOSS Hack (March 2026)

• full ONNX NER integration (local entity detection)
• migrated vault + audit logs from flat files → encrypted SQLite
• built CloakChat (simple local web interface using CloakPipe)
• optional offline-first sync support via PowerSync
• complete README + install guide + 3-minute demo video

Why this matters

CloakPipe works like Cloudflare for LLM privacy — a protective layer in front of model calls.

• no proprietary APIs
• no cloud dependency
• runs fully local
• designed for regulated environments and privacy-conscious developers

Feedback welcome 🙌
Especially from folks working on privacy infrastructure, agent tooling, or self-hosted AI stacks.
