Your AI. Your data. Your keys.
Always.
Most of the AI you've used has been training on your conversations. Gemini does it. ChatGPT does it by default. The AI you've been using to think through your most personal problems has been feeding a training pipeline.
MEOK never has. Never will. Here's exactly how we enforce that — not with a policy promise, but with architecture.
Conversations trained on: zero
Encryption standard: AES-256
Max deletion window: 72 hours
Portable data: 100%
The four guarantees
Four guarantees. Enforced in architecture, not prose.
Not policy promises. Not legal boilerplate. Each one is a technical constraint that makes violation structurally impossible — with verifiable proof for each.
Zero training on your data
Architecturally impossible — not just a policy we might update.
Your conversations are never used to train, fine-tune, or improve any AI model for any other user. We have contractual zero-training agreements with every LLM provider we route to — but more importantly, the architecture makes it structurally impossible. Your data lives in your encrypted vault, which the training pipeline cannot reach.
Local-first storage
Your memories live in your Postgres instance. We are custodians, not landlords.
MEOK stores your sovereign memory in pgvector — an open-source Postgres extension with a fully documented, open schema. You can query it directly, back it up independently, and move it to any Postgres-compatible system without our involvement. The data is yours by architecture, not by contract.
Full portable export
One click. Standard JSON. No fee. No waiting. No data held back.
Export everything: all conversation history, all memory records, your AI's values and character settings, Birth Ceremony data, tags, categories, usage history. Human-readable. Machine-portable. The export schema is open and documented — any developer can write a tool to read it. This is not a courtesy feature. It is how you prove you actually own something.
Delete means delete
Every system. Every backup. 24 hours. Cryptographic proof.
When you delete, a cascade runs: primary Postgres, all vector embeddings, all cached responses, all CDN edge caches, all backup snapshots created after your registration date. We generate a deletion audit log with timestamps and a SHA-256 hash you can verify independently. Target: 24 hours. Maximum: 72 hours for backup propagation. We tell you when it's complete.
Technical architecture
How sovereignty is built in, not bolted on.
For the technically curious: here is every layer of the stack and exactly what it does to protect your data.
Your memories are stored as vector embeddings in a Postgres instance in your region. Embeddings are computed at inference time. No third-party memory providers. Open schema, fully documented.
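To make the "vector embeddings" part concrete: retrieval ranks your stored memories by similarity to the query embedding. The sketch below is a minimal, illustrative version of what pgvector's cosine-distance operator (`<=>`) computes inside Postgres; the vault contents and tiny 3-dimensional vectors are stand-ins, not real embedding data.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest_memories(query: list[float], vault: dict[str, list[float]], k: int = 2) -> list[str]:
    """Rank stored memories by similarity to the query embedding --
    conceptually the same operation pgvector runs with its <=> operator."""
    ranked = sorted(vault, key=lambda mid: cosine_similarity(query, vault[mid]), reverse=True)
    return ranked[:k]

vault = {
    "memory-1": [1.0, 0.0, 0.0],
    "memory-2": [0.0, 1.0, 0.0],
    "memory-3": [0.9, 0.1, 0.0],
}
print(nearest_memories([1.0, 0.0, 0.0], vault))  # ['memory-1', 'memory-3']
```

Because the schema is open, the same ranking can be reproduced with a plain SQL query against your own instance, with no MEOK code involved.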
Every memory record, every conversation, every value setting is encrypted at rest using AES-256. The encryption keys are derived from your account credentials. MEOK cannot decrypt your data without your session key.
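"Keys derived from your account credentials" usually means a key-derivation function. Here is a minimal sketch using PBKDF2-HMAC-SHA256 to produce a 256-bit AES key; the choice of KDF, the iteration count, and the salt handling are illustrative assumptions, not MEOK's actual parameters.

```python
import hashlib
import os

def derive_vault_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    """Derive a 256-bit key from account credentials with PBKDF2-HMAC-SHA256.
    KDF choice, salt policy, and iteration count are illustrative only."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations, dklen=32)

salt = os.urandom(16)          # stored alongside the ciphertext, never reused
key = derive_vault_key("correct horse battery staple", salt)
assert len(key) == 32          # 32 bytes = AES-256 key size
```

The property that matters: the key exists only when you supply your credentials, which is why MEOK cannot decrypt your vault without your session key.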
All data in transit is encrypted with TLS 1.3. Requests to LLM providers are routed through our servers and stripped of identifying information before dispatch.
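As a toy illustration of "stripped of identifying information before dispatch", the sketch below redacts two common identifier types before a request leaves the server. The patterns and placeholder labels are invented for this example; the real pipeline's redaction rules are not described here.

```python
import re

# Illustrative patterns only; the real pipeline's redaction rules differ.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

def strip_identifiers(message: str) -> str:
    """Replace identifying tokens with placeholders before the request
    is dispatched to an LLM provider."""
    for label, pattern in PATTERNS.items():
        message = pattern.sub(f"[{label}]", message)
    return message

print(strip_identifiers("Reach me at jane@example.com or +44 20 7946 0958"))
# Reach me at [email] or [phone]
```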
Any query you flag as sensitive — or any query covered by your privacy-first routing policy — is sent to your local Ollama instance. It never leaves your device. The Sovereign Display shows which route was used.
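The routing rule reduces to a small pure function: a sensitive flag or a privacy-first policy forces the local route. The sketch below is a hedged illustration; the endpoint strings, policy names, and `Query` shape are hypothetical, not MEOK's API.

```python
from dataclasses import dataclass

@dataclass
class Query:
    text: str
    flagged_sensitive: bool = False

# Hypothetical route names; real endpoints and policy labels may differ.
LOCAL_ROUTE = "ollama@localhost:11434"
CLOUD_ROUTE = "cloud-provider"

def route(query: Query, policy: str = "balanced") -> str:
    """A sensitive flag or a privacy-first policy forces local inference;
    everything else may be dispatched to a cloud provider."""
    if query.flagged_sensitive or policy == "privacy-first":
        return LOCAL_ROUTE
    return CLOUD_ROUTE

assert route(Query("health question", flagged_sensitive=True)) == LOCAL_ROUTE
assert route(Query("weather"), policy="privacy-first") == LOCAL_ROUTE
assert route(Query("weather")) == CLOUD_ROUTE
```

The Sovereign Display then surfaces whichever branch was taken, so the routing decision is observable rather than trusted.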
Every deletion operation generates a signed audit log entry. The log contains: timestamp, systems affected, a SHA-256 hash of the deleted data batch, and a confirmation receipt. Available for download.
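Independent verification of the audit hash can be done with a few lines of code: recompute SHA-256 over your own copy of the deleted batch and compare it to the value in the log. The field names below are assumptions for illustration, not MEOK's published log format.

```python
import hashlib
import json

def audit_entry(deleted_batch: bytes, systems: list[str], timestamp: str) -> dict:
    """Build a deletion audit entry. Field names are illustrative,
    not MEOK's published schema."""
    return {
        "timestamp": timestamp,
        "systems": systems,
        "sha256": hashlib.sha256(deleted_batch).hexdigest(),
    }

def verify(entry: dict, batch_copy: bytes) -> bool:
    """Recompute the hash from your own export copy -- no trust required."""
    return hashlib.sha256(batch_copy).hexdigest() == entry["sha256"]

batch = json.dumps({"conversation": "goodbye"}).encode()
entry = audit_entry(batch, ["postgres", "pgvector", "cdn", "backups"], "2025-06-01T12:00:00Z")
assert verify(entry, batch)
```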
The export format is open and documented at meok.ai/docs/export. Any developer can write a tool to read it. Any user can understand it. No proprietary lock-in.
Real-time transparency
The Sovereign Display.
Every conversation includes a live panel showing you exactly what's happening with your data in real time. No guessing. No trusting. Seeing.
Current model
Which LLM processed your last message. Provider, version, and routing path.
Data location
Local (Ollama — never left your device) or Cloud (provider name + region).
Memory status
How many memories are active, last sync timestamp, vault encryption status.
Covenant score
Real-time Maternal Covenant score for your current session. Six dimensions, live.
Hard limits
What MEOK physically cannot do with your data.
Not “what we promise not to do”. What the architecture makes technically impossible — regardless of who asks, including us.
Read your encrypted conversations (keys are yours)
Use your conversations to train any AI model
Sell, share, or license your data to any third party
Retain your data after deletion is confirmed
Target you with advertising based on your conversations
Share your data with governments without a valid UK court order (which we will publish)
Access your locally-routed Ollama conversations at all
Note on government requests: If we ever receive a valid UK court order compelling us to hand over data, we will publish a transparency notice in our monthly report (redacted where legally required). We will contest any order we believe to be overbroad. We have never received one.
Privacy questions
The questions you actually want answered.
For the privacy-first users who read the footnotes and check the architecture. We see you. These are for you.
Does MEOK use my data for RAG or retrieval?
Your data is used to answer your questions — that's retrieval-augmented generation working for you, not on you. MEOK retrieves relevant memories from your vault to improve responses to you. Your data is never retrieved to serve another user's experience, never used in batch training, and never pooled with other users' data in any shared index.
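The structural point in that answer is that retrieval is hard-scoped to the requesting user. A minimal sketch, with an in-memory stand-in for the per-user vault (in production this is your own Postgres/pgvector instance, not a shared index):

```python
# Illustrative in-memory vaults; real storage is a per-user pgvector instance.
VAULTS = {
    "alice": ["prefers morning meetings", "allergic to peanuts"],
    "bob": ["learning Rust"],
}

def retrieve(user_id: str, query: str) -> list[str]:
    """Retrieval reads only the requesting user's vault: there is no
    code path here that can touch another user's memories."""
    vault = VAULTS.get(user_id, [])
    return [m for m in vault if any(word in m for word in query.lower().split())]

assert retrieve("alice", "meetings") == ["prefers morning meetings"]
assert retrieve("bob", "meetings") == []   # Bob's query never sees Alice's vault
```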
What about the AI model provider — do they see my data?
When you send a message, it passes to the LLM provider (Claude, GPT, DeepSeek, etc.) for inference. We have contractual zero-training agreements with all providers — they process the message, return the result, and are contractually barred from retaining it for training. For maximum privacy, use Ollama routing: your query never leaves your device. The Sovereign Display shows exactly which provider processed each message.
Can I self-host MEOK?
Self-hosting is on the roadmap for H2 2026. The architecture is already designed for it: your memories live in a standard Postgres + pgvector instance you control, and the schema is open and documented. Full Docker images and self-host documentation will be published at launch. Email hello@meok.ai to be notified.
What's in the data export?
Everything. All conversation history, all memory records (semantic and episodic), your AI's values and character settings, Birth Ceremony data, all tags and categories, your usage history, and all exported integrations. In portable JSON format. Machine-readable and human-readable. The export schema is documented at meok.ai/docs/export. No data is held back. No export fee.
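"Any developer can write a tool to read it" can be demonstrated in a dozen lines. The top-level section names below are assumptions for illustration; the authoritative schema is the one documented at meok.ai/docs/export.

```python
import json

# Section names assumed for illustration; see meok.ai/docs/export for the real schema.
EXPECTED_KEYS = {"conversations", "memories", "values", "tags"}

def load_export(raw: str) -> dict:
    """Parse an export file and check it carries the expected sections."""
    data = json.loads(raw)
    missing = EXPECTED_KEYS - data.keys()
    if missing:
        raise ValueError(f"export missing sections: {sorted(missing)}")
    return data

sample = json.dumps({"conversations": [], "memories": [], "values": {}, "tags": []})
export = load_export(sample)
assert set(export) == EXPECTED_KEYS
```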
When you say delete means delete — what does that mean technically?
When you trigger deletion, a cascade runs across: your primary Postgres instance, all vector embeddings in pgvector, all cached responses, all CDN/edge caches, and all backup snapshots. We generate a deletion audit log with timestamps and a SHA-256 hash you can verify independently. Target: 24 hours. Maximum: 72 hours for backup propagation. We tell you when it's complete.
Does MEOK use my conversations to improve the product for other users?
No. We may collect aggregate, anonymised analytics (e.g. “X% of users access Work OS”) but never the contents of conversations or memories. We cannot access your encrypted data, and we don't want to. Product improvements come from user feedback, our own testing, and research — never from reading your conversations.
The difference nobody talks about
Your AI can't be bought.
Because you own it.
Nobody can acquire MEOK and change the rules.
Google can update its privacy policy overnight. Any AI company can be acquired and its incentives flipped against you. MEOK's Maternal Covenant is constitutional — it cannot be amended to remove care. The architecture prevents it. Sovereignty isn't a promise. It's a constraint.
The only AI that gets smarter without taking from you.
Every other AI needs your data piped back to their servers to improve. That's the trade — you get a “free” service, they get your life as training data. MEOK improves locally. In your sovereign space. Your AI learns for you, on your hardware, under your control. You're not the product. You're the person.
Your AI. Not theirs.
Free forever for individuals. Because sovereignty is a right, not a subscription.
What would it feel like if an AI actually owned nothing of yours?
Free to start. Sovereign from your first conversation. No credit card. No training on your data — ever.
Export everything. Delete everything. Self-host when it ships. Yours from day one.