AI Privacy on Mac: What You Need to Know
Where does your AI conversation data go? A clear guide to privacy, local storage, API keys, and data handling on Mac AI apps.
Every time you type a message into an AI app, that text goes somewhere. Understanding where it goes, who stores it, and what they do with it is essential if you discuss anything sensitive with AI.
How AI Chat Apps Handle Your Data
There are three layers to consider:
1. The App Layer (Client)
This is the software on your Mac. It stores your conversation history, settings, and credentials. The key questions:
- Where is history stored? Locally on your Mac, or on the company’s servers?
- Is an account required? Account-based apps tie your data to an identity
- How are credentials stored? API keys should be in the macOS Keychain, not in plain text files
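The Keychain point is worth making concrete. On macOS, the built-in `security` command line tool can write a generic password item into the Keychain. A minimal sketch, assuming a hypothetical service name (`com.example.aichat`); a real app would use the Security framework directly, but the CLI shows the same idea:

```python
import subprocess

SERVICE = "com.example.aichat"  # hypothetical service identifier

def keychain_add_cmd(account: str, secret: str) -> list[str]:
    # Build the argv for the macOS `security` CLI.
    # -U updates the item if it already exists instead of failing.
    return ["security", "add-generic-password",
            "-s", SERVICE, "-a", account, "-w", secret, "-U"]

def store_api_key(account: str, secret: str) -> None:
    # Runs only on macOS; the OS encrypts the item at rest.
    subprocess.run(keychain_add_cmd(account, secret), check=True)
```

Contrast this with the anti-pattern: a key sitting in a plain-text config file is readable by any process running as your user.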
2. The Transport Layer (API)
When you send a message, it travels over the internet to an AI provider. The key questions:
- Is the connection encrypted? HTTPS is standard and expected
- Does the request pass through intermediaries? Some apps route through their own servers before reaching the AI provider
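Both transport questions can be checked mechanically. A minimal sketch, using hypothetical endpoint URLs: the first function enforces the HTTPS minimum, the second flags whether a request is going straight to the provider or to an intermediary host:

```python
from urllib.parse import urlparse

def is_encrypted_endpoint(url: str) -> bool:
    # HTTPS (TLS) is the minimum bar for any AI API traffic.
    return urlparse(url).scheme == "https"

def goes_direct(url: str, provider_host: str) -> bool:
    # True when the request targets the provider's own host;
    # False means an intermediary (e.g. an app vendor's proxy) is in the path.
    return urlparse(url).hostname == provider_host
```

A proxied setup is not automatically bad, but you should know it exists, because the proxy operator can technically see request contents.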
3. The Model Layer (Provider)
The AI provider (OpenAI, Anthropic, Google, etc.) processes your message. The key questions:
- Is your data used for training? Most providers let you opt out, but the default varies
- How long is data retained? Providers may log requests for abuse monitoring
- What jurisdiction? Data processed in the US is subject to US law
ChatGPT Privacy
OpenAI’s approach:
- Account required: Yes. Your conversations are tied to your OpenAI account
- Training data: By default, your conversations may be used to improve models. You can opt out in settings, but many users do not know this
- Data storage: Conversations are stored on OpenAI’s servers
- API vs Consumer: API usage has better privacy terms than the consumer ChatGPT product. API data is not used for training by default
Claude Privacy
Anthropic’s approach:
- Account required: Yes for claude.ai. API access does not require linking conversations to a consumer account
- Training data: Commercial API data is not used for training. Consumer chat data handling varies
- Data retention: API requests may be logged for safety monitoring with limited retention
The BYOK Privacy Advantage
When you use Bring Your Own Key (BYOK), the data flow is short and easy to trace:
- Your message goes from the Chapeta app to Chapeta’s API proxy
- Chapeta forwards it to OpenRouter
- OpenRouter routes it to the model provider, and the response returns through the same path
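The hop sequence above can be written down as data, which makes the "response returns through the same path" point explicit. Hop names here are illustrative labels, not technical identifiers:

```python
# Hop sequence from the BYOK flow described above; names are illustrative.
REQUEST_PATH = [
    "Chapeta app (your Mac)",
    "Chapeta API proxy",
    "OpenRouter",
    "Model provider",
]

def response_path(request_path: list[str]) -> list[str]:
    # The response retraces the same hops in reverse order.
    return list(reversed(request_path))
```

Every party on that list can, in principle, see request contents, which is why the provider-level terms in the previous sections still matter under BYOK.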
Chapeta keeps conversation history on your local filesystem. API keys are stored in macOS Keychain (encrypted by the OS). Chapeta does not use your prompts or responses to train models.
The providers still process your data, but API usage generally has better privacy terms than consumer product usage. Chapeta uses email sign-in for account identity, but conversation content is never stored on Chapeta servers; it stays on your Mac.
What Chapeta Does Differently
Chapeta’s privacy model:
- Email sign-in only: no passwords, no social logins. Chapeta still uses technical identifiers for entitlement and billing flows
- Local storage: All conversations, settings, and skills are stored on your Mac’s filesystem
- Telemetry scope: The app has no in-app analytics SDKs or ad trackers; the website uses PostHog page and interaction analytics
- Keychain storage: API keys are stored in macOS Keychain, encrypted by the operating system
- Proxy routing: In both BYOK and Pro modes, requests are routed through Chapeta’s API proxy before OpenRouter/provider delivery
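One practical upside of local-only history is that you can audit it yourself. A minimal sketch, assuming a hypothetical storage directory and JSON-file-per-conversation layout (the app's actual path and format may differ):

```python
import pathlib

# Hypothetical location; check the app's documentation for the real path.
DEFAULT_HISTORY_DIR = pathlib.Path.home() / "Library" / "Application Support" / "Chapeta"

def list_conversations(history_dir: pathlib.Path = DEFAULT_HISTORY_DIR) -> list[str]:
    # Local-only storage means an audit is just reading files on disk:
    # you can list, inspect, back up, or delete conversations directly.
    return sorted(p.name for p in history_dir.glob("*.json"))
```

With server-side storage, by contrast, you can only see what the vendor's UI chooses to show you.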
Practical Privacy Tips
- Use API access over consumer products when privacy matters. API terms are generally better
- Check your provider’s training opt-out. OpenAI, Anthropic, and Google all offer this
- Do not paste passwords, API keys, or secrets into any AI chat. The model does not need them and they will be transmitted to the provider
- Use local models for the most sensitive work. Chapeta does not support local models currently, but this is a general best practice
- Prefer BYOK over managed access when privacy is a priority
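The "no secrets in prompts" tip can be enforced with a simple pre-send filter. A minimal sketch with hypothetical patterns; real key formats vary by provider, so a production filter would need a broader pattern set:

```python
import re

# Hypothetical patterns; real secret formats vary by provider.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9_\-]{16,}"),  # OpenAI-style API keys
    re.compile(r"AKIA[0-9A-Z]{16}"),         # AWS access key IDs
]

def redact(text: str) -> str:
    # Replace anything that looks like a credential before the
    # prompt ever leaves the machine.
    for pattern in SECRET_PATTERNS:
        text = pattern.sub("[REDACTED]", text)
    return text
```

This is a safety net, not a substitute for the habit: pattern matching will always miss some secret formats.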
Honest Limitations
No cloud AI is fully private. When you send a message to GPT or Claude, that text leaves your Mac and is processed by a third party. The provider makes privacy promises, but you are trusting their compliance. For truly private AI, you would need to run models locally on your Mac. Chapeta does not currently support local models through Ollama or similar tools. If absolute privacy is your requirement, local model inference is the only answer.