Granola, a popular AI-powered note-taking app, is making users’ “private” notes viewable to anyone with a link. The company also uses these notes to train its AI systems unless users manually opt out.
This is a big problem because people often store sensitive information in their notes – meeting details, personal thoughts, work documents, and confidential discussions. What users thought was private is actually accessible to strangers who stumble across the right link.
When “Private” Isn’t Really Private
Granola markets itself as keeping notes “private by default,” but its actual sharing settings tell a different story. The app automatically grants link-based access to notes, meaning they could end up indexed by search engines or be shared accidentally through browser history.
The AI training issue adds another layer of concern. Every note you write could become part of Granola’s machine learning system, potentially surfacing in suggestions shown to other users. The company does offer an opt-out, but it’s buried in settings that most people never check.
This isn’t unusual in the AI world – many companies use customer data to improve their products. But note-taking apps hold especially sensitive information, making the stakes much higher.
What You Should Do
If you use Granola, review your privacy settings immediately. Look for sharing permissions and AI training options, then adjust them to match what you actually want. Consider whether the convenience of AI-powered notes is worth the privacy trade-offs.
Expect more scrutiny of AI apps’ privacy practices as users become more aware of how their personal data gets used.