This Mental Wellness App Will Tell You Exactly What It Can and Can't See
Most privacy pages are written to be unreadable. Daylogue is publishing an honest map: what's end-to-end encrypted so not even our own engineers can read it, and what we can see on the server and why. No one in the category does this. We think it's overdue.
LOS ANGELES, CA, July 15, 2026 /PRNewswire/ -- Daylogue today published an honest, line-by-line account of what its servers can and cannot read. It is the kind of document no mental wellness app has ever put out, in a category where the FTC has now repeatedly found the leading apps lying about exactly this. Daylogue's position: if people are going to trust a pattern journal with the most sensitive content they produce, they deserve to know precisely which parts of it are encrypted past the company's reach and which parts the company handles in plaintext, and why.
The short version: when a user writes or speaks an entry inside the Daylogue app, that entry is encrypted on their device with AES-256-GCM before it leaves the phone. Daylogue's servers hold only scrambled data. No Daylogue employee can read those entries. A court subpoena, a data breach, or a future acquirer would not produce a single readable in-app entry. That is a cryptographic property of the system, not a policy promise.
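The on-device pattern described above can be sketched in a few lines. This is an illustrative sketch, not Daylogue's code: the only detail taken from the announcement is the cipher choice (AES-256-GCM applied before upload). Key generation, key storage, and the nonce-plus-ciphertext wire format are assumptions, shown here with Python's widely used `cryptography` library.

```python
# Illustrative sketch of client-side AES-256-GCM encryption, as described in
# the announcement. Key handling and the blob layout are assumptions.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_entry(key: bytes, entry_text: str) -> bytes:
    """Encrypt a journal entry on-device; the server stores only this blob."""
    nonce = os.urandom(12)  # 96-bit nonce, fresh per entry (GCM requirement)
    ciphertext = AESGCM(key).encrypt(nonce, entry_text.encode("utf-8"), None)
    return nonce + ciphertext  # nonce travels alongside the ciphertext

def decrypt_entry(key: bytes, blob: bytes) -> str:
    """Decrypt on-device; GCM authentication fails loudly on any tampering."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None).decode("utf-8")

# In practice the key would be derived and kept on the user's device only.
key = AESGCM.generate_key(bit_length=256)
blob = encrypt_entry(key, "slept badly, anxious before the meeting")
assert decrypt_entry(key, blob) == "slept badly, anxious before the meeting"
```

Because the key never leaves the device in this design, the server-side blob is unreadable to the operator, which is the "cryptographic property, not a policy promise" distinction the release draws.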
The longer version, the part most apps don't publish, is that not every pathway into Daylogue has the same protections, and Daylogue wants users to know where the lines are:
- SMS check-ins arrive through a carrier gateway, which means Daylogue's servers receive them in plaintext and encrypt them at rest. If end-to-end privacy matters to a user for a specific entry, the in-app path is the one to use.
- Email check-ins work the same way as SMS: the inbox receives plaintext at the boundary, and entries are encrypted at rest thereafter.
- Voice entries are transcribed by Deepgram, a third-party voice-to-text provider under contract with data-handling terms; the resulting text is what lives in Daylogue.
- AI-generated summaries, the "what Daylogue has noticed about you" text, are produced by models that need plaintext to do the work, so summaries are readable to Daylogue's systems during generation and stored encrypted at rest.
That map is the point. Mental wellness apps have been caught sharing sensitive user data with advertisers, hiding data practices, and building business models that relied on selling the most personal information people produce online. The FTC has fined, settled with, or investigated more than a few. The pattern is not that these apps had no privacy story. It is that the privacy story they told the public did not match what was happening inside the system. Daylogue's wager is that users would rather read a map than a marketing page.
"A privacy policy no one can read isn't a privacy policy, it's a liability shield," said Brandon Bibbins, Founder and CEO of Daylogue. "We built the app so the most sensitive part, the entries a user types or speaks inside the app, is encrypted on their phone before it leaves, and we can't read it. The other paths into the system aren't like that, and pretending they are would be the same lie the category has been caught telling. We'd rather be specific. People can decide what they want to put where."
Daylogue also runs a three-tier crisis detection system in every check-in, text and voice, with access to more than 55 vetted mental health resources. The system is designed to recognize signs of serious distress within the natural flow of a check-in and surface the right resources quietly, without derailing the experience. Daylogue is clear about what it is and is not: a wellness app, not a clinical tool. The crisis system exists so that someone in acute distress is never left without a path to real help.
For schools and employers using Daylogue at scale, the platform applies k-anonymity, a privacy method that works like this: no data point is shown in a group dashboard unless at least five people share it. That means administrators see patterns across their community. They never see any individual person's data.
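The k = 5 suppression rule described above reduces to a small aggregation check. This is a minimal sketch under assumptions: the metric names, the report shape, and the counting of distinct contributors are illustrative, not Daylogue's actual schema.

```python
# Minimal sketch of a k-anonymity suppression rule (k = 5): a metric appears
# in a group dashboard only if at least K distinct users contribute to it.
# Data shapes and metric names are illustrative assumptions.
K = 5

def group_dashboard(reports: list[tuple[str, str]]) -> dict[str, int]:
    """reports: (user_id, metric) pairs. Return contributor counts for
    metrics with >= K distinct contributors; suppress everything else."""
    contributors: dict[str, set[str]] = {}
    for user_id, metric in reports:
        contributors.setdefault(metric, set()).add(user_id)
    return {m: len(users) for m, users in contributors.items() if len(users) >= K}

reports = [(f"u{i}", "sleep_disruption") for i in range(6)] + \
          [("u1", "work_stress"), ("u2", "work_stress")]
print(group_dashboard(reports))  # → {'sleep_disruption': 6}
```

With only two contributors, "work_stress" is suppressed entirely rather than shown with a small count, which is what keeps any individual's pattern out of an administrator's view.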
"Mental wellness data is the most sensitive information a person can generate digitally, and the industry has treated it as an afterthought," said Marcus M., Head of Strategy and Partnerships at Daylogue. "What we're doing differently isn't inventing some perfect privacy machine. It's being specific. Here's what we can't read. Here's what we can. Here's why. Enterprise buyers have never gotten that document from a wellness vendor before. We think the honesty is the product."
Daylogue's privacy and safety systems, in plain terms:
- End-to-end encryption on in-app entries: text and voice entries written inside the Daylogue app are encrypted with AES-256-GCM on the user's device before upload. Daylogue's servers hold only ciphertext for these entries. No Daylogue employee can read them, and a subpoena or breach would not produce them in readable form.
- Plaintext at the boundary for SMS and email entries: check-ins sent by SMS or email arrive at Daylogue's gateway in plaintext because that is how those protocols work. They are encrypted at rest thereafter. Users who want end-to-end protection for a specific entry should use the in-app path.
- Voice transcription via Deepgram: voice audio is sent to Deepgram, a third-party voice-to-text provider under contract, and the resulting text is what Daylogue stores. Daylogue does not retain voice audio.
- AI summaries are readable to Daylogue during generation: the "what Daylogue has noticed about you" summaries are produced by models that require plaintext input, so summaries are readable to Daylogue's systems at the moment they are generated. They are encrypted at rest.
- No training on private data without explicit consent: Daylogue does not use any user's content, entries, summaries, or metadata, to train its AI without that user's specific, opt-in consent.
- Three-tier crisis detection: active in every check-in, with 55+ vetted crisis and mental health resources ready to surface when needed.
- k-anonymity on group reports: no metric appears in an organizational dashboard unless at least five users contribute to it; individual patterns are never visible to administrators.
- HIPAA-readiness architecture: Daylogue is built to the technical standards required for healthcare compliance, supporting organizations that need it as the platform scales toward formal certification.
- Clear non-clinical positioning: every part of the app communicates that Daylogue is a companion to professional care, not a replacement for it.
Daylogue is available on iOS and web at daylogue.io. Access for schools, workplaces, and organizations is available at daylogue.io/enterprise.
About Daylogue
Daylogue is a pattern journal that reads your past entries and detects the emotional patterns running through them. Instead of a stack of separate journal entries, you get a short, plain-language summary that updates over time: what topics keep coming back, when a pattern is repeating, what's shifted in the last few weeks. Daylogue is not therapy and is not a replacement for professional care. It is a private space on your phone for honest reflection, a companion to therapy, to hard conversations, and to the days when you want to know yourself a little better. Entries written inside the Daylogue app are end-to-end encrypted on your device before upload, so Daylogue cannot read them. (SMS and email check-ins, and AI-generated summaries, are handled on the server and are not end-to-end encrypted. See Daylogue's privacy page for the full map.) Founded by Brandon Bibbins, Daylogue is independent and available on iOS and web at daylogue.io.
Media Contact
Daylogue
hello@daylogue.io
daylogue.io
SOURCE Daylogue