Ethics Principles
What we believe about AI, emotional data, and your autonomy.
Daylogue handles something personal: how you feel, what you think about, and what patterns run through your days. That responsibility shapes every decision we make. These are the principles we build by. They are not aspirational. They describe how the product works right now.
Self-awareness, not diagnosis
Daylogue helps you notice patterns in your emotional life. It does not tell you what is wrong with you. It does not diagnose conditions, prescribe actions, or provide clinical guidance of any kind.
When the app surfaces a pattern, it is offering an observation, not a conclusion. "Your stress has been elevated for three weeks" is different from "You have an anxiety disorder." Daylogue does the first. It will never do the second.
The language throughout the product reflects this distinction. We deliberately use non-clinical terms: "rising" instead of "recovery," "dipping" instead of "decline," "sustained elevated stress" instead of "burnout risk." Patterns are information for you to reflect on, not labels to carry.
Privacy as architecture, not policy
Many apps promise privacy in their terms of service. Daylogue builds privacy into the system itself.
Journal entries are encrypted on your device using AES-256-GCM before they leave it. Your encryption keys stay on your devices. Daylogue employees cannot read your raw entries, because we do not have the keys to decrypt them.
This is a deliberate architectural choice. A privacy policy can change. A terms-of-service update can quietly expand what a company does with your data. But when encryption happens on your device and the keys never leave it, there is no policy change that gives us access. The math prevents it.
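For readers who want to see the shape of this, here is a minimal sketch of what on-device AES-256-GCM encryption can look like in a browser, using the standard Web Crypto API. The function names are illustrative; this is not Daylogue's actual client code.

```typescript
// Illustrative sketch, not Daylogue's client code: the shape of
// on-device AES-256-GCM encryption using the standard Web Crypto API.

async function generateLocalKey(): Promise<CryptoKey> {
  // extractable: false -- the browser never hands raw key material
  // back to application code, so it cannot be uploaded anywhere.
  return crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt", "decrypt"],
  );
}

async function encryptEntry(key: CryptoKey, plaintext: string) {
  // Fresh 96-bit nonce per entry, as AES-GCM requires.
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const ciphertext = await crypto.subtle.encrypt(
    { name: "AES-GCM", iv },
    key,
    new TextEncoder().encode(plaintext),
  );
  // Only { iv, ciphertext } is uploaded. Without the key, the server
  // holds bytes it cannot read.
  return { iv, ciphertext: new Uint8Array(ciphertext) };
}
```

When the key is generated as non-extractable and never uploaded, no later server-side change can expose entries; the guarantee lives in where the key is, not in a policy document.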
No manipulation
Daylogue does not use streaks, guilt, or dark patterns to keep you coming back. There are no badges to earn, no counters to protect, no passive-aggressive notifications when you miss a day.
An independent ethics audit in February 2026 scored Daylogue 87 out of 100 across privacy, data handling, consent, gamification, and emotional safety. One of the outcomes was the complete removal of all remaining gamification elements. Streaks were already minimal, but the audit identified them as potentially harmful to genuine reflective practice. They are now gone entirely.
The goal is for you to use Daylogue because it is useful, not because it is addictive. If you stop finding value in it, you should stop using it. A wellness tool that manipulates you into engagement is not a wellness tool.
Transparency about AI
Daylogue uses AI to generate follow-up questions, extract mood and energy signals from your words, identify patterns across entries, and write narrative summaries. This requires your entries to be briefly decrypted and sent to an AI provider (AWS Bedrock) for processing.
That is a real tradeoff. End-to-end encryption protects your entries at rest and in transit, but AI features require a brief decryption window. We are transparent about this because hiding it would be dishonest. The AI provider does not store or log your content, and your entries are re-encrypted before storage. But we want you to know the tradeoff exists.
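To make the tradeoff concrete, here is a hypothetical sketch of that window using the AWS SDK's Bedrock Converse API. The function names, model id, and payload are illustrative, not our production pipeline; `encryptEntry` refers to the client sketch above.

```typescript
// Hypothetical sketch of the decrypt-process-re-encrypt window, not our
// production pipeline. StoredEntry and decryptEntry mirror the client
// sketch above; the model id and payload are illustrative.
import {
  BedrockRuntimeClient,
  ConverseCommand,
} from "@aws-sdk/client-bedrock-runtime";

type StoredEntry = { iv: Uint8Array; ciphertext: Uint8Array };

async function decryptEntry(key: CryptoKey, e: StoredEntry): Promise<string> {
  const buf = await crypto.subtle.decrypt(
    { name: "AES-GCM", iv: e.iv },
    key,
    e.ciphertext,
  );
  return new TextDecoder().decode(buf);
}

const bedrock = new BedrockRuntimeClient({ region: "us-east-1" });

async function summarizeEntry(key: CryptoKey, stored: StoredEntry) {
  // 1. The brief decryption window: plaintext exists only for this call.
  const plaintext = await decryptEntry(key, stored);

  // 2. Processing by the AI provider. Per the policy above, the provider
  //    does not store or log the content.
  const response = await bedrock.send(
    new ConverseCommand({
      modelId: "anthropic.claude-3-haiku-20240307-v1:0", // illustrative
      messages: [{ role: "user", content: [{ text: plaintext }] }],
    }),
  );

  // 3. Anything derived from the entry is re-encrypted (encryptEntry,
  //    shown earlier) before storage; plaintext is never persisted.
  const summary = JSON.stringify(response.output?.message?.content);
  return encryptEntry(key, summary);
}
```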
For a detailed explanation of how AI processing works, see How Daylogue Uses AI.
Crisis safety
If someone expresses crisis-level distress during a check-in, Daylogue does not attempt to handle it itself. This is intentional.
A wellness app is not equipped to handle a mental health crisis. Pretending otherwise would be dangerous. Instead, Daylogue immediately surfaces real crisis resources, including the 988 Suicide and Crisis Lifeline. Voice sessions automatically end within two seconds of surfacing these resources to ensure the person can access professional help without delay.
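As a sketch, the boundary looks something like this. Every name here is invented to show the shape of the rule, not Daylogue's implementation.

```typescript
// Every name here is invented; this shows the shape of the rule, not
// Daylogue's implementation.
interface VoiceSession {
  showResources(resources: string[]): void;
  end(): void;
}

const CRISIS_RESOURCES = [
  "988 Suicide and Crisis Lifeline: call or text 988",
];

function onCrisisDetected(session: VoiceSession): void {
  // Surface real resources immediately. No model-generated reply is
  // produced; the app does not try to counsel.
  session.showResources(CRISIS_RESOURCES);

  // Hard stop: the voice session ends within two seconds, always.
  setTimeout(() => session.end(), 2_000);
}
```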
This boundary is non-negotiable. Daylogue will never try to talk someone through a crisis, because doing so would put them at risk.
Data ownership
Your data is yours. This is not a marketing line. It describes how the product is built.
- Export anytime. You can export all your data whenever you want. No hoops, no waiting periods.
- Delete anytime. You can delete your account and all associated data. Deletion is real, not a soft archive.
- No data sales. Daylogue does not sell your data to anyone, for any reason.
- No model training. Your entries are never used to train AI models. This is a policy, not a toggleable setting.
Scope discipline
Daylogue is a wellness tool. It is not a medical device, a clinical resource, or a substitute for professional help. It stays within its lane and clearly communicates what it is and what it is not.
Scope disclaimers appear throughout the product. The AI is prompted to stay within wellness boundaries. The language is deliberately non-clinical. This is not accidental. It is a design principle.
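As an illustration, a scope guardrail in a system prompt might look like the following. The wording is hypothetical; it shows the kind of constraint we mean, not our actual prompt.

```typescript
// Hypothetical wording: this shows the kind of constraint, not
// Daylogue's actual system prompt.
const SCOPE_GUARDRAIL = `
You are a wellness reflection assistant, not a clinician.
- Never diagnose, name conditions, or recommend treatments.
- Describe patterns in non-clinical language: "sustained elevated
  stress", not "burnout risk".
- If the user expresses crisis-level distress, stop generating and
  defer to the crisis-resource flow.
`;
```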
Scope creep in wellness apps is dangerous. A tool that gradually starts acting like a therapist puts users at risk by creating a false sense of clinical support. Daylogue resists that drift by building scope boundaries into the product itself, rather than relying on users to understand the limitations on their own.
Related pages
- How Daylogue Uses AI: Plain-language guide to AI processing and data handling
- How Daylogue Generates Insights: What we measure, how we measure it, and what the patterns mean
- Security Architecture: Technical overview of encryption, authentication, and data protection
Ready to see your patterns?
Two minutes a day. No blank pages. No streaks. Just questions that lead somewhere.
Try your first check-in