ChatGPT Is Showing Ads Now. Here's What That Means If You Use It as a Journal.

OpenAI rolled out ads in ChatGPT in February 2026. Millions of people use it to process feelings, vent about relationships, and talk through hard days. When a free product starts showing ads, the math changes.

Brandon
Founder
March 16, 2026 · 6 min read · Mental Wellness

The News

On February 9, 2026, OpenAI started rolling out ads in ChatGPT. If you're on the free tier or the $8/month Go plan, you'll see sponsored content at the bottom of responses. The first wave of advertisers includes Target, Audible, Ford, Adobe, and Williams-Sonoma, with major ad agencies like WPP, Omnicom, and Dentsu buying placements at roughly $60 CPM.

OpenAI says ads won't influence ChatGPT's answers. They say conversations stay private from advertisers. They say users can control ad personalization in settings.

All of that may be true today. But the introduction of ads changes the relationship between you and the product. And if you've been using ChatGPT to process your feelings, that shift is worth understanding.

People Are Already Using ChatGPT as a Journal

This isn't hypothetical. Millions of people talk to ChatGPT about the most private parts of their lives. They vent about their boss. They process a breakup. They work through anxiety at 2am when nobody else is awake.

A 2025 study published in the Journal of Psychiatric Research analyzed over 1,500 Reddit posts from people using ChatGPT therapeutically. The patterns were clear: people were using it to externalize thoughts through journaling, prepare for therapy sessions, and process traumatic experiences. Some reported that ChatGPT understood them better than most people in their lives.

One viral post described feeding an AI seven years of daily journal entries, more than 1,500 in all, capturing the author's innermost thoughts. The AI could recall patterns the writer had never seen, connections across years of emotional data. The writer described becoming "an emotional wreck" three hours in because the AI articulated things about their life that no one else had.

People aren't just chatting with these tools. They're confiding in them.

What Ads Actually Mean for Your Data

Here's the thing about ad-supported products. When a product is free, the business model has to come from somewhere. And in advertising, that somewhere is your attention, shaped by your data.

OpenAI's updated privacy policy, revised on February 9, 2026, spells out how this works. If you have ChatGPT's memory feature turned on and ad personalization enabled, the platform may reference your stored memories when selecting which ads to show you. Your conversation context is used as a signal for ad targeting. Advertisers don't see your actual chats. But your chats inform what you see.

Think about what that looks like in practice. You tell ChatGPT you're stressed about money. The system notes the context. Your next few sessions include a sponsored financial planning tool at the bottom. You talk about sleep problems. An ad for a mattress brand appears. You discuss relationship struggles. A couples therapy platform gets surfaced.

OpenAI says advertisers only receive aggregated metrics: total views and clicks. No individual data. But the targeting itself requires the system to understand you well enough to match you with relevant ads. That understanding comes from your conversations.
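
To make those mechanics concrete, here's a rough sketch of how context-based ad targeting generally works on any platform. This is illustrative Python, not OpenAI's actual system; every name in it is hypothetical. What it demonstrates is the asymmetry the policy describes: the platform reads your conversation-derived signals to pick an ad, while the advertiser only ever receives aggregate counts.

```python
# Illustrative sketch of context-based ad targeting in general.
# This is NOT OpenAI's code; every name here is hypothetical.
# It shows the key asymmetry: the platform uses conversation-derived
# signals to select ads, while advertisers only see aggregate numbers.

from collections import Counter

# Hypothetical mapping from inferred conversation topics to ad categories
TOPIC_TO_AD_CATEGORY = {
    "money_stress": "financial_planning",
    "sleep_problems": "mattresses",
    "relationship_struggles": "couples_therapy",
}

# Aggregate counters: the only numbers an advertiser would ever see
impressions = Counter()

def select_ad(user_memory_topics):
    """Platform-side: match stored 'memory' topics to an ad category."""
    for topic in user_memory_topics:
        category = TOPIC_TO_AD_CATEGORY.get(topic)
        if category:
            impressions[category] += 1  # logged in aggregate only
            return category
    return None

# The user's topics never leave the platform; the aggregate report does.
ad = select_ad(["sleep_problems"])
print("ad shown:", ad)                          # mattresses
print("advertiser report:", dict(impressions))  # {'mattresses': 1}
```

Notice that nothing in the advertiser report identifies you. But the selection step only works because the system has already built a model of what you've been talking about.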

And there's a history here worth noting. In August 2025, researchers discovered that nearly 100,000 shared ChatGPT conversations had been indexed by Google. People's private discussions about mental health, abuse survival, career fears, and relationship problems were searchable on the open web. OpenAI called it a "short-lived experiment" and pulled the feature. But the conversations were already out there.

That's not an argument that OpenAI is careless. It's an argument that when your most private thoughts live in someone else's system, things can go wrong in ways you didn't expect.

A Quick Comparison

I'll be straightforward about where Daylogue sits in this, because it's relevant and because I'd want to know if I were reading this.

ChatGPT (Free/Go tier):

  • Conversations not end-to-end encrypted; OpenAI's systems can read them
  • Now ad-supported, with conversation context used for ad targeting
  • Memory feature can inform ad personalization if both are enabled
  • Conversations can be used for model training (you can opt out in settings)
  • Nearly 100,000 shared conversations were indexed by Google in August 2025
  • No specialized handling for emotional or wellness data

Daylogue:

  • Entries encrypted at rest with user-held keys (see the sketch after this list for the general pattern)
  • No ads. No plans for ads. Revenue comes from subscriptions.
  • Never sells data. Never shares it with advertisers.
  • AI processing uses AWS Bedrock with zero data retention (prompts aren't stored or used for training)
  • Built specifically for emotional data, with privacy as a design constraint, not a marketing line
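
For anyone wondering what "user-held keys" means in practice, here's a minimal sketch of the general pattern: the entry is encrypted on your device before it's uploaded, so the server only ever stores ciphertext it can't read. This uses Python's cryptography library purely as an illustration of the architectural idea, not Daylogue's actual implementation.

```python
# Minimal sketch of client-side encryption with a user-held key.
# Illustrative only -- not Daylogue's actual code.
# Requires: pip install cryptography

from cryptography.fernet import Fernet

# The key is generated and kept on the user's device.
# The server never sees it, so it can never read the entries.
key = Fernet.generate_key()
f = Fernet(key)

entry = "Rough day. Argued with my boss and couldn't sleep."
ciphertext = f.encrypt(entry.encode())

# Only `ciphertext` is uploaded; it's unreadable without the key.
print(ciphertext[:40], b"...")

# Decryption happens back on the device, with the same key.
print(f.decrypt(ciphertext).decode())
```

The design consequence is simple: a server that only holds ciphertext has nothing useful to target ads with, leak, or index.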

But this isn't just a Daylogue pitch. There are other privacy-focused journaling tools out there. Day One offers end-to-end encryption. Standard Notes is open source and encrypted. The point isn't that one app is the answer. The point is that emotional data deserves a higher standard than a general-purpose chatbot running ads.

What You Can Do Right Now

If you've been using ChatGPT as a journal and you're not ready to switch, here are some practical steps.

Check your data settings. Open ChatGPT, go to Settings, and look at Data Controls. Turn off "Improve the model for everyone" if you don't want your conversations used for training. Review your memory settings and ad personalization preferences.

Use temporary chats. ChatGPT now offers a temporary chat mode that doesn't save to your history or contribute to memory. If you're going to discuss something sensitive, use it.

Be intentional about what you share. This applies to any AI tool, including ours. Think of it this way: if this conversation were read by a stranger, would you be okay with that? If the answer is no, check how that tool protects your data before you share.

Consider purpose-built tools. There's a difference between a tool that can be used for journaling and a tool built for journaling. Purpose-built tools tend to make different architectural decisions because emotional data is the core use case, not an afterthought.

Read the privacy policy. Nobody does this, and I get it. But OpenAI's updated policy is actually pretty readable. It's worth 10 minutes to understand what happens to the things you type.

The Real Question

This isn't about whether OpenAI is a bad company. They're building a business, and ads are a reasonable revenue model for a free product. The question is whether an ad-supported model is the right one for a product people use to process their most vulnerable thoughts.

When you write about being scared, lonely, or overwhelmed, that data should go somewhere safe. Somewhere that doesn't need to understand your pain points in order to sell someone else access to your attention.

The tools you trust with your feelings should earn that trust. Not sell it.

Tagged:

privacy, AI journaling, ChatGPT, data privacy, emotional wellness, journaling apps

Written by

Brandon

Founder at Daylogue

Building tools to help people understand themselves better. Believer in the power of small, consistent habits.
