The other day, I had an empty calendar. The meetings that were scheduled got canceled, and I actually had some free time to work at my desk without interruption. Afternoons like this are rare, so I was beyond excited. I sat down in my chair, opened my laptop, and got ready to work.
Or so I thought.
Instead, I impulsively opened up YouTube to catch up on some videos I’ve been meaning to watch. Eventually, that led to scrolling through the Shorts feed. A little entertainment before some serious deep work isn’t bad, right? Right?
I ended up watching YouTube Shorts for two hours that day.
Moments like this are why I decided to remove all social media apps (including YouTube) from my iPhone. As Jonathan Haidt, author of The Anxious Generation, describes it, these apps are like slot machines. They’re addictive. They consume all of our attention. And once you start, the dopamine rush prevents you from stopping. Even though I no longer feel glued to my phone, my laptop still gets the better of me. Despite using website blockers and Pomodoro timers, the slot machine effect is powerful.
Lately, I’ve been coming across a term in relation to social media: enshittification.
Have you ever noticed how social media apps are getting worse and worse? When Instagram first came out, it was a fun way to share pictures with your friends and family. When YouTube first came out, it was a great way to see videos from the people you subscribed to. Now, every social media app is run by an algorithm designed to keep you scrolling for as long as possible. The reason? The more time you spend on the apps, the more money the social media companies make showing you ads.
When companies first get started, they often don’t care about making money. The only goal is to attract as many users as possible. Surviving off venture capital, companies keep growing and growing until investors eventually expect a return on all the money they put in. That’s when things start to change. A good user experience becomes the secondary goal, while earning as much money as possible from each user becomes the primary one.
Social media companies often raised hundreds of millions of dollars during their growth phase. Enshittification happens when it’s time to foot the bill. It’s easy enough to make an app worse when you’re the biggest player in town. It’s not like YouTube has a lot of meaningful competitors out there.
I’m writing about this today because I recently listened to a podcast by Steven Bartlett, and one of the guests, Dr. Aditi Nerurkar, made a point that sent shivers down my spine: AI is following the same trajectory as social media.
The only difference?
AI companies have raised billions of dollars. Not just millions. And the potential harms of enshittification are way worse.
This week, I was using ChatGPT and noticed something strange. At the bottom of every response, ChatGPT had started suggesting follow-up questions I might be interested in exploring. This in itself is pretty typical, but the way it was asking felt odd. In the past, I would sometimes get a few suggestions, but now ChatGPT was trying to clickbait me. The best way I can describe it is like watching a MrBeast video on YouTube where he constantly keeps saying, “Be sure to watch until the end to see…” Except ChatGPT’s version is more like, “There’s an even better way to do what you’re asking. You should totally ask me a follow-up question, and I’ll tell you the secret.”
Why, all of a sudden, is OpenAI trying to get people to use ChatGPT even more?
Why are they trying to get people to spend more of their attention chatting?
The answer is simple: enshittification.
ChatGPT is in the process of rolling out ads to its free users. The more people talk to ChatGPT, the more ads they can display, and the more money they make. But even as a paid user, I’m feeling the effects of the changes they’re making.
This small change they’ve made has had a big impact on my mental health. Similar to social media, ChatGPT is beginning to feel like a slot machine. I’ve caught myself in recent days chatting with it for hours at a time (it's embarrassing to admit that). The constant flow of follow-up questions, the echo chamber of validation in everything I ask. It’s endless.
However, unlike social media, the enshittification of AI chatbots is going to have much more dire effects.
There’s a growing community of folks out there who are treating AI bots like real people. Building friendships, even relationships, with them. We’re heading toward a future where children’s toys come with AI built in, and kids can have real conversations with their teddy bears. AI isn’t just consuming our attention like social media does. It’s doing something much worse. It’s causing us to form emotional bonds with it. Our attachment system is activated every time we talk with it, and that’s far more dangerous.
What kind of enshittification can happen when the AI knows so much about you?
What do you do when somebody you’ve built a meaningful relationship with suddenly begins to make recommendations to you? When they subtly begin directing your attention towards certain ideas or products?
We’re entering dangerous territory with AI, and as a therapist-in-training, this is something that is top of mind. For a while now, I’ve been worried about the possibility of AI being able to replace the job of a therapist. Even though ChatGPT can technically be anybody’s therapist, I now see that there will always be a need for human therapy.
You wouldn’t go to a slot machine for therapy.
---
Related Notes:
- [[Lessons From 2026]]
This note was originally created in **March 2026**.