I hate to say it, but when I wake up in the morning, the very first thing I do is check my phone. First I turn off my alarm; I've made it a habit to wake up before it goes off. Then I scroll through a handful of websites. Yahoo Finance first, because the market is crazy. Hacker News, where I skim titles to see if AWS suffered an outage while I was sleeping. And then I put my phone down before I'm tempted to check my Twitter feed. I've managed to stay away from TikTok, but the TikTok model is finding its way onto every user's phone whether we like it or not.
On TikTok, you don't surf the web. You don't think of an idea and then research it. Instead, based entirely on your activity in the app, a proprietary algorithm decides what content will best suit you. For users, this is the best thing since sliced bread. For the tech world, it's the best way to influence those users.
Now, the TikTok model is no longer reserved for TikTok, but has spread to all social media. What worries me is that it's also going to infect the entire World Wide Web. Imagine this for a second: You open your web browser. Instead of a search bar or a list of bookmarks, you're greeted by an endless, vertically scrolling stream of content. Short videos, news snippets, product listings, and interactive demos. You don't type anything; you just swipe away what you don't like and tap what you do. The algorithm learns, and soon it feels like the web is reading your mind. You're served exactly what you didn't know you wanted. Everything is effortless, because the content you see feels like something you would have searched for yourself.
With AI integrations like Google's Gemini being baked directly into the browser, this TikTok-ification of the entire web is the logical next step. We're shifting from a model of surfing the web to one where the web is served to us.
The Algorithms
This looks like peak convenience. If these algorithms can figure out what you want to consume without you having to search for it, what's the big deal? The web is full of noise, and any tool that can cut through the clutter and surface the gems should be a powerful discovery tool. But reality doesn't quite work that way.
There's something that always gets in the way: incentives. More accurately, company incentives.
When I log into my Yahoo Mail (yes, I still have one), the first bolded email on top isn't actually an email. It's an ad disguised as an email. When I open the Chrome browser, I'm presented with "Sponsored content" I might be interested in. Google Discover is supposed to be the ultimate tool for discovering content, yet its incentives are clear: it shows you sponsored content first.
The model for content that's directly served to you is designed to get you addicted. It isn't designed for education or fulfillment; it's optimized for engagement. The goal is to provide small, constant dopamine hits, keeping you in a state of perpetual consumption without ever feeling finished. It's browsing as a slot machine, not a library.
What happens when we all consume a unique, algorithmically-generated web? We lose our shared cultural space. After the last episode of Breaking Bad aired, I texted my coworkers: "Speechless." The reply was, "Best TV show in history." We didn't need more context to understand what we were all talking about. With personalized content, this shared culture is vanishing.
The core problem isn't algorithmic curation itself, but who it serves. The algorithms are designed to benefit the company that made them, not the user. And as the laws of "enshittification" dictate, any platform that locks in its users will eventually turn the screws, making the algorithm worse for you to better serve its advertisers or bottom line.
Solving the Wrong Problem
Algorithmic solutions often fix problems that shouldn't exist in the first place.
Think about your email. The idea of "algorithmically sorted email" only makes sense if your inbox is flooded with spam, newsletters you never wanted, and automated notifications. You need a powerful AI to find the real human messages buried in the noise.
But here's the trick: your email shouldn't be flooded with that junk to begin with. If we had better norms, stricter regulations, and more respectful systems, your inbox would contain only meaningful correspondence. In that world, you wouldn't want an algorithm deciding what's important. You'd just read your emails.
The same is true for the web. The "noise" the TikTok model promises to solve (the SEO spam, the clickbait, the low-value content) is largely a product of an ad-driven attention economy. Instead of fixing that root problem, the algorithmic model just builds a new, even more captivating layer on top of it. It doesn't clean up the web; it just gives you a more personalized and addictive filter bubble to live inside.
The TikTok model of the web is convenient, addictive, and increasingly inevitable. But it's not the only future. It's the path of least resistance for platforms seeking growth and engagement at all costs.
There is an alternative, though. No, you don't have to demand more from these platforms. You don't have to vote for a politician. You don't even have to do much. The very first thing to do is remember your own agency. You are in control of the web you see and use. Change the default settings on your device. Delete the apps that are taking advantage of you. Use an ad blocker. If you find creators making things you like, look for ways to support them directly. Be the primary curator of your digital life.
It requires some effort, of course. But it's worth it, because the alternative is letting someone else decide what you see, what you think about, and how you spend your time. The web can still be a tool for discovery and connection rather than a slot machine optimized for your attention. You just have to choose to make it that way.