Dead internet theory suggests that most content on the internet is not created by humans. Instead, algorithmic or AI bots make up the majority of supposedly user-generated content.
Before the modern iterations of large language models, this seemed like quite a far-fetched idea. Bots simply weren't sophisticated enough to produce convincing long-form responses.
But now, it’s looking a whole lot more real.
LLMs are trained on conversations and text from popular social media sites such as Reddit, so it makes sense that they’re really good at simulating convincing internet text.
In their current state, they have one major use for those looking to deceive: propaganda and disinformation. And yes, advertising falls under this umbrella.
These stupid jumbles of cursed matrices that claim to love you with all their weights and promise to be as accurate as possible cannot, in fact, love, and cannot even keep a promise to their own creator.
Let's just jump to the question that resulted in the creation of this webpage: Will LLMs “infiltrate” the small web? Will some guy in a garage decide that the future of subtle and disingenuous advertising is the automatic creation of AI-generated web 1.0/small web websites?
Probably not. Unless they can figure out what to sell.
Maybe your website should be running on their services. Maybe they can provide an AI to write your articles for you. Maybe you want your site to look like the somehow-still-running Space Jam page or the somewhat-more-cursed mirror of the Eron website. Maybe you just want to pump out 3 articles a day showcasing how cool [PRODUCT] is.
Don’t worry, the AI can take care of all that for you. And eventually, most of the small web will be nothing but slop that exists solely to get someone to buy something.
Is this actually plausible?
The only “platform” that exists out here is the internet itself. People can simply ignore the hopefully-obvious AI-made sites and prioritize linking to their friends instead.
However, the nature of entropy and thermodynamics vaguely conveys the idea that more energy = more intermediary bullshit. There’s nothing really preventing it. And there’s a lot of energy in AI right now.