Contributor: The web is flooded with AI garbage. Real content is reserved for subscribers and democracy suffers

Princess Diana trips in a parkour park. Team USA wins gold at the Bong Olympics. Tank Man breakdances in Tiananmen Square. Kurt Cobain plays pogs. Tupac Shakur hunts for poutine at Costco. OpenAI’s Sora 2 artificial intelligence video generator debuted this month, and internet pranksters have been jumping on it. Hilarious and harmless? Or a sign that we are saying goodbye to reality, entering an era in which no one can ever trust video again?

It’s the latest example of how AI is transforming the world. But the problem goes far beyond the energy-hungry data centers behind these models; AI-generated junk has become a major threat to democracy itself. Today, billions of people learn about the world online not through high-quality news and information sources, but through algorithmically generated clickbait, misinformation, and nonsense.

This phenomenon is what we call the “slop economy”: a second-rate Internet where those who don’t pay for content are inundated with low-quality slop optimized for advertising. Platforms such as TikTok, Facebook, and YouTube are stuffed with maximum content at minimal cost, produced by algorithmically scraping and remixing chunks of human-written material into synthetic mush. Bots create and distribute countless fake blogs, how-to guides, political memes, and get-rich-quick videos.

Today, nearly 75% of new web content is at least partially generated by AI, but this deluge is not evenly distributed across society. People who pay for high-quality news and data services benefit from credible journalism and verified reporting. But billions of users cannot afford paid content or simply prefer free platforms. In developing countries, the divide is stark: as billions of people go online for the first time via cheap phones and spotty networks, the flood of slop often becomes synonymous with the Internet itself.

This matters for democracy in two key ways. First, democracy depends on informed citizens who share a common basis of facts and can make sense of the issues that affect them. The slop economy misleads voters, erodes trust in institutions, and fuels polarization by amplifying sensational content. Beyond the much-discussed problem of foreign disinformation campaigns, this insidious epidemic of slop reaches far more people every day.

Second, prolonged exposure to slop alone can leave people vulnerable to extremism. When users scroll through divergent algorithmic feeds, society loses consensus on fundamental truths, because each side lives in its own information universe. This is a growing problem in the United States, where AI-generated content is becoming so prolific (and so realistic) that consumers find this “pink slime” more believable than real news sources.

Demagogues know this and exploit information-poor populations around the world. AI-generated disinformation already poses a pervasive threat to electoral integrity in Africa and Asia, with deepfakes in South Africa, India, Kenya and Namibia reaching tens of millions of new voters via cheap phones and apps.

Why has slop taken over our digital world and what can we do about it? To find answers, we surveyed 421 Silicon Valley coders and developers who design the algorithms and platforms that power our information diet. We found a community of concerned tech insiders who are being held back from making positive changes by market forces and corporate leaders.

Developers told us that their bosses’ ideology strongly shapes what they build. More than 80% said their CEO or founder’s personal beliefs influence product design.

And CEOs aren’t the only ones who put their company’s success above ethics and social responsibility. More than half of the developers we surveyed regret the negative social impact of their products, and yet 74% said they would still build freedom-restricting tools such as surveillance platforms, even if doing so troubled them. Resistance is difficult in tech company culture.

This reveals a troubling synergy: business incentives align with a culture of compliance, producing algorithms that favor divisive or low-value content because it drives engagement. The slop economy exists because producing low-quality content is cheap and profitable. Solutions to the slop problem must realign those business incentives.

Companies could filter out the trash by down-ranking clickbait farms, clearly labeling AI-generated content, and removing blatantly false information. Search engines and social networks should not treat an investigative article written by a human as equal to a pseudo-news article written by a bot. Calls are already growing in the United States and Europe to impose quality standards on the algorithms that determine what we see.

Imaginative solutions are possible. One idea is to create public, nonprofit social networks. Just as you can listen to public radio, you could scroll an AI-free public social feed that rivals TikTok but delivers real news and educational snippets instead of conspiracies. And given that 22% of Gen Z say they hate AI, the private sector’s billion-dollar idea might just be a YouTube competitor that promises a total ban on AI, forever.

We can also defund slop producers by cutting off the advertising money that rewards content farms and spam sites. If ad networks refuse to fund websites without editorial standards, the flow of junk content would slow. This approach has worked against extremist disinformation: when platforms and payment processors cut off the money, the volume of toxic content drops.

Our research offers a glimmer of hope. Most developers say they want to create products that strengthen democracy rather than subvert it. To reverse the slop economy, technology creators, consumers and regulators must build a healthier digital public sphere together. Sustainable democracy, from local communities to the global stage, depends on bridging the gap between those who get the facts and those who are fed nonsense. Let’s stem the tide of digital slop before it eats away at democracy as we know it.

Jason Miklian is a research professor at the University of Oslo in Norway. Kristian Hoelscher is a research professor at the Peace Research Institute Oslo (PRIO) in Norway.
