https://pluralistic.net/2025/12/05/pop-that-bubble/
It’s a long article but absolutely worth it. I don’t want to put a tl;dr here because every word there is worth reading.
Don't worry, I had AI TL;DR it for you:
Cory Doctorow distinguishes between centaurs (humans assisted by machines) and reverse-centaurs (humans serving as appendages to machines). His core thesis: AI tools are marketed as centaur-making devices but deployed to create reverse-centaurs—workers subjected to algorithmic control and expected to catch machine errors while being blamed for failures.
The AI bubble exists primarily to maintain growth stock valuations. Once tech monopolies dominate their sectors, they face market reclassification from "growth" to "mature" stocks, triggering massive valuation drops. AI hype keeps investors convinced of continued expansion potential.
AI's actual business model: Replace high-wage workers (coders, radiologists, illustrators) with AI systems that cannot actually perform those jobs, while retaining skeleton crews as "accountability sinks"—humans blamed when AI fails. This strategy reduces payroll while maintaining superficial human oversight.
Why expanding copyright won't help creators: Despite 50 years of copyright expansion, creative workers earn less both absolutely and proportionally while media conglomerates profit enormously. New training-related copyrights would simply become contractual obligations to employers, not worker protections.
The effective counter-strategy: The U.S. Copyright Office's position that AI-generated works cannot receive copyright protection undermines corporate incentives to replace human creators entirely. Combined with sectoral bargaining rights (allowing industry-wide worker negotiations), this creates material resistance to worker displacement.
On AI art specifically: Generative systems produce "eerie" outputs—superficially competent but communicatively hollow. They cannot transfer the "numinous, irreducible feeling" that defines art because they possess no intentionality beyond statistical word/pixel prediction.
The bubble will collapse, leaving behind useful commodity tools (transcription, image processing) while eliminating economically unsustainable foundation models. Effective criticism should target AI's material drivers—the growth-stock imperative and labor displacement economics—not peripheral harms like deepfakes or "AI safety" concerns about sentience.
Being existentially afraid of new tech instead of excited about the possibilities is a terribly capitalist problem to have.
In a world where people have nothing to lose, the people in power will need to be afraid.
Seriously, if we end up in a situation where we have mass unemployment with no safety net because of AI, billionaires and politicians will be hanging from lampposts.
Mining in Australia is a high-paying job.
Individually, I don’t know. Collectively, socialist revolution. Workers in socialist states aren’t nearly as anxious about AI as workers in capitalist ones.
The logical, humanitarian solution is universal basic income (UBI).
Let's start working dodgy finance jobs: faking the stuff that AI is supposedly doing, faking income, etc., in unregulated areas.