this post was submitted on 12 Dec 2025
19 points (91.3% liked)

Asklemmy


A loosely moderated place to ask open-ended questions

 

Qualification:

No, seriously, it's just awful. I'm starting to worry that I'll end up homeless or working in low-paying jobs like mining, if, of course, such jobs still exist and aren't taken by other people. Maybe I should move to a more or less decent village, where at least I'll have the opportunity to grow my own food and get water from a well?

But I saw that life in the villages is very hard, and I'm not ready to work from morning until late at night without days off.

The year 2027 scares me a lot.

top 12 comments
[–] minorkeys@lemmy.world 1 point 51 minutes ago (last edited 44 minutes ago)

There is no way to know what paths will be left available once LLM integration starts to prove effective or in what roles. History shows very clearly that the powerful do not care if the peasantry suffers and dies because of the changes they force on the world. They care primarily, and often exclusively, for their own ambitions.

If we are unnecessary for those ambitions, our welfare is not considered at all. Our only value to them has been our labour, and without that value we are all under threat of being denied access to any resources at all, including those necessary to stay alive, including the resource of physical space, which land ownership allows them to deny us.

[–] gergolippai@lemmy.world 2 points 3 hours ago

Okay ai, someone locked themselves out of their apartment. Solve. Okay ai, here's this flat tyre on my bike. Solve. Okay ai, I have a leak in my kitchen. Solve.

In short, I'm not too afraid :)

[–] piyuv@lemmy.world 1 point 7 hours ago (1 child)

https://pluralistic.net/2025/12/05/pop-that-bubble/

It’s a long article but absolutely worth it. I don’t want to put a tl;dr here because every word there is worth reading.

[–] Randomgal@lemmy.ca 1 point 4 hours ago

Don't worry, I had AI TL;DR it for you:

Summary of "The Reverse-Centaur's Guide to Criticizing AI"

Cory Doctorow distinguishes between centaurs (humans assisted by machines) and reverse-centaurs (humans serving as appendages to machines). His core thesis: AI tools are marketed as centaur-making devices but deployed to create reverse-centaurs—workers subjected to algorithmic control and expected to catch machine errors while being blamed for failures.

The AI bubble exists primarily to maintain growth stock valuations. Once tech monopolies dominate their sectors, they face market reclassification from "growth" to "mature" stocks, triggering massive valuation drops. AI hype keeps investors convinced of continued expansion potential.

AI's actual business model: Replace high-wage workers (coders, radiologists, illustrators) with AI systems that cannot actually perform those jobs, while retaining skeleton crews as "accountability sinks"—humans blamed when AI fails. This strategy reduces payroll while maintaining superficial human oversight.

Why expanding copyright won't help creators: Despite 50 years of copyright expansion, creative workers earn less both absolutely and proportionally while media conglomerates profit enormously. New training-related copyrights would simply become contractual obligations to employers, not worker protections.

The effective counter-strategy: The U.S. Copyright Office's position that AI-generated works cannot receive copyright protection undermines corporate incentives to replace human creators entirely. Combined with sectoral bargaining rights (allowing industry-wide worker negotiations), this creates material resistance to worker displacement.

On AI art specifically: Generative systems produce "eerie" outputs—superficially competent but communicatively hollow. They cannot transfer the "numinous, irreducible feeling" that defines art because they possess no intentionality beyond statistical word/pixel prediction.

The bubble will collapse, leaving behind useful commodity tools (transcription, image processing) while eliminating economically unsustainable foundation models. Effective criticism should target AI's material drivers—the growth-stock imperative and labor displacement economics—not peripheral harms like deepfakes or "AI safety" concerns about sentience.

[–] Corporal_Punishment@feddit.uk 18 points 1 day ago (1 child)

In a world where people have nothing to lose, the people in power will need to be afraid.

Seriously, if we end up in a situation where we have mass unemployment with no safety net because of AI, billionaires and politicians will be hanging from lampposts.

[–] rain_lover@lemmy.ml 1 point 1 hour ago

They just won't let it get that bad. Make sure people can afford a Netflix subscription and some shitty junk food each day, and there will be no revolution.

[–] umbrella@lemmy.ml 5 points 23 hours ago (1 children)

Being existentially afraid of new tech instead of excited about the possibilities is a terribly capitalist problem to have.

[–] communism@lemmy.ml 2 points 1 hour ago

New tech isn't socially neutral. Should we be excited about the possibilities of new missiles and warplanes? If you understand how that new technology can be bad, you can understand how other new technologies can also fail to be "exciting". Capitalism produces for the sake of production. We have plenty of useless shit that exists for the ouroboros of profit and marketing rather than to fulfil some natural use case. I think modern LLMs fall into the former category, not to mention the energy cost of the current demand. I think LLMs can be cool as toys, demos, academic projects, etc., but their current prevalence is purely due to marketing and AI companies trying to make something quite expensive profitable.

[–] prettygorgeous@aussie.zone 4 points 23 hours ago

Mining in Australia is a high paying job.

[–] davel@lemmy.ml 7 points 1 day ago

Individually, I don’t know. Collectively, socialist revolution. Workers in socialist states aren’t nearly as anxious about AI as workers in capitalist ones.

[–] GuyFawkes@midwest.social 6 points 1 day ago

The logical, humanitarian solution is universal basic income (UBI).

Let's start working dodgy finance jobs: faking the stuff that AI is doing, faking income, etc., in unregulated areas.