this post was submitted on 19 Feb 2026

Fuck AI


Similarly, in research, the trajectory points toward systems that can increasingly automate the research cycle. In some domains, that already looks like robotic laboratories that run continuously, automate large portions of experimentation and even select new tests based on prior results.

At first glance, this may sound like a welcome boost to productivity. But universities are not information factories; they are systems of practice. They rely on a pipeline of graduate students and early-career academics who learn to teach and research by participating in that same work. If autonomous agents absorb more of the “routine” responsibilities that historically served as on-ramps into academic life, the university may keep producing courses and publications while quietly thinning the opportunity structures that sustain expertise over time.

The same dynamic applies to undergraduates, albeit in a different register.

[–] minorkeys@lemmy.world 3 points 1 day ago

I feel the effect when it happens, as I am required to use AI in my work. I have to acknowledge it and take steps not to offload memorization and analysis solely to an AI assistant. Mitigating this impact takes time and effort. The danger is that AI is good enough that the gains in speed may outweigh the risks and the cost of errors. If the efficiency gains are high enough, meeting the performance standards required to hold a position at a company will not be possible without using AI, and without using it in a way that makes it impossible to mitigate the formation of dependencies. People will have to use AI in a way that ensures dependency in order to keep the job. The costs are borne by the workers; the benefits are reaped by the owners.

That's the leverage that will reshape our society: we will be forced to work in ways that make us worse at learning, memorization, and analysis. Ask a product owner a question and they have to reach for AI, because the environment makes it impossible to have the answer without it. And if they can get the answer from AI, so can the person asking the question. So with AI adoption, leadership, decision-making, and expertise are all transferred to ownership, decimating the middle roles between implementation and ownership that the entire office environment is built around. They're trying to use it to replace implementation as well, as with software engineers.

For business, AI is a massive opportunity to reduce dependency on human labour while making the remaining labour dependent on AI. It's a nightmare for society and for human beings. If robotics manages to accelerate alongside it, then what is a population even capable of doing to protect itself from the harm of this corporate empowerment? No jobs, no money, no legal access to resources, and facing an autonomous robotic security system protecting those resources?

History shows that self-interest and concentrated power lead to mass suffering. I don't see how these new technologies, in the hands of private power, will produce anything different this time.