this post was submitted on 19 Feb 2026

Fuck AI


Similarly, in research, the trajectory points toward systems that can increasingly automate the research cycle. In some domains, that already looks like robotic laboratories that run continuously, automate large portions of experimentation and even select new tests based on prior results.

At first glance, this may sound like a welcome boost to productivity. But universities are not information factories; they are systems of practice. They rely on a pipeline of graduate students and early-career academics who learn to teach and research by participating in that same work. If autonomous agents absorb more of the “routine” responsibilities that historically served as on-ramps into academic life, the university may keep producing courses and publications while quietly thinning the opportunity structures that sustain expertise over time.

The same dynamic applies to undergraduates, albeit in a different register.

[–] UnspecificGravity@piefed.social 35 points 1 day ago (4 children)

Honestly, a lot of these predicted problems with AI are actually overly optimistic, because they assume AI can actually DO the work we're talking about, and the current state of AI very much cannot.

I sometimes think that articles like this are plants by the AI industry to create the narrative that their shit is even capable of causing this problem.

[–] floquant@lemmy.dbzer0.com 18 points 1 day ago (1 children)

What people think AI can do is a problem, even (especially) when it cannot.

[–] UnspecificGravity@piefed.social 11 points 1 day ago (1 children)

I would say that THIS is the biggest risk of AI. It's not what it does, it's what people believe it does. Especially people who aren't capable of actually assessing its performance.

[–] regedit@lemmy.zip 2 points 16 hours ago

Or C-level execs who are so out of touch with what their employees do that they're convinced it can and/or should replace some or all of an employee's job duties.

[–] chisel@piefed.social 4 points 1 day ago (3 children)

The problem is AI replacing the busy work that historically gets assigned to brand-new workers. AI, in its current state, absolutely can do a lot of this work and can replace the need for junior employees in many cases. The catch is that junior employees need this work in order to improve, so if we delegate all of it to AI, there will be no future senior employees to do the more advanced work that AI can't do.

[–] rebelsimile@sh.itjust.works 3 points 1 day ago* (last edited 1 day ago)

I think it’s actually just the perception that it’s doing that, held by people who are so far away from the work that they don’t have a clue.

Like 300 years ago, if you wanted to be a sailor, maybe you started by rowing oars, or swabbing decks, or dealing with ropes and rigging or whatever. If you’re a 19-year-old seaman in 2026, you’re probably using a throttle controller that sits 26 different systems away from the actual mechanical work. No one says “oh, that guy’s not a real seaman” or “the boat literally drives itself hurr”

If you’re a graphic designer in 1970, you’d cut out and hand-lay a magazine page on a big glowing table, page by page, resetting the type so it fit alongside giant physical images. In 2026, you’d lay it out in Illustrator on your computer. No one says “oh, the magazines just lay themselves out” or “that’s not how you lay out a magazine”. It’s just done with the tools available.

What’s going on is that very few people seem to understand the difference between a tool and a solution. A tool is a thing that does something. A solution is something that solves a problem. A hammer is a tool, but it is not a good solution for a crying baby. Applying random tools to non-problems does not randomly generate solutions; it just creates even more intractable problems.

[–] UnspecificGravity@piefed.social 4 points 1 day ago (1 children)

It probably depends on your industry, but it absolutely cannot do even entry-level work in a lot of fields.

[–] chisel@piefed.social 3 points 1 day ago

I feel like that goes without saying, yeah. AI can do a lot of a junior software engineer's job, but it's going to fail miserably at being a junior house cleaner.

[–] Bustedknuckles@lemmy.world 2 points 1 day ago

I agree, and that's a great way of putting it. We're kneecapping ourselves collectively because enough individual companies are deprecating the junior dev experience. We'll see if it holds up when senior devs are in such short supply that companies have to pay them 4x the margin they saved on junior devs. I think they're hoping that the machine learning gets good enough to do senior dev work before the humans retire. Or else they're just line-go-up types.

[–] Ephera@lemmy.ml 2 points 1 day ago

LLMs do tend to be pretty good at textbook problems, because they've been trained on the textbooks. We have working students at $DAYJOB, who tell us that you can often get a flawless grade by handing in something AI-generated.

But then, yeah, you don't learn anything, and that will become a problem sooner or later, because none of the problems at work are textbook problems.

[–] Virtvirt588@lemmy.world 2 points 1 day ago

I do agree; it may be that the situation is overstated to the point where doomerism is essentially encouraged.

The thing is, there was an article where the students themselves didn't want AI on their mandated Chromebooks. It feels force-fed, and the fact is, nobody, no matter their age, wants garbage screwing up their workflow.

Honestly, a lot of these articles are repeating the same story. In the end, it all leads to a similar conclusion, and it isn't age-related.