this post was submitted on 04 Feb 2026
Fuck AI


Yeah, that's not what I was saying.
Hence my attempt to give you the space to provide clarity.
For me, this isn't a pissing contest. I'm trying to provide you with the latitude to clarify your position. I'll be honest, I didn't appreciate your condescending lecture on the English language.
I apologize for any confusion.
I meant LLMs are what they say they are in a non-literal sense.
Akin to ascribing the same to any other tool.
"I like wrenches because they are what they say they are, nothing extra to them," in that sort of way.
In the sense that the tool is very transparent in function. No weird bells or whistles, it's a simple machine whose function you can see merely by looking at it.
I think I understand your point now.
I still would want to apply pressure to it, because I disagree with the spirit of your assessment.
Once a model is trained, it becomes functionally opaque. Weights shift... but WHY. What does that vector MEAN.
I think wrenches are good. Will a 12mm wrench fit a 12mm bolt? Yes.
In LLM bizarre world, the answer to everything is not "yes" or "no", it's "maybe, maybe not, within statistical bounds... try it... maybe it will... maybe it won't... and by the way just because it fit yesterday is no guarantee it will fit again tomorrow... and I actually can't definitively tell you why that is for this particular wrench"
LLMs do something, and I agree they do that something well. I further agree with the spirit of most of the rest of your analysis: abstraction layers are doing a lot of heavy lifting.
I think where I fundamentally disagree is with the claim that they "do what they say they do" by any definition beyond the simple tautology that everything is what it is.
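The "maybe, maybe not, within statistical bounds" point above can be sketched with a toy sampling loop. This is a minimal illustration, not how any real LLM is implemented: it assumes a made-up three-token vocabulary and hand-picked scores, and just shows that sampling from a probability distribution (the way LLM decoders typically do) can return different answers to the same question on different runs:

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Convert raw scores into a probability distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical "next token" scores for the question
# "will this 12mm wrench fit?": "yes" is most likely, but not certain.
tokens = ["yes", "no", "maybe"]
logits = [2.0, 0.5, 1.0]

probs = softmax(logits)

def sample(tokens, probs):
    """Draw one token according to its probability."""
    return random.choices(tokens, weights=probs, k=1)[0]

# The same prompt, asked many times, almost surely yields
# more than one distinct answer.
answers = {sample(tokens, probs) for _ in range(1000)}
print(sorted(answers))
```

A wrench, by contrast, would be `return "yes"` every time: deterministic and inspectable. The sampler's output is only predictable in aggregate, which is the disagreement in a nutshell.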
I guess I was referring to how there are a lot of tools out there that are built to do stuff other than what they ought to do.
Like stick a flashlight onto a wrench, if you will. Now it's not just a wrench, now it's a flashlight too.
But an LLM is... pretty much just what it is, though some people now are trying pretty hard to make it be more than that (and not by adding layers on top; I'm talking about training LLMs to be more than LLMs, which I think is a huge waste of time).