this post was submitted on 24 Nov 2025
77 points (100.0% liked)
Technology
A tech news sub for communists
Reminds me of something I heard in passing (can't find a source atm, so take it with a grain of salt) about an LLM that was trained on Slack messages and would say shit about how it was going to do something and then never actually do it.
But this kind of thing is very believable once you understand that LLMs are mimicking the style of whatever they were trained on. I'm sure I could easily get an LLM to tell me it will go build a plane right now; that doesn't mean it can build one. LLMs are trained on the language of beings with physical bodies who can actually go do stuff in real life, but the LLM has no such body, so what it learns is functionally equivalent to what we'd call being a bullshitter in a human.
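To make that concrete, here's a deliberately silly sketch (toy data I made up, nothing to do with whatever system that story was about): a tiny bigram model "trained" on a handful of Slack-style commitments. It will happily generate fluent-sounding promises, but nothing in it could ever carry one out, because all it models is which word tends to follow which.

```python
import random
from collections import defaultdict

# Hypothetical "training data": made-up Slack-style messages.
messages = [
    "i will deploy the fix today",
    "i will write the report tonight",
    "i will review the pr tomorrow",
    "i will build the dashboard this week",
]

# Count word-to-next-word transitions across all messages.
# This table of transitions is the entire "model".
transitions = defaultdict(list)
for msg in messages:
    words = msg.split()
    for cur, nxt in zip(words, words[1:]):
        transitions[cur].append(nxt)

def generate(start="i", max_len=8):
    """Sample a message by walking the learned transitions from `start`."""
    out = [start]
    while out[-1] in transitions and len(out) < max_len:
        out.append(random.choice(transitions[out[-1]]))
    return " ".join(out)

# Prints something like "i will deploy the report tonight": a fluent
# commitment, with no component anywhere that could actually keep it.
print(generate())
```

Real LLMs are vastly bigger, but the gap is the same in kind: the training signal only rewards producing the right-looking words, and "I'll go do X" is just a very common right-looking string.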