this post was submitted on 25 Jan 2026
316 points (97.3% liked)

Programmer Humor

top 45 comments
[–] ChaoticNeutralCzech@feddit.org 7 points 8 hours ago

Nah, too cold. It stopped moving and the computer can't generate any more random numbers to pick from the LLM's weighted suggestions.

[–] RVGamer06@sh.itjust.works 3 points 8 hours ago

Oh crap, is that Freddy Fazbear?

[–] stsquad@lemmy.ml 84 points 19 hours ago (3 children)

If you have ever read the "thought" process of some of the reasoning models, you can catch them going into loops of circular reasoning, just slowly burning tokens. I'm not even sure this isn't by design.

[–] Feathercrown@lemmy.world 5 points 8 hours ago (2 children)

Why would it be by design? What does that even mean in this context?

[–] MotoAsh@piefed.social 2 points 3 hours ago

You have to pay for tokens on many of the "AI" tools that you do not run on your own computer.

[–] deHaga@feddit.uk 2 points 7 hours ago

Compute costs?

[–] swiftywizard@discuss.tchncs.de 48 points 19 hours ago (1 children)

I dunno, let's waste some water

[–] gtr@programming.dev 5 points 7 hours ago (1 children)

They are trying to get rid of us by wasting our resources.

[–] MajorasTerribleFate@lemmy.zip 8 points 7 hours ago

So, it's Nestlé behind things again.

[–] SubArcticTundra@lemmy.ml 15 points 18 hours ago (1 children)

I'm pretty sure training is purely results-oriented, so anything that works goes.

[–] MotoAsh@piefed.social 1 points 3 hours ago

Exactly why this shit isn't and never will be trustworthy.

[–] Darohan@lemmy.zip 54 points 18 hours ago
[–] Kyrgizion@lemmy.world 40 points 20 hours ago

Attack of the logic gates.

[–] ideonek@piefed.social 27 points 20 hours ago (6 children)
[–] FishFace@piefed.social 82 points 20 hours ago (2 children)

LLMs work by picking the next word* as the most likely candidate given their training and the context. Sometimes the model gets into a situation where its view of "context" doesn't change when the word is picked, so the next word is just the same. Then the same thing happens again, and around we go. There are fail-safe mechanisms to try to prevent this, but they don't work perfectly.

*Token
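
A minimal sketch of that failure mode, for the curious. Everything here is made up for illustration: a real LLM computes these probabilities with a neural network, not a hard-coded table, and the repetition penalty shown is just one of several real fail-safe mechanisms.

```python
# Toy sketch of greedy next-token decoding getting stuck in a loop.
# Illustrative only: a real LLM computes these probabilities with a
# neural network, not a hard-coded lookup table.

def next_token_probs(context):
    """Pretend model: map the recent context to token probabilities."""
    if context and context[-1] == "or":
        # Once "or" is the last token, "or" stays the top candidate.
        return {"or": 0.6, "something": 0.4}
    return {"this": 0.4, "or": 0.6}

def generate(prompt, steps=10, repetition_penalty=1.0):
    tokens = list(prompt)
    for _ in range(steps):
        probs = dict(next_token_probs(tokens))
        # Fail-safe: make each repeat of a token progressively less likely.
        for tok in probs:
            probs[tok] /= repetition_penalty ** tokens.count(tok)
        tokens.append(max(probs, key=probs.get))  # greedy: take the top token
    return " ".join(tokens)

print(generate(["this"]))
# this or or or or or or or or or or
print(generate(["this"], repetition_penalty=2.0))
# varied output: repeats get penalised before they can lock in
```

With the penalty off (1.0), the greedy choice locks onto "or" forever; turning it up makes each repeat progressively less attractive, which is roughly what the fail-safes mentioned above do.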

[–] ideonek@piefed.social 17 points 19 hours ago (1 children)

That was the answer I was looking for. So it's similar to the "seahorse" emoji case, but this time, at some point he just glitched: the most likely next word for this sentence is "or", and after adding that "or" the next most likely is also "or", and after adding the next one it's also "or", and after the 11th one... you may just as well commit, since that's the same context as with 10.

Thanks!

[–] atomicbocks@sh.itjust.works -3 points 15 hours ago (1 children)

He?

This is not a person and does not have a gender.

[–] ideonek@piefed.social 25 points 15 hours ago* (last edited 14 hours ago) (1 children)

Chill dude. It's a grammatical/translation error, not an ideological declaration. It's an especially common mistake if your native language has grammatical gender. "Spoon" is a "she" in my language, but I'm not proposing to one anytime soon. Not all hills are worth nitpicking on.

[–] MonkderVierte@lemmy.zip 5 points 18 hours ago

I got it once stuck in a "while it is not" / "while it is" loop.

[–] ech@lemmy.ca 16 points 15 hours ago (1 children)

It's like the text predictor on your phone. If you just keep hitting the next suggested word, you'll usually end up in a loop at some point. Same thing here, though admittedly much more advanced.

[–] vaultdweller013@sh.itjust.works 1 points 17 minutes ago

Example of my phone doing this.

I just want you are the only reason that you can't just forget that I don't have a way that I have a lot to the word you are not even going on the phone and you can call it the other way to the other one I know you are going out to talk about the time you are not even in a good place for the rest they'll have a little bit more mechanically and the rest is.

You can see it looping pretty damned quick with me just hitting the first suggestion after the initial I.
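
A minimal sketch of why that happens, under an assumption of mine (not the commenter's): the predictor behaves like a bigram model that always offers the most frequent follower of the last word. Real keyboard predictors are fancier, but the "always tap suggestion #1" chain fails the same way: it's deterministic, so once any word repeats, the output cycles.

```python
# Minimal sketch of the "keep tapping the first suggestion" loop.
# Assumption: the predictor is a bigram model offering the most
# frequent follower of the last word. Being deterministic, the chain
# must cycle as soon as it revisits any word.

from collections import Counter, defaultdict

training_text = (
    "i know you are going to the other one and you are not even "
    "going to the phone and i know you can call it the other way"
).split()

# Count, for each word, which words follow it and how often.
followers = defaultdict(Counter)
for prev, nxt in zip(training_text, training_text[1:]):
    followers[prev][nxt] += 1

word, sentence = "i", ["i"]
for _ in range(20):
    word = followers[word].most_common(1)[0][0]  # always tap suggestion #1
    sentence.append(word)

print(" ".join(sentence))
# i know you are going to the other one and you are going to the
# other one and you are going ...
```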

[–] ch00f@lemmy.world 41 points 20 hours ago (1 children)

Gemini evolved into a seal.

[–] kamenlady@lemmy.world 10 points 19 hours ago

or simply, or

[–] Arghblarg@lemmy.ca 26 points 20 hours ago

The LLM showed its true nature: a probabilistic bullshit generator that got caught in a strange attractor of some sort within its own matrix of lies.

[–] palordrolap@fedia.io 12 points 18 hours ago (1 children)

Unmentioned by other comments: The LLM is trying to follow the rule of three because sentences with an "A, B and/or C" structure tend to sound more punchy, knowledgeable and authoritative.

Yes, I did do that on purpose.

[–] Cevilia@lemmy.blahaj.zone 6 points 13 hours ago (1 children)

Not only that, but also "not only, but also" constructions, which sound more emphatic, conclusive, and relatable.

[–] luciferofastora@feddit.org 1 points 47 minutes ago

I used to think learning stylistic devices like this was just an idle fancy, a tool only good for analysing poems, one of those many things you're certain you'll never need but have to learn in school anyway.

What a fool I've been.

[–] kogasa@programming.dev 12 points 19 hours ago

Turned into a sea lion

[–] squirrel@piefed.kobel.fyi 6 points 16 hours ago
[–] lividweasel@lemmy.world 2 points 14 hours ago (1 children)
[–] rockerface@lemmy.cafe 4 points 8 hours ago (1 children)

Platinum, even. Star Platinum.

[–] MotoAsh@piefed.social 1 points 3 hours ago

I don't see no 'a's between those 'or's for the full "ora ora ora ora" effect.