this post was submitted on 25 Jan 2026
418 points (97.1% liked)

Programmer Humor

30917 readers
828 users here now

Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.

founded 2 years ago
 
[–] stsquad@lemmy.ml 101 points 2 months ago (4 children)

If you have ever read the "thought" process on some of the reasoning models you can catch them going into loops of circular reasoning just slowly burning tokens. I'm not even sure this isn't by design.

[–] swiftywizard@discuss.tchncs.de 62 points 2 months ago (1 children)

I dunno, let's waste some water

[–] gtr@programming.dev 7 points 2 months ago (1 children)

They are trying to get rid of us by wasting our resources.

[–] MajorasTerribleFate@lemmy.zip 11 points 2 months ago

So, it's Nestlé behind things again.

[–] SubArcticTundra@lemmy.ml 18 points 2 months ago (1 children)

I'm pretty sure training is purely result-oriented, so anything that works goes

[–] MotoAsh@piefed.social 3 points 2 months ago

Exactly why this shit isn't and never will be trustworthy.

[–] Feathercrown@lemmy.world 8 points 2 months ago (2 children)

Why would it be by design? What does that even mean in this context?

[–] MotoAsh@piefed.social 5 points 2 months ago (5 children)

You have to pay for tokens on many of the "AI" tools that you do not run on your own computer.

[–] Feathercrown@lemmy.world 5 points 2 months ago* (last edited 2 months ago) (9 children)

Hmm, interesting theory. However:

  1. We know this is an issue with language models, it happens all the time with weaker ones - so there is an alternative explanation.

  2. LLMs are running at a loss right now, the company would lose more money than they gain from you - so there is no motive.

[–] jerkface@lemmy.ca 2 points 2 months ago (1 children)

it was proposed less as a hypothesis about reality than as virtue signalling (in the original sense)

[–] deHaga@feddit.uk 3 points 2 months ago

Compute costs?

[–] dream_weasel@sh.itjust.works 4 points 2 months ago

This kind of stuff happens on any model you train from scratch even before training for multi step reasoning. It seems to happen more when there's not enough data in the training set, but it's not an intentional add. Output length is a whole deal.

[–] Darohan@lemmy.zip 67 points 2 months ago
[–] Kyrgizion@lemmy.world 47 points 2 months ago

Attack of the logic gates.

[–] ideonek@piefed.social 28 points 2 months ago (6 children)
[–] FishFace@piefed.social 94 points 2 months ago (6 children)

LLMs work by picking the next word* as the most likely candidate word given its training and the context. Sometimes it gets into a situation where the model's view of "context" doesn't change when the word is picked, so the next word is just the same. Then the same thing happens again and around we go. There are fail-safe mechanisms to try and prevent it but they don't work perfectly.

*Token
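A toy sketch of the loop described above, with a made-up next-token probability table (not a real model): greedy decoding picks the argmax at every step, so once the most likely successor of "or" is "or" itself, the output never escapes.

```python
# Toy illustration (not a real LLM): greedy decoding over a made-up
# next-token probability table. Once the context's distribution stops
# changing, argmax returns the same token forever.
def greedy_decode(probs, context, steps):
    out = list(context)
    for _ in range(steps):
        dist = probs.get(out[-1], {})
        if not dist:
            break
        # pick the most likely next token (greedy = argmax)
        out.append(max(dist, key=dist.get))
    return out

# Hypothetical distribution: after "either" the best pick is "or",
# and after "or" the best pick is... "or" again.
probs = {
    "either": {"or": 0.6, "and": 0.4},
    "or": {"or": 0.5, "not": 0.3, ".": 0.2},
}
print(greedy_decode(probs, ["either"], 5))
# -> ['either', 'or', 'or', 'or', 'or', 'or']
```

Real decoders add sampling temperature and repetition penalties as the fail-safes mentioned above, which is why this usually doesn't happen, only sometimes.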

[–] ideonek@piefed.social 18 points 2 months ago (21 children)

That was the answer I was looking for. So it's similar to the "seahorse" emoji case, but this time at some point it just glitched: the most likely next word for the sentence is "or", and after adding the "or" it's also "or", and after adding the next one it's also "or", and after the 11th one... you might just as well commit, since that's the same context as with 10.

Thanks!

[–] MonkderVierte@lemmy.zip 5 points 2 months ago

I got it into a "while it is not" / "while it is" loop once.

[–] ch00f@lemmy.world 45 points 2 months ago (1 children)

Gemini evolved into a seal.

[–] kamenlady@lemmy.world 11 points 2 months ago

or simply, or

[–] Arghblarg@lemmy.ca 26 points 2 months ago

The LLM showed its true nature: a probabilistic bullshit generator caught in a strange attractor of some sort within its own matrix of lies.

[–] ech@lemmy.ca 19 points 2 months ago (1 children)

It's like the text predictor on your phone. If you just keep hitting the next suggested word, you'll usually end up in a loop at some point. Same thing here, though admittedly much more advanced.
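The phone-keyboard version of this is easy to fake at home: build bigram counts from a tiny corpus and always "tap" the top suggestion. The corpus and word counts below are invented for the demo, but the cycling behavior is the same thing the comment describes.

```python
# Toy "phone keyboard" sketch: count which word follows which in a
# tiny corpus, then always accept the top suggestion, like tapping
# the first word the keyboard offers. It cycles almost immediately.
from collections import Counter, defaultdict

corpus = ("i want to go to the store and i want to "
          "see you and i want to go").split()

# next_word[w] counts what follows w in the corpus
next_word = defaultdict(Counter)
for prev, cur in zip(corpus, corpus[1:]):
    next_word[prev][cur] += 1

def autopilot(word, taps):
    out = [word]
    for _ in range(taps):
        suggestions = next_word[out[-1]]
        if not suggestions:
            break
        # take the single most frequent follower every time
        out.append(suggestions.most_common(1)[0][0])
    return out

print(" ".join(autopilot("i", 8)))
# -> i want to go to go to go to
```

After "to" the most frequent follower is "go", and after "go" it's "to", so the predictor ping-pongs forever, exactly like holding down the first suggestion on a phone.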

[–] vaultdweller013@sh.itjust.works 2 points 2 months ago (1 children)

Example of my phone doing this.

I just want you are the only reason that you can't just forget that I don't have a way that I have a lot to the word you are not even going on the phone and you can call it the other way to the other one I know you are going out to talk about the time you are not even in a good place for the rest they'll have a little bit more mechanically and the rest is.

You can see it looping pretty damned quick with me just hitting the first suggestion after the initial I.

[–] palordrolap@fedia.io 15 points 2 months ago (1 children)

Unmentioned by other comments: The LLM is trying to follow the rule of three because sentences with an "A, B and/or C" structure tend to sound more punchy, knowledgeable and authoritative.

Yes, I did do that on purpose.

[–] Cevilia@lemmy.blahaj.zone 7 points 2 months ago (1 children)

Not only that, but also "not only, but also" constructions, which sound more emphatic, conclusive, and relatable.

[–] luciferofastora@feddit.org 2 points 2 months ago

I used to think learning stylistic devices like this was just an idle fancy, a tool simply designed to analyse poems, one of the many things you're most certain you'll never need but have to learn in school.

What a fool I've been.

[–] kogasa@programming.dev 13 points 2 months ago

Turned into a sea lion

[–] ChaoticNeutralCzech@feddit.org 16 points 2 months ago

Nah, too cold. It stopped moving and the computer can't generate any more random numbers to pick from the LLM's weighted suggestions.

[–] squirrel@piefed.kobel.fyi 9 points 2 months ago
[–] ZILtoid1991@lemmy.world 8 points 2 months ago

Five Nights at Altman's

[–] lividweasel@lemmy.world 6 points 2 months ago (1 children)
[–] rockerface@lemmy.cafe 5 points 2 months ago (1 children)

Platinum, even. Star Platinum.

[–] MotoAsh@piefed.social 2 points 2 months ago

I don't see no 'a's between those 'or's for the full "ora ora ora ora" effect.

[–] RVGamer06@sh.itjust.works 6 points 2 months ago

Oh crap, is that Freddy Fazbear?

[–] jwt@programming.dev 5 points 2 months ago

Reminds me of that "have you ever had a dream" kid.

[–] kamen@lemmy.world 3 points 2 months ago

If software was your kid.

Credit: Scribbly G

[–] DylanMc6@lemmy.dbzer0.com 2 points 2 months ago

The AI touched that lava lamp
