[–] Feathercrown@lemmy.world 8 points 2 days ago (2 children)

Why would it be by design? What does that even mean in this context?

[–] MotoAsh@piefed.social 4 points 1 day ago (2 children)

You have to pay for tokens on many of the "AI" tools that you do not run on your own computer.

[–] Feathercrown@lemmy.world 3 points 1 day ago* (last edited 1 day ago) (2 children)

Hmm, interesting theory. However:

  1. We know this is an issue with language models; it happens all the time with weaker ones - so there is an alternative explanation.

  2. LLMs are running at a loss right now; the company would lose more money than it gains from you - so there is no motive.

[–] jerkface@lemmy.ca 2 points 1 day ago (1 children)

it was proposed less as a hypothesis about reality than as virtue signalling (in the original sense)

[–] MotoAsh@piefed.social 1 points 6 hours ago* (last edited 6 hours ago)

No, it wasn't a virtue signal, you fucking dingdongs.

Capitalism is rife with undercooked products, because getting a product out there starts the income flowing sooner. They don't have to be making a profit for a revenue stream to make sense. Some money is better than no money. Get it?

Fuck, it's like all you idiots can do is project your lack of understanding on others...

[–] MotoAsh@piefed.social -2 points 1 day ago (1 children)

Of course there's a technical reason for it, but they have incentive to try and sell even a shitty product.

[–] Feathercrown@lemmy.world 2 points 20 hours ago (1 children)

I don't think this really addresses my second point.

[–] MotoAsh@piefed.social 1 points 6 hours ago

How does it not? This isn't a fucking debate. How would artificially bloating the number of tokens they sell not help their bottom line?

[–] piccolo@sh.itjust.works 1 points 1 day ago (1 children)

Don't they charge by input tokens? E.g. your prompt, not the output.

[–] MotoAsh@piefed.social 3 points 1 day ago* (last edited 1 day ago) (1 children)

I think many of them do, but there are also many "AI" tools that will automatically add a ton of stuff to the prompt to try to get more intelligent responses, or even re-prompt the model multiple times to try to make sure it's not handing back hallucinations.

It really adds up in their attempt to make fancy autocomplete seem "intelligent".
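
To make that concrete, here's a minimal, hypothetical sketch (in Python) of what such a tool might actually send per request versus what the user typed. The system prompt, injected file contents, and question are all made up for illustration; the point is that the whole assembled message is billed as input tokens:

```python
# Hypothetical sketch of what an "AI" coding tool might send per request,
# versus what the user typed. All names, prompts, and sizes are illustrative.
SYSTEM_PROMPT = "You are a careful coding assistant. Follow the project style guide..."
INJECTED_CONTEXT = [
    "# contents of src/main.py ...",
    "# contents of tests/test_main.py ...",
]  # files, search results, etc. that the tool adds automatically

def build_messages(user_prompt: str) -> list[dict]:
    """Everything returned here counts as billed input tokens, not just user_prompt."""
    stuffed = "\n\n".join([SYSTEM_PROMPT, *INJECTED_CONTEXT, f"Question: {user_prompt}"])
    return [{"role": "user", "content": stuffed}]

messages = build_messages("Why does this test fail?")
print(len(messages[0]["content"]))  # far more characters (and tokens) than the user typed
```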

[–] piccolo@sh.itjust.works 1 points 1 day ago (1 children)

Yes, reasoning models... but I don't think they would charge for that... that would be insane, but AI executives are insane, so who the fuck knows.

[–] MotoAsh@piefed.social 1 points 7 hours ago* (last edited 7 hours ago)

Not the models. AI tools that integrate with the models. The "AI" would be akin to the backend of the tool. If you're using Claude as the backend, the tool would be asking Claude more questions and repeated questions via the API. As in, more input.
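
For example, here's a rough sketch of that kind of tool loop, assuming the Anthropic Python SDK's messages.create call; the model name, prompts, and the "double-check" step are made up. Each API call re-sends the entire conversation so far, so the billed input tokens grow per user question:

```python
# Hypothetical sketch of an agent-style tool that uses Claude as its backend.
# Assumes the Anthropic Python SDK's messages API; prompts and model name are illustrative.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
history = []                    # the tool keeps the whole conversation
total_input_tokens = 0

def ask(prompt: str) -> str:
    """Send one turn; the ENTIRE history is re-sent as input every time."""
    global total_input_tokens
    history.append({"role": "user", "content": prompt})
    response = client.messages.create(
        model="claude-sonnet-4-20250514",  # placeholder model name
        max_tokens=1024,
        messages=history,
    )
    answer = response.content[0].text
    history.append({"role": "assistant", "content": answer})
    total_input_tokens += response.usage.input_tokens  # billed on every call
    return answer

# One user question can become several API calls: the original prompt,
# then a follow-up asking the model to double-check its own answer.
draft = ask("Summarize this bug report: ...")
checked = ask("Re-read your summary above and fix any hallucinated details.")
print(total_input_tokens)  # grows much faster than the length of the user's own prompts
```

Because `messages` carries the whole history, the second call pays input tokens again for everything the first call already sent, on top of whatever context the tool stuffs in up front.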

[–] deHaga@feddit.uk 3 points 2 days ago

Compute costs?