this post was submitted on 26 Dec 2025
345 points (98.9% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

[–] damnthefilibuster@lemmy.world 43 points 4 months ago (3 children)

They could just… sell us some…

[–] GreenKnight23@lemmy.world 26 points 4 months ago (2 children)

I would only accept them if they were 75% off MSRP or more.

[–] jj4211@lemmy.world 3 points 4 months ago (1 children)

Sure thing, that datacenter GPU is now like 3000 dollars…

[–] very_well_lost@lemmy.world 1 point 4 months ago

Probably closer to 6k. Datacenter GPUs start at around 25 grand a pop.
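The two figures in this exchange are consistent: a 75%-off sale on a $25,000 list price lands right around the "closer to 6k" estimate. A quick sketch of that arithmetic (the $25,000 MSRP is the commenter's ballpark, not an official price):

```python
# Discount math from the thread: "75% off MSRP or more" applied to
# the commenter's ballpark datacenter-GPU list price of $25,000.
msrp = 25_000
discount = 0.75
sale_price = msrp * (1 - discount)
print(sale_price)  # 6250.0 -- roughly the "closer to 6k" figure
```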

[–] pulsewidth@lemmy.world 14 points 4 months ago (2 children)

What would we do with them?

AI GPUs are not the same as consumer GPUs - they're not PCIe x16 cards, and they don't even come in a socketed card format. They're generally supplied already mounted on a motherboard in a prefab blade-format server build.

[–] damnthefilibuster@lemmy.world 8 points 4 months ago

I wouldn't mind having a local LLM or GenAI setup.

[–] picnic@lemmy.world 1 points 4 months ago

I have had GRID GPUs. I'd love to have a few accelerators.

[–] skozzii@lemmy.ca 4 points 4 months ago (1 children)