Reminder: there are no video outputs on these chatbot data center processors driving up the prices of graphics cards.

So they can't even be sold as used GPUs to crash the consumer GPU market when the AI bubble pops.

This is a reminder that businesses aren't "money focused calculation machines that optimize for the maximum possible profit." They don't worry about every little dollar, they just print money and use it to control you.

Raising prices for you is the goal, not a byproduct of some other smarter plan.

Some people don't need the rest of this post, and it's very long, so I'll put it in a comment.

[–] thatsTheCatch@lemmy.nz 12 points 2 days ago (2 children)

I'm confused — a GPU's main function is to do lots of calculations in parallel, vs a CPU, which does one thing at a time (simplistically).

GPUs aren't used solely for video; it's just that graphics are an excellent use case for this type of processing.

So I don't think AI companies are buying GPUs for video output; it's because they can process lots of training calculations in parallel, like how Bitcoin miners use GPUs even though there's no video involved in that.

[–] TehPers@beehaw.org 17 points 2 days ago (1 children)

To be more specific here, GPUs are really, really good at linear algebra. They multiply matrices and vectors as single operations. CPUs can often do some SIMD operations, but not nearly as well or as many.

Video games do a lot of LA in order to render scenes. At the bare minimum, each model vertex is multiplied by matrices to convert from world space to clip space, NDC, screen space, etc., and those matrices are calculated from the properties of your camera and projection type.
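As a rough sketch (the camera numbers here are made up, just to show that each conversion is a single matrix multiply):

```python
import numpy as np

# Hypothetical camera/projection parameters (illustrative only).
fov_y = np.radians(60.0)   # vertical field of view
aspect = 16 / 9            # width / height
near, far = 0.1, 100.0

# An OpenGL-style perspective projection matrix (column-vector convention).
f = 1.0 / np.tan(fov_y / 2.0)
projection = np.array([
    [f / aspect, 0.0, 0.0,                          0.0],
    [0.0,        f,   0.0,                          0.0],
    [0.0,        0.0, (far + near) / (near - far),  2 * far * near / (near - far)],
    [0.0,        0.0, -1.0,                         0.0],
])

# A view matrix that just pulls the world 5 units toward -Z (camera at z = +5).
view = np.eye(4)
view[2, 3] = -5.0

# One model vertex in world space, as a homogeneous column vector.
vertex_world = np.array([1.0, 2.0, 0.0, 1.0])

# World -> camera -> clip space: each conversion is one matrix multiplication.
vertex_clip = projection @ view @ vertex_world

# Perspective divide gives normalized device coordinates (NDC).
vertex_ndc = vertex_clip[:3] / vertex_clip[3]
print(vertex_ndc)
```

A GPU does that multiply for millions of vertices per frame, in parallel.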

ML also does a lot of LA. Neural nets, for example, are literally a sequence of matrix multiplications. A very simple neural net works by taking a vector representing an input (or a matrix for multiple inputs), multiplying it by a matrix representing a layer's weights, passing the result through an activation function, and then doing that a bunch more times.
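For example, a toy forward pass in numpy (random weights and made-up layer sizes, not any real model) is just matrix multiplies plus an activation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes; the weights are random stand-ins, not a trained model.
w1 = rng.standard_normal((4, 8))   # input (4 features) -> hidden (8 units)
w2 = rng.standard_normal((8, 3))   # hidden (8 units) -> output (3 values)

def relu(x):
    # A common activation function: elementwise max(0, x).
    return np.maximum(x, 0)

# A batch of 2 inputs, each with 4 features (rows are samples).
inputs = rng.standard_normal((2, 4))

# Forward pass: multiply by a weight matrix, apply the activation, repeat.
hidden = relu(inputs @ w1)   # (2, 4) @ (4, 8) -> (2, 8)
output = hidden @ w2         # (2, 8) @ (8, 3) -> (2, 3)
print(output.shape)
```

Scale those matrices up to billions of parameters and that's what the data center GPUs are crunching.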

Both workloads want GPUs, but they need different things from them. AI wants GPUs with huge amounts of memory (for these huge models), optimized for data center usage (with cooling designed for racks). Games want GPUs that don't need terabytes of VRAM, but that should be fast at calculating, fast at transferring data between CPU and GPU, and capable of running many shader programs in parallel (so that you can render more pixels at a time, for example).

[–] iloveDigit@piefed.social 1 points 1 day ago (1 children)

That doesn't mean it would be nearly useless to just add video outputs to neural net cards, though.

Used data center GPUs might be equivalent to a low-end or outdated GPU with extra VRAM, but there would be so many of them on the market that you'd see stuff like games being optimized differently to make use of them.

[–] TehPers@beehaw.org 9 points 1 day ago (1 children)

Nvidia sold many of their data center GPUs as full server racks. Those GPUs aren't in a form factor you can use with a traditional PC; they simply cannot slot into a PCIe slot because they don't have that kind of interface. Look up the DGX B200, which ships in a form factor intended for rack mounting and has 8 GPUs alongside two CPUs and everything else needed to run it as a server. These GPUs don't support video output. It's not just that they lack an output port; they don't even have the software for it, because they aren't capable of rendering graphics at all (which makes you wonder why they're even called "GPUs" anymore).

[–] iloveDigit@piefed.social 0 points 1 day ago* (last edited 1 day ago) (1 children)

I try not to call them GPUs, though it's hard to avoid.

But I didn't know they're not even capable of rendering graphics at a deeper level than just not having a video output.

It sounds like you definitely know some stuff I don't, but wouldn't it be smart for these companies to bid a bit more if they could, to make these builds with more resellable parts instead of using these crazy server rack combo platters?

I still think it's an economy controlled top down by the authorities that makes this "profitable," and when you boil it down it's just a fancy mathy story to distract from them making special stuff for themselves they don't want to share with us

[–] TehPers@beehaw.org 7 points 1 day ago

wouldn't it be smart for these companies to bid a bit more if they could, to make these builds with more resellable parts instead of using these crazy server rack combo platters?

Their customers don't care whether they're resellable. They just want GPUs.

We aren't their customers, and I mean this in the most literal sense possible. You can't buy these. They only sell them to big companies.

Yes, it's shit.

[–] iloveDigit@piefed.social 3 points 2 days ago* (last edited 2 days ago) (1 children)

This all applies to cryptocurrency miners too.

In fact, it might be even more relevant there: crypto miners compete so hard on electricity costs that they definitely have to plan on liquidating equipment once it gets old enough, even if it still works. I think a lot of miners still use regular consumer GPUs to this day, because a specialized card with no video output can depreciate from $1000+ to worthless almost instantly. There just end up being no buyers.

If this were all real business and not just the authorities controlling people, Nvidia would have competition offering similar cards with video outputs for a few cents more, because that product would make more business sense. But instead, it would supposedly be super expensive to add video outputs to specialized cards, because it would "cannibalize sales" of graphics cards later (i.e. give savings to consumers).

[–] FaceDeer@fedia.io 1 points 1 day ago (1 children)

No major cryptocurrency has used GPUs for mining for many years. Bitcoin uses completely custom ASICs and Ethereum switched away from proof of work entirely.

[–] iloveDigit@piefed.social 1 points 1 day ago (1 children)

Incorrect. Monero and others still use GPU based mining

[–] FaceDeer@fedia.io 1 points 1 day ago (1 children)

I said no major cryptocurrency. Monero's got a market cap of $8 billion, it's small fry.

[–] iloveDigit@piefed.social 1 points 1 day ago* (last edited 1 day ago) (1 children)

Incorrect again. You mentioned Ethereum, which nobody cares about; you can't call Monero "not major" after that. The only cryptocurrencies that matter are Bitcoin, doggie coin, and Monero.

If market cap was relevant then crypto veterans like me would care about Ethereum

[–] FaceDeer@fedia.io 3 points 1 day ago (1 children)

Ethereum's got a market cap of $350 billion, and it's where all the new development is going on; according to the Electric Capital Developer Report, it has by far the most developers working on and with it. Approximately 65% of all new code written in the entire crypto industry is written for Ethereum or its Layer 2 scaling solutions (like Arbitrum, Optimism, and Base).

It's spelled "Dogecoin," by the way.

[–] iloveDigit@piefed.social 0 points 1 day ago* (last edited 1 day ago) (1 children)

The "dogecoin" spelling has been ruined by people calling it "doj coin"

And market cap isn't relevant, nor is whoever "electric capital developer" is or whatever chat bots you're calling "the most developers"

Bitcoin, doggie coin, and Monero are the only ones standing the test of time so far. Ethereum is "proof of stake" now

[–] moomoomoo309@programming.dev 2 points 1 day ago* (last edited 1 day ago) (1 children)

That is, in fact, how "dogecoin" is pronounced (doj coin). Here's one of the two creators of dogecoin saying as much (and why!): https://youtu.be/kVDcOI0-gdQ

[–] iloveDigit@piefed.social -1 points 1 day ago (2 children)

I stopped taking the creators' opinions on doggie coin seriously when they started calling it doj coin.

Btw, notice that the Monero community is way more active than Ethereum or doggie coin in decentralized platforms like piefed/Lemmy or nostr

[–] moomoomoo309@programming.dev 1 points 37 minutes ago (1 children)

Okay, but, did you actually watch the video? It's based on Homestar Runner and how they intentionally mispronounced something - the mispronunciation is entirely intentional.

[–] iloveDigit@piefed.social 1 points 31 minutes ago* (last edited 30 minutes ago)

I don't remember the meme having anything to do with Homestar Runner; what I remember before doggie coin was pictures of dogs (mainly Shiba Inus) with horribly misspelled uplifting messages written in colorful Comic Sans.

[–] boonhet@sopuli.xyz 1 points 12 hours ago (1 children)

Why would it be pronounced doggie if it's named doge?

Masterful trolling in this entire thread unless you're being serious.

[–] iloveDigit@piefed.social 1 points 10 hours ago (1 children)
[–] boonhet@sopuli.xyz 1 points 10 hours ago (1 children)

So you call the original meme dog-e too? That doesn't roll off the tongue nearly as nicely as doge does

[–] iloveDigit@piefed.social 1 points 9 hours ago* (last edited 9 hours ago) (1 children)

The original meme does seem to be a misspelling of doggie, not a use of the old English term pronounced "doj"

I don't see why anyone would call it "doj"

[–] boonhet@sopuli.xyz 0 points 9 hours ago (1 children)

It's misspelled on purpose so you'd read it differently lol

Do you have a diagnosis yet, or are you still rolling with undiagnosed 'tism?

[–] iloveDigit@piefed.social 0 points 9 hours ago* (last edited 9 hours ago) (1 children)

Weak gaslighting attempt

Misspelling doggie as doge doesn't change the pronunciation to "doj"

There would either be no change at all or a change to something more like "dowg" because it's understood that the word is still supposed to be based on "dog." There's no reason you'd lose the G sound from "dog" here

Learn how spelling and pronunciation work

And if you want to gaslight me, I'm generally just not very susceptible, but I at least take people more seriously on nostr because there are no bans there. Would make more sense to try there

[–] boonhet@sopuli.xyz 1 points 8 hours ago (1 children)

Literally nobody pronounces "doge" as "doggie" or "dog". Everyone I know pronounces it as "doge".

It's an intentional misspelling, you're supposed to pronounce it wrong (compared to "dog") intentionally.

[–] iloveDigit@piefed.social 2 points 8 hours ago

So again, that would be something like "dowg" or "doggeh" instead of "doggie"

Not "doj"

I never heard anyone read it as "doj" until about when Elon Musk started pushing that pronunciation, which was based on an old English word for a feudal lord or something, not how people automatically read that spelling

I feel like you're just wasting my time on purpose. Why would anyone actually read it like that? You're probably trolling.