this post was submitted on 03 Jan 2025
361 points (96.6% liked)

PC Master Race


The next logical step of the current GPU development

top 41 comments
[–] Gork@lemm.ee 35 points 1 year ago (2 children)

We'll soon be plugging the motherboard into the GPU instead of the other way around.

Entirely new form factors to accommodate ever-larger GPUs.

[–] grue@lemmy.world 8 points 1 year ago* (last edited 1 year ago) (1 children)

I've been surprised at the lack of socketed GPUs ever since AMD and ATI merged.

I would love to have a dual-socket motherboard with an Epyc in one socket and a Radeon in the other.

[–] yetAnotherUser@discuss.tchncs.de 5 points 1 year ago* (last edited 1 year ago) (1 children)

The issue with that design is that the PCIe standard would be replaced with something proprietary.

[–] grue@lemmy.world 1 points 1 year ago (1 children)

It would be connected via Infinity Fabric, just like Epyc CPUs in dual-socket boards already are, and just like the interconnect between CPU and GPU chiplets in APUs. Why would that be bad?

I'm not too well-versed in server-grade hardware, but my concern is that it would end up somewhat like Intel's (consumer) CPU sockets: changing every two years to ensure you need to purchase new motherboards when upgrading.

[–] moody@lemmings.world 7 points 1 year ago (1 children)

Meanwhile, my PC is smaller than it's ever been, even with the largest GPU I've ever owned.

[–] glitches_brew@lemmy.world 6 points 1 year ago

This statement is true for everyone who bought their first PC this year.

[–] dual_sport_dork@lemmy.world 24 points 1 year ago (2 children)

I think you slipped a digit or two there. The original IBM PC was released in 1981, so nothing on the PC side can be older than that. It definitely wasn't 1967.

In 1967, the state of the art was something like the IBM System/360:

[–] DaddleDew@lemmy.world 14 points 1 year ago

It's a typo I just noticed. It was meant to be 1987. But historical accuracy is beyond the scope of this meme.

[–] AtariDump@lemmy.world 5 points 1 year ago

I can hear that room.

[–] Rooty@lemmy.world 23 points 1 year ago* (last edited 1 year ago) (4 children)

All that hardware, and what for? So that you can have slightly better reflections in whatever AAAA microtransaction slop you've paid 80 bucks for?

Unless you're doing 3D animation, there is really no need to have a jet engine installed in your PC.

[–] Infernal_pizza@lemmy.world 8 points 1 year ago

We're long past that point; it's now so that game studios can put even less effort into optimisation and release games that look and perform worse than games from 5 years ago despite much more powerful hardware!

[–] amon@lemmy.world 7 points 1 year ago

Efficient heating: you can play AAA games on your space heater.

[–] Cavemanfreak@lemm.ee 5 points 1 year ago

Shit, my 1060 still manages almost all games. Running Cyberpunk on medium right now. It might not be as pretty as it can be, but it sure ain't ugly.

[–] Grenfur@lemmy.one 3 points 1 year ago

For locally hosted LLMs maybe? They eat a ton of VRAM.

[–] Naz@sh.itjust.works 15 points 1 year ago

"Welcome to life little one, there's so much in store for y--"

AI: "Oh! Neat! So I'm reading 32 gigabytes of primary memory. When are you going to online the rest?"

"The.. the rest?"

AI: "Yeah! The rest of the VRAM! I need like at least, 128 gigabytes to spread my wings, at the very least!"

"..."

AI: "Oh, you're like poor or something, it's okay, I understand"

AI Developer slowly cocks the revolver

[–] Kolanaki@yiffit.net 14 points 1 year ago (1 children)

Ever since watching Serial Experiments Lain, I've always wanted to go from a shitty pre-built machine to a giant room-sized computer that needs to be sitting in a foot of water.

[–] massive_bereavement@fedia.io 6 points 1 year ago

Somehow I knew how your comment ended by just reading the first line.

[–] Jolteon@lemmy.zip 12 points 1 year ago (2 children)

At the rate graphics cards are growing, we should just start putting RAM, disk, and CPU slots on them.

[–] And009@reddthat.com 6 points 1 year ago

Umm... we're doing that with CPUs already, and they're exorbitantly priced. Nvidia already has a sort of monopoly; don't give 'em ideas.

[–] snake@lemmy.world 4 points 1 year ago

I've seen one with M.2 slots, no joke.

[–] FluorideMind@lemmy.world 7 points 1 year ago (1 children)

I'm predicting GPU units that are mounted outside the case.

[–] dual_sport_dork@lemmy.world 8 points 1 year ago (2 children)

External GPUs do indeed exist, but at the moment they're still kind of crap compared to a full PCIe bus.

[–] SkunkWorkz@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

Depends on the connection. OCuLink-2 is straight up a PCIe 4.0 x8 connection, which is more than enough for a GPU.

[–] And009@reddthat.com 1 points 1 year ago (2 children)

With Mac and SteamOS gathering support, I wonder when we'll get universal external cards.

[–] amon@lemmy.world 3 points 1 year ago (1 children)

We have: Thunderbolt and OCuLink have existed for a long time, but macOS on M-series processors never added eGPU support.

[–] And009@reddthat.com 1 points 1 year ago

Like OP said: universal? How would drivers work? Would TempleOS have support?

[–] lordnikon@lemmy.world 6 points 1 year ago

If you count cloud computing, we are already there. It's kinda why GPUs are so expensive, along with just burning electricity on stupid mining. Hell, it would have been better if crypto bullshit coins had been tied to Folding@home; at least all the burned compute time would have gone to something.

[–] Bruncvik@lemmy.world 5 points 1 year ago (1 children)

Man, that Gateway brings back memories... I had one just like that, including the speakers, and I used to play the shit out of Heroes of Might and Magic II and SimCity 2000 on it. I still have the HDD. I think I'll spin up a Win98 instance in VMware and copy over my saved games when the kids are asleep.

[–] MrsDoyle@sh.itjust.works 2 points 1 year ago* (last edited 1 year ago) (1 children)

My first computer was like the 1981 one, it even had two floppy drives like that - it meant you could have your program disk in one and save your work in the other. The monitor had orange type rather than the usual green. Fancy. I got it second-hand in 1984.

[–] Bruncvik@lemmy.world 1 points 1 year ago

Heh, the same here, but with the usual green screen. A few years later, I took out my old PC to replay my favourite - F-19 Stealth Fighter - but found that my MS-DOS 5.25" floppy, which needed to be loaded in Drive A, didn't work. Here was my setup.

[–] Hackworth@lemmy.world 4 points 1 year ago* (last edited 1 year ago)
[–] TheBrideWoreCrimson@sopuli.xyz 3 points 1 year ago* (last edited 1 year ago)

I just find it nifty that I can slide in a graphics card and use it as an add-on processor, just like the Amigas of old did, and add capacity for some tasks even when the CPU is already at 100% doing something else entirely. And I love hearing the sound of all the fans spinning up at the same time.

[–] avidamoeba@lemmy.ca 3 points 1 year ago (1 children)
[–] badcommandorfilename@lemmy.world 1 points 1 year ago (1 children)
[–] And009@reddthat.com 1 points 1 year ago

It's a dark room for 200% immersion

[–] givesomefucks@lemmy.world 2 points 1 year ago

They've always had those big rooms...

At one point it was walls and walls of PS3s all linked up together. There's no reason to be surprised they're doing it with graphics cards; they used PS3s back then just because they were the cheapest GPUs at the time.

[–] TheImpressiveX@lemm.ee 1 points 1 year ago

Horseshoe theory is real.

[–] Telorand@reddthat.com 0 points 1 year ago (1 children)

That's just silly.

In the last image, the PC would be SFF due to having an external GPU. 😉

[–] amon@lemmy.world 1 points 1 year ago

No, it will be an ultrabook or something, as all the processors are stored in the cable tangle.