this post was submitted on 26 Jan 2026
225 points (99.1% liked)

PC Gaming

[–] atomicbocks@sh.itjust.works 24 points 2 months ago (1 children)

This looks really cool, but I have immediate concerns about the lack of ZIF socket for something designed to take 30 year old chips.

[–] SpikesOtherDog@ani.social 3 points 2 months ago (1 children)

The ZIF socket we know is patented. You would need to reinvent it to use it.

[–] atomicbocks@sh.itjust.works 1 points 2 months ago (1 children)

I would be surprised if there was anything on the board that didn’t have a patent covering it.

[–] SpikesOtherDog@ani.social 6 points 2 months ago (1 children)

ATX itself is a joint standard, created specifically so that vendors have a common specification to build against, which relaxes a lot of the patent concerns.

You need to pay to use HDMI from what I understand, but not DisplayPort.

That being said, I would think that integrating components means you are buying parts that have been patented, not licensing the patents themselves. That will increase the cost of the boards, as would the modularity needed for users to swap parts themselves.

I do know that soldered connections are much more stable than pin-to-socket connections. Many times I have fixed a computer by reseating memory or a card; corrosion builds up between socketed contacts, which is much less common with properly placed and soldered chips. Even so, I personally prefer my memory and CPU modular.

If we do use modular components, we should first look into making a board that ONLY has CPU and memory, placing the remainder of the components on PCIe cards.

[–] atomicbocks@sh.itjust.works 1 points 2 months ago (1 children)

I don’t think populating the board with a ZIF instead of a pin hole socket would have changed anything.

[–] SpikesOtherDog@ani.social 1 points 2 months ago

Yeah, I talked my way into realizing the component is probably the patented part and it's being resold.

[–] SharkAttak@kbin.melroy.org 3 points 2 months ago (1 children)

The chip shortage makes people take desperate measures.

[–] Kolanaki@pawb.social 1 points 2 months ago (2 children)

Could I make RAM modules using breadboards? 🤔

[–] SharkAttak@kbin.melroy.org 1 points 2 months ago

Yes, but you could also make your own chips.

[–] Lee@retrolemmy.com 1 points 2 months ago

Ignoring signal integrity issues like noise, switching speed, and the added resistance and capacitance compared to a PCB with soldered parts, yes, you could make a memory module that operates at slow speeds on a breadboard. I think most hardware engineering students will have wired up memory chips on a breadboard at some point (my school did, at least, when teaching memory-mapped hardware), granted those weren't built to any particular PC spec.
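For anyone curious what "wiring up memory on a breadboard" amounts to logically, here is a toy Python sketch of the asynchronous SRAM bus cycle you would bit-bang by hand. The class name and the 32K x 8 sizing (like a classic 62256 chip) are my own illustration, not anything from the article; real chips also impose timing constraints (access and write-pulse times) that this model ignores entirely.

```python
# Toy model of an asynchronous parallel SRAM as seen from the bus:
# drive the address lines, then strobe the control lines
# (/CE chip enable, /OE output enable, /WE write enable) to move data.

class BreadboardSRAM:
    def __init__(self, address_bits=15):      # 32K x 8, like a 62256
        self.cells = [0] * (1 << address_bits)
        self.mask = (1 << address_bits) - 1   # extra address bits are ignored

    def write(self, address, data):
        # /CE low, /WE low, /OE high: latch the data bus into the cell
        self.cells[address & self.mask] = data & 0xFF

    def read(self, address):
        # /CE low, /OE low, /WE high: cell drives the data bus
        return self.cells[address & self.mask]

sram = BreadboardSRAM()
sram.write(0x1234, 0xAB)
print(hex(sram.read(0x1234)))  # -> 0xab
```

On a real breadboard each of those method calls becomes a handful of GPIO writes plus settling time, which is exactly why the result only runs at slow speeds.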

Before you think "why doesn't someone make an open-source PCB for modern RAM to help the shortage": the shortage is in the memory chips that go on the PCBs, not the boards themselves. What this does mean is that someone could, in theory, find cheap broken memory modules and combine their working parts into good ones.