[-] bappity@lemmy.world 134 points 5 months ago

ANTI UPGRADE?? WHAT THE FUCK

[-] aard@kyu.de 190 points 5 months ago

Intel is well known for requiring a new board for each new CPU generation, even if it is the same socket. AMD on the other hand is known to push stuff to its physical limits before they break compatibility.

[-] neo@lemy.lol 27 points 5 months ago

But why? Did Intel make a deal with the board manufacturers? Is this tradition from the days when they built boards themselves?

I thought they just didn't care and wanted as few restrictions on their chip design as possible, but if this actually works without drawbacks, that theory is out the window.

[-] A_Very_Big_Fan@lemmy.world 46 points 5 months ago

Just another instance of common anti-consumer behavior from multi billion dollar companies who have no respect for the customers that line their pockets.

[-] radau@lemmy.dbzer0.com 20 points 5 months ago

They used to dominate the consumer market prior to Ryzen so might have something to do with it but I got no evidence lol

[-] empireOfLove2@lemmy.dbzer0.com 16 points 5 months ago

Intel also sells the chipset and the license to the chipset software; the more boards get sold, the more money they make (as well as their motherboard partners, who also get to sell more, which encourages more manufacturers to make Intel boards and not AMD)

[-] tabular@lemmy.world 8 points 5 months ago* (last edited 5 months ago)

There are many motherboard manufacturers but only 2 CPU manufacturers (for PC desktop). Board makers don't "make deals" so much as have the terms dictated to them. Even graphics card manufacturers made them their bitch back when multi-GPU was a thing - it was they who had to sell their Crossfire/SLI technology on their motherboards.

[-] FiskFisk33@startrek.website 2 points 5 months ago

guess who sells the chipsets to the motherboard manufacturers

[-] just_another_person@lemmy.world 42 points 5 months ago

They've been pulling this shit since the early days. Similar tricks were employed in the 486 days to swap out chips, and again in the Celeron days. I think they switched to the slot style intentionally to keep selling chips to a point lol

[-] bappity@lemmy.world 17 points 5 months ago* (last edited 5 months ago)
[-] umbrella@lemmy.ml 7 points 5 months ago

thats why we are in dire need of open source hardware.

[-] bruhduh@lemmy.world 9 points 5 months ago

We have open source designs (RISC-V, and there are open GPU designs too) but we don't have open source manufacturing capability yet

Are there any projects to develop that capability that you know of?

[-] bruhduh@lemmy.world 2 points 5 months ago

No, there isn't yet; that's the most I could find, but it's not the machines

[-] umbrella@lemmy.ml 3 points 5 months ago

i dream of a world where the process gets cheap enough, like pcb design, where you can just submit the design you want and they will fab it out for you.

with more players coming into the game because of sanctions, i hope we are now on the path.

[-] bruhduh@lemmy.world 3 points 5 months ago

Yes, i hope so too. As for now, semiconductor lithography at home is impossible due to how big and complex these machines are, so i have the same opinion as you

[-] tal@lemmy.today 2 points 5 months ago

https://www.cia.gov/readingroom/docs/DOC_0000498114.pdf

Soviet Computer Technology: Little Prospect for Catching Up

We believe that there are many reasons why the Soviets trail the United States in computer technology:

  • The Soviets' centrally-planned economy does not permit adequate flexibility to design or manufacturing changes frequently encountered in computer production; this situation has often resulted in a shortage of critical components, especially for new products.

[-] WhatAmLemmy@lemmy.world 11 points 5 months ago* (last edited 5 months ago)

If your only response to criticism of capitalism is ((communism)), you may just be a cog in the corporate propaganda machine.

[-] ZombiFrancis@sh.itjust.works 2 points 5 months ago

I mean they went with a literal CIA link.

[-] grue@lemmy.world 14 points 5 months ago

IIRC, the slot CPU thing was because they wanted to get the cache closer to the processor, but hadn't integrated it on-die yet. AMD did the same thing with the original Athlon.

On a related note, Intel's anticompetitive and anti- consumer tactics are why I've been buying AMD since the K6-2.

[-] Evilcoleslaw@lemmy.world 5 points 5 months ago* (last edited 5 months ago)

They had integrated the L2 on-die before that already with the Pentium Pro on Socket 8. IIRC the problem was the yields were exceptionally low on those Pentium Pros and it was specifically the cache failing. So every chip that had bad cache they had to discard or bin it as a lower spec part. The slot and SECC form factor allowed them to use separate silicon on a larger node by having the cache still be on-package (the SECC board) instead of on-die.

[-] just_another_person@lemmy.world 2 points 5 months ago

AMD followed suit for the memory bandwidth part from the K6-2 architecture. Intel had no reason to do so.

[-] turmacar@lemmy.world 4 points 5 months ago* (last edited 5 months ago)

It's been at least since the "big iron" days.

Technician comes out to upgrade your mainframe and it consists of installing a jumper to enable the extra features. For only a few million dollars.

[-] Bezier@suppo.fi 32 points 5 months ago

But otherwise upgrade parts would be too affordable!

[-] bappity@lemmy.world 10 points 5 months ago
[-] MonkderDritte@feddit.de 5 points 5 months ago

No, "user security".

[-] Bezier@suppo.fi 74 points 5 months ago

I see pencil drawn traces are back on the menu

[-] Badeendje@lemmy.world 14 points 5 months ago* (last edited 5 months ago)

I used to use the liquid for repairing rear window heating and a triple-0 brush on the Athlon processors. I unlocked the CPUs of all my buddies.. worked perfectly.

[-] Evil_Shrubbery@lemm.ee 4 points 5 months ago

Good old single core days.

[-] mlg@lemmy.world 44 points 5 months ago

Turns out, the difference in the socket is just a few pins here and there, and you can make an 8th or 9th generation Coffee Lake CPU work on your Z170/270 board if you apply a few Kapton tape fixes and mod your BIOS.

Modders giving me a new reason to keep my ye olde z170 mobo instead of just making a new machine with all the nice hardware
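The "just a few pins here and there" part is basically what the modders worked out by diffing the two socket revisions. A toy sketch of that idea (the pin names and assignments below are made up for illustration, not real LGA1151 data):

```python
# Hypothetical illustration: diff two socket pin maps to find the handful
# of pins whose assignment changed between revisions. Pin names and roles
# here are invented, not actual LGA1151 pinout data.
lga1151_v1 = {"AB1": "VCC", "AB2": "GND", "AB3": "RSVD", "AB4": "SKTOCC#"}
lga1151_v2 = {"AB1": "VCC", "AB2": "GND", "AB3": "VCCIO", "AB4": "SKTOCC#"}

def diff_pinout(old, new):
    """Return pins present in both maps whose assignment differs."""
    return {pin: (old[pin], new[pin])
            for pin in old
            if pin in new and old[pin] != new[pin]}

changed = diff_pinout(lga1151_v1, lga1151_v2)
print(changed)  # only a handful of pins actually differ
```

The mismatched pins are the ones you'd isolate with Kapton tape or bridge; the rest line up as-is.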

[-] SpiceDealer@lemmy.world 35 points 5 months ago

Hackers are the only saving grace in this increasingly dystopian world.

[-] blarth@thelemmy.club 32 points 5 months ago

Reminds me of drawing lines on old AMD processors with graphite pencils.

[-] arcosenautic@lemmy.world 28 points 5 months ago

When will it stop? No, really?

[-] Kecessa@sh.itjust.works 52 points 5 months ago

Vote with your wallet, go AMD

[-] Appoxo@lemmy.dbzer0.com 3 points 5 months ago

Until AMD does the same and it's back to square one?

[-] Pantsofmagic@lemmy.world 1 points 5 months ago

It's really unfortunate they kinda screwed over Threadripper customers so badly in this way, but they're still the lesser evil by a country mile.

[-] adespoton@lemmy.ca 30 points 5 months ago

It’s in the article; newer gen chips will have extra DRM that will prevent the hacks from working.

Oh, you meant when will the anti-hacks stop?

Bless your heart….

[-] Adanisi@lemmy.zip 4 points 5 months ago

DRM for CPUs.

All normal, nothing to see here, folks!

[-] bloodfart@lemmy.ml 13 points 5 months ago

That’s cool, but is there a subset of features or CPU-bound operations or something that makes it worth going through the trouble just to run a faster(?) CPU with slower memory?

[-] BombOmOm@lemmy.world 7 points 5 months ago

Ah intel, never change!

[-] just_another_person@lemmy.world 6 points 5 months ago

Stellar work.

[-] dan1101@lemm.ee 4 points 5 months ago

For some reason I don't think I even knew Intel made motherboards.

[-] themoken@startrek.website 5 points 5 months ago

They don't, but they define the socket the processor slots into and probably did this to market the newer chips as more advanced than they are (by bundling a minor chip upgrade with an additional chipset upgrade that may have more uplift).

I see no other reason to kneecap upgrades like this when upgrading entails the consumer buying more of your product.

[-] JohnEdwa@sopuli.xyz 3 points 5 months ago* (last edited 5 months ago)

That's exactly what it is. I previously had Intel hardware for a few generations, but I got seriously pissed off that every time I wanted to upgrade, they had come up with a new incompatible socket and discontinued everything older so I had to also buy a new motherboard.

I think they might be a bit better at supporting older sockets these days, but still, too many sockets and incompatible chipsets.

[-] DontNoodles@discuss.tchncs.de 2 points 5 months ago

I wish there was something for HP 800 G3s. I bought them used after a lot of deliberation and would love to keep them running for as long as I can while not losing out on functionality.
