this post was submitted on 23 Mar 2026
14 points (93.8% liked)

Technology


A tech news sub for communists

founded 3 years ago
[–] yogthos@lemmygrad.ml 3 points 3 days ago (1 children)

That's what I'm thinking too. There's no reason you couldn't make a chip like this for a full-blown DeepSeek model, and then just print new chips whenever new models come out. The really nice part is that their approach doesn't need DRAM either: the state of each transistor acts as the weight memory, so it only needs a bit of SRAM, which we don't have a shortage of.
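The trade being described here, weights frozen into the silicon so only activations and the KV cache need fast read/write memory, can be sketched with a back-of-envelope calculation. All shapes and bit-widths below are hypothetical placeholders loosely inspired by a DeepSeek-scale model, not figures from any real spec sheet:

```python
# Back-of-envelope sketch of the memory split for a fixed-weight
# inference chip. Every number here is an illustrative assumption.

def memory_split_gb(params, weight_bits=4, hidden=7168, layers=61,
                    context=4096, kv_bits=8):
    """Rough memory breakdown for a chip with weights baked in.

    params      -- total parameter count (fixed in silicon)
    weight_bits -- assumed quantization width of the printed weights
    hidden, layers, context, kv_bits -- assumed model shape used only
        to size the KV cache (hypothetical values)
    """
    # Weights never change at inference time, so they live in the chip
    # itself and need no DRAM at all.
    weights_gb = params * weight_bits / 8 / 1e9
    # KV cache: two tensors (K and V) per layer, per token of context.
    # This is the part that still needs SRAM (or some writable memory).
    kv_gb = 2 * layers * context * hidden * kv_bits / 8 / 1e9
    return weights_gb, kv_gb

w, kv = memory_split_gb(671e9)
print(f"fixed weights: {w:.1f} GB (printed into the chip)")
print(f"KV cache:      {kv:.2f} GB (writable SRAM)")
```

The point of the sketch: the hundreds of gigabytes are all on the fixed-weight side, while the writable side is only a few gigabytes even at a long context, which is why "a bit of SRAM" could plausibly be enough.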

I'm fully convinced that the whole AI-as-a-service business model is going to be very short-lived. Ultimately, nobody really likes their data going out to some company, or having to pay subscription fees to use the models. If we start getting these kinds of specialized chips, they're going to be a game changer.

[–] CriticalResist8@lemmygrad.ml 1 point 2 days ago (1 children)

I could, however, totally see an economy where the chips themselves, while cheap to produce, cost a premium based on the model and number of parameters.

Because the tech is certainly impressive, and they have a proof of concept. I don't know how scalable this is for them (or others), but it clearly works and shows immediate advantages. If it could integrate with existing consumer hardware, like, say, a PCI card you plug the chip into, swapping chips out when you want to change the model, anybody could easily have this at home.

But with capitalism we'd probably have to settle for DRM'd chips that self-destruct after generating X tokens lol.

[–] yogthos@lemmygrad.ml 1 point 2 days ago

that's a disgustingly plausible scenario