It's not quite that simple. Models you can run locally without paying tens of thousands for the hardware aren't really comparable in quality to frontier models. Even with DeepSeek, you'd need a pretty beefy machine to run the full 671-billion-parameter version. But that's rapidly changing, with new techniques requiring drastically less memory and processing power. I expect that within a couple of years we'll be able to run something comparable to Claude on a regular consumer laptop. And I think at a certain point the model gets good enough that you don't really need a better one. Even if the bleeding edge keeps getting pushed, there will come a point where it's not relevant for most people. But we're not quite there yet today.
Sure, it's not that simple NOW, but the direction that China is taking with open source will eventually overwhelm the profit driven, energy gobbling crap currently on offer to a select few in the US.
Right, that's basically what I'm arguing in the article. I'm just noting that we're not quite there today.