this post was submitted on 27 Feb 2026
185 points (96.5% liked)
Technology
For those who don't want to read several pages of unnecessary text telling you what you probably already know:
So this is just putting some numbers to what a lot of people already guessed. The AI companies are not just buying a ton of RAM to build out their data centers. They aren't buying enough other components to even use that RAM. They're buying it so that no one else can.
I'm just not connecting the dots. The amount they're spending on this is astronomical, and they're burning through cash at a rate they can't sustain, all while fighting for their future against Google, Anthropic, xAI, Perplexity, and others, plus foreign competition like Deepseek that the government can't fully shield them from. They're also competing with the major data center companies themselves, who may want to build data centers for other, non-AI purposes too. And those competitors have deep, deep pockets.
If they don't have a revenue model that actually keeps them afloat, then all their capital expenditures will end up going to benefit someone else.
In other words, the central thesis that they want to choke out competition from on-device models kinda ignores that they're facing a much more immediate, much more pressing threat from their data center competition. It's like trying to corner the market on snow shovels when a hurricane is bearing down.
One important thing worth noting: OpenAI purchased the option to buy that much memory, which was enough to persuade the memory manufacturers to change their own investment decisions for the next 5 years. They're not necessarily going to actually buy it all, and in theory they could sell that option to others. 40% of the market is enough to really move prices, but not enough to actually corner it and exclude others from buying memory. They'll just make memory more expensive for themselves at the same time that they make it more expensive, but not impossible, for their true competitors who are also outfitting data centers.
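To put rough numbers on the "move prices but not corner the market" point, here's a toy constant-elasticity demand model. The elasticity value and base price below are made-up illustrations, not real DRAM market figures; the point is just that pulling ~40% of supply off the open market raises the clearing price sharply without making memory unavailable:

```python
def price_after_supply_shock(base_price, share_removed, elasticity):
    """Toy constant-elasticity model: if the quantity available to
    other buyers falls to (1 - share_removed) of original supply,
    the clearing price scales as remaining ** (-1 / elasticity)."""
    remaining = 1.0 - share_removed
    return base_price * remaining ** (-1.0 / elasticity)

# Hypothetical inputs: $100 per module, demand elasticity of 1.5,
# 40% of supply locked up by one buyer.
print(price_after_supply_shock(100.0, 0.40, 1.5))
```

With these made-up inputs the price lands roughly 40% higher: painful for everyone, including the buyer doing the hoarding, but nowhere near a lockout.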
And market oversight is nonexistent.
It's OpenAI in particular trying to screw everyone else. The wafers they contracted from Samsung and SK Hynix amount to something like 40% of those companies' production. There isn't enough production volume left for the other AI companies to over-order like that.
So this is the strategy of putting four houses on your properties in Monopoly and never upgrading them to hotels, because that way there are no houses left for your opponents to buy.
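That comparison is literal: the standard US Monopoly edition ships with only 32 houses, so four houses on eight properties exhausts the bank, and declining to upgrade to hotels (which would return houses to the bank) keeps opponents from building at all. A quick sanity check of the arithmetic:

```python
HOUSES_IN_BANK = 32   # standard US Monopoly edition
HOTELS_IN_BANK = 12

properties_held = 8          # e.g. a few full color groups
houses_per_property = 4      # max before a hotel upgrade

houses_used = properties_held * houses_per_property
houses_left_for_opponents = HOUSES_IN_BANK - houses_used
print(houses_left_for_opponents)  # 0: the housing shortage is total
```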
Hopefully this accelerates their crash.
Not if they sell it on the surge.
They may have millions of units extra, and that just means they've now become a shitty version of Best Buy as they schlep it all at surge pricing to make back the bank.