this post was submitted on 27 Feb 2026
submitted 1 day ago* (last edited 1 day ago) by Beep@lemmus.org to c/technology@lemmy.world
 

The narrative in AI infrastructure over the last two years has been dominated by the enormous and growing demand for compute capacity and its economic consequences, such as the buildout of data centers and the consequent shortages of key resources such as land, water, power, and copper.

But of all these bottlenecks, memory is by far the most significant. The demand for memory is now outpacing the demand for other drivers of compute capacity. The implications of this will ripple through not just the economics of data centers, but the cost of every single consumer and enterprise hardware device.

In this piece, we unpack the market action around memory prices, its ripple effects across the consumer and industrial electronics market, and the supply and demand curve that is emerging around AI. Critically, we explain why the amount of memory being purchased by AI companies like OpenAI seems to be more than what they need, and how the threat of on-device inference might actually be incentivizing an engineered memory shortage.

[–] daannii@lemmy.world 23 points 18 hours ago (2 children)

Get ready to only own screens. Everything will be processed in the cloud, in data centers destroying your community and livelihood.

And of course you will be paying for every minute of it.

[–] phx@lemmy.world 3 points 2 hours ago

You won't own the content, and you'll have absolutely zero privacy because all of your processing and data will be on somebody else's system.

AI will go from making cute GIFs to fully automated surveillance, ensuring nobody uses those systems for anything not approved by the regime.

[–] Bakkoda@lemmy.world 4 points 7 hours ago (1 children)

Currently contracting at an "automated" manufacturing center in the US, and all of our SCADA traffic is on the global network, currently being routed through the UK because oops. It's hilariously bad: systems designed for millisecond polling are taking 1-30 seconds to react.
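The mismatch described here can be sketched as a simple latency-budget check. This is a hypothetical illustration, not code from any real SCADA stack: the tag-read function and the 1 ms budget are assumptions, and the "slow tag" just simulates WAN-induced delay.

```python
import time

BUDGET_S = 0.001  # system designed for millisecond polling

def poll_once(read_tag):
    """Time a single tag read; return (value, elapsed seconds)."""
    start = time.monotonic()
    value = read_tag()
    return value, time.monotonic() - start

def check_latency(read_tag, samples=5):
    """Poll repeatedly; count reads that blew the budget and track the worst."""
    overruns, worst = 0, 0.0
    for _ in range(samples):
        _, elapsed = poll_once(read_tag)
        worst = max(worst, elapsed)
        if elapsed > BUDGET_S:
            overruns += 1
    return overruns, worst

def slow_tag():
    """Stand-in for a PLC read routed over a WAN: ~5 ms, already 5x over budget."""
    time.sleep(0.005)
    return 42

overruns, worst = check_latency(slow_tag, samples=5)
print(f"{overruns}/5 reads over budget, worst {worst * 1000:.1f} ms")
```

With real round trips of 1-30 seconds instead of 5 ms, every single poll would overrun by three to four orders of magnitude.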

[–] muusemuuse@sh.itjust.works 2 points 7 hours ago

SCADA systems are universally terrible. Clouding them does not resolve that. Fuck whoever decided to do this.