this post was submitted on 13 Jan 2026
314 points (98.2% liked)

  • The AI-driven memory shortage doesn't just affect PCs
  • More capacity is coming, but not before 2027
  • Low-margin budget products are likely to be hit hardest
all 50 comments
[–] cmnybo@discuss.tchncs.de 128 points 6 days ago (4 children)

So start making more dumb TVs and speakers. That way they don't need much memory.

[–] SatansMaggotyCumFart@lemmy.world 69 points 6 days ago (4 children)

I just want a straight-up monitor with no built-in WiFi or Bluetooth or even speakers.

Just display the image I want, when I want it.

[–] galaxy_nova@lemmy.world 33 points 6 days ago

Literally this. Let me use my own fucking box and my own goddamn PC how I want, with a big monitor. Also, put DP on it, fuck HDMI. I don't need trackers in my bloody TV OS. My TV has an OS!!! That's ridiculous. Sorry, this is one of those things I've always been super pissed about.

[–] riskable@programming.dev 6 points 5 days ago

Every modern monitor has some memory in it. They have timing controllers and image processing chips that need DRAM to function. Not much, but it is standard DDR3/DDR4 or LPDDR RAM.
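For scale, a rough back-of-the-envelope sketch (the resolutions are illustrative; actual TCON designs vary in how many frames they buffer):

```python
# Back-of-the-envelope framebuffer sizing for a monitor's scaler /
# overdrive stage. Figures are illustrative, not from any datasheet.

def frame_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    """Raw size of one uncompressed frame in bytes."""
    return width * height * bits_per_pixel // 8

for name, w, h, bpp in [
    ("1080p, 24-bit", 1920, 1080, 24),
    ("4K, 24-bit", 3840, 2160, 24),
    ("4K, 30-bit HDR", 3840, 2160, 30),
]:
    print(f"{name}: {frame_bytes(w, h, bpp) / 2**20:.1f} MiB per frame")

# Overdrive compares against the previous frame, so a 4K panel
# already wants tens of MiB of working DRAM -- small, but real.
```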

[–] LodeMike@lemmy.today 8 points 6 days ago (1 children)
[–] Supervisor194@lemmy.world 9 points 6 days ago

For real, all my TVs are Sceptre. They are not easy to find, but they are incredibly reasonably priced and completely dumb perfection.

If people like us make them popular enough, maybe they'll realize this is a lucrative niche market.

[–] LaunchesKayaks@lemmy.world 6 points 6 days ago

I use old dumb tvs as monitors. They don't have the best picture, but are better and cheaper than anything currently for sale.

[–] YourAvgMortal@lemmy.world 34 points 6 days ago

But how will they play ads‽

[–] Tim_Bisley@piefed.social 8 points 6 days ago (3 children)

Man, I'm really dreading buying a new TV. Been going strong with my plasma for years. I don't need any "smart" features in a TV. From what I understand, you either get a good TV or a "dumb" TV, pick one.

[–] blitzen@lemmy.ca 12 points 6 days ago (1 children)

Buy a good TV, and don't connect it to the internet. Use an Apple TV or Shield device.

[–] adarza@lemmy.ca 18 points 6 days ago (3 children)

televisions of the near future when you first turn them on: "Internet connection and account required to complete initial product set up."

[–] cmnybo@discuss.tchncs.de 9 points 6 days ago

I would immediately return that as defective. I'd rather use that old 1980s portable TV that's been collecting dust in my closet since they shut down the analog TV broadcasts.

[–] f4f4f4f4f4f4f4f4@sopuli.xyz 2 points 5 days ago

These already exist. I've also seen some that technically work but have pop-ups complaining about being offline.

[–] kameecoding@lemmy.world 1 points 5 days ago

My TV channels now come in through an app, and it just recently started working on my Nvidia Shield. Still, I can't really disconnect my Samsung TV from the net, because explaining how to use the TV app to my father is already hard; he couldn't handle using it through the Nvidia Shield.

[–] chillpanzee@lemmy.ml 3 points 6 days ago

Yeah, pretty much. Any decent display on a TV is gonna have all the shit you don't want, but you can (at least for now) just not connect your smart TV to the internet. You'd have a good TV with the "smarts" neutered. But you're still paying for it.

It's a bit like the ATSC broadcast stack. You likely aren't using it, but the industry still makes you pay for it (and it's not cheap).

[–] anon_8675309@lemmy.world 38 points 5 days ago (2 children)

Good. Maybe we’ll get dumb TVs again.

[–] DannyMac@sh.itjust.works 4 points 5 days ago (1 children)

Idk, even the oldest HDTVs used RAM, sadly.

[–] Soup@lemmy.world 1 points 5 days ago (2 children)
[–] Lenggo@lemmy.world 4 points 5 days ago

Ha. Judging by how well most of them run even today, not very much.

[–] DannyMac@sh.itjust.works 1 points 4 days ago

I'm not sure. Current "dumb" TVs do actually exist, just search for commercial TVs. All the major brands make them for commercial applications where the end user has no need for traditional smart apps. Sadly, they cost a bit more (but not by a ridiculous amount) and may not have the latest alphabet soup image tech, but they look fine and work great. They still run an OS, but it's stripped down to the bare essentials, usually just settings and input, not usually any apps. If any apps are installed, they're unobtrusive and geared toward their intended purpose. I've been tempted to buy one, but the larger the TV the wider the price gap. The sweet spot seems to be at 65".

[–] Mynameisallen@lemmy.zip 47 points 6 days ago (2 children)

Here's an idea... let's stop. Let's just fucking stop with AI?

[–] Lost_My_Mind@lemmy.world 23 points 6 days ago (1 children)

votes for MynameisAllen for president

Now if only we knew their name...

[–] fartsparkles@lemmy.world 7 points 6 days ago

Pretty sure it’s My Name Is All En.

Dr En En En.

[–] FenrirIII@lemmy.world 4 points 5 days ago

There's too much imaginary money in AI that tech bros are stealing.

[–] ViscloReader@lemmy.world 22 points 5 days ago (2 children)

That's the thing I keep telling people around me. It's not only RAM itself; too many critical components need RAM.

If RAM prices go up, computer prices go up (and I'm talking ALL kinds of computers, from microcomputers to desktops to servers).

If computer prices go up, costs go up for almost everything.
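A toy model of that cascade, with every number made up purely to show the mechanism:

```python
# Toy model of a DRAM price hike propagating to retail price.
# All numbers are hypothetical, purely to illustrate the mechanism.

def new_retail(retail: float, dram_share: float, dram_multiplier: float) -> float:
    """Retail price after the DRAM cost delta is passed straight through."""
    dram_cost = retail * dram_share
    return retail + dram_cost * (dram_multiplier - 1)

# $500 laptop, DRAM ~8% of the price, DRAM price doubles:
print(f"${new_retail(500, 0.08, 2.0):.2f}")  # $540.00
# $150 budget TV, memory ~10% of the price, memory price triples:
print(f"${new_retail(150, 0.10, 3.0):.2f}")  # $180.00
# The flat dollar delta hurts low-margin budget products the most,
# which matches the article's third bullet point.
```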

[–] kameecoding@lemmy.world 4 points 5 days ago (1 children)

I was thinking about how this will affect cars; all of them have a bigass tablet in the middle nowadays.

[–] mech@feddit.org 4 points 5 days ago* (last edited 5 days ago) (1 children)
[–] GamingChairModel@lemmy.world 5 points 5 days ago

90GB of RAM and NAND combined. I'm guessing most of it is actual persistent storage for all the stuff the infotainment system uses (including imagery and offline map data for GPS, which is probably a big one), rather than actual memory in the desktop-computing sense.

[–] SendMePhotos@lemmy.world 25 points 6 days ago

I don't even fucking use the smart TV functions as it is. Fuck all that noise.

[–] Rentlar@lemmy.ca 14 points 6 days ago (1 children)

I 'member when headphones were just a copper wire to a magnet dressed in plastic, and none of this garbage required memory. Can we go back to that age? Thanks.

[–] artyom@piefed.social 15 points 6 days ago (1 children)

2027 sounds right. No way these fabs don't know this shit is temporary, so it's unlikely they'll increase production.

[–] Sabin10@lemmy.world 4 points 6 days ago (1 children)

Unfortunately it's the people between the manufacturers and the consumers that think this current iteration of AI is the future. They even seem to think we want it and can't wait to pay them for it.

[–] artyom@piefed.social 7 points 6 days ago

No one thinks that. Not the hardware OEMs, not the consumers, not the CEOs, not even the investors. It's all just a grift to see how high it can get before it pops.

[–] notreallyhere@lemmy.world 8 points 6 days ago (1 children)

it's coming for your babies

[–] SreudianFlip@sh.itjust.works 4 points 6 days ago

cf. Children of Men

[–] sundray@lemmus.org 8 points 6 days ago

And the OEMs are like:

[–] sturmblast@lemmy.world 5 points 5 days ago

Just like those popular 3D TVs, right?

[–] coherent_domain@infosec.pub 6 points 6 days ago* (last edited 5 days ago) (2 children)

Question: will AI eventually hurt CPU supply too? Like the memory companies, TSMC also only has finite production capacity.

[–] BeardedGingerWonder@feddit.uk 1 points 5 days ago

I'm beginning to wonder if AI is actually going to set us back technologically. Certainly it seems to be creating or widening a technology divide.

[–] solrize@lemmy.ml 3 points 6 days ago (3 children)

Why do TVs and audio gear use memory? TVs, OK, I can sort of understand a little, but audio? That's still analog, right? Or mostly analog, anyway.

[–] empireOfLove2@lemmy.dbzer0.com 17 points 6 days ago (2 children)

All digital devices will use some amount of memory. Audio devices are all digital these days and only use a DAC (Digital to Analog Converter) to generate the actual audio waveform from a raw sample stream.

On something like a standalone audio amp there still has to be the whole backend to store codec information, menus and settings, and a whole host of other controls and audio processing features that are likely implemented on top of a basic OS and not directly written to a microcontroller. There's more memory than you think.
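For the curious, the raw sample stream really is just numbers at a fixed rate; the DAC turns each one into a voltage. A minimal illustrative sketch:

```python
import math

# One second of a 440 Hz sine as 16-bit PCM samples -- the kind of
# raw sample stream a DAC converts into an analog waveform.
SAMPLE_RATE = 48_000  # samples per second
FREQ_HZ = 440.0

samples = [
    int(32767 * math.sin(2 * math.pi * FREQ_HZ * n / SAMPLE_RATE))
    for n in range(SAMPLE_RATE)
]

# Raw audio is tiny by DRAM standards: 16-bit mono at 48 kHz is
# under 100 KiB per second.
print(f"{len(samples) * 2 / 1024:.0f} KiB per second")  # ~94 KiB
```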

[–] b_tr3e@feddit.org 6 points 6 days ago* (last edited 6 days ago)

"Codec information" is in ROM or implemented in hardware directly. Even studio quality audio interfaces that are DSP comtrolled will need only relatively small amounts of RAM; relatively slow memory for variable space and slightly faster mem for buffering. Both in the megabyte range and far from the speed that GPUs or AI require.

[–] stoy@lemmy.zip 2 points 6 days ago

Depends. If you have analog cabled headphones, like the Meze Empyrean or the Philips Fidelio X2HR, then they're purely analog, but wireless or even digital headphones with USB/Lightning have RAM.
