Not likely. I expect the AI bubble will burst before those software optimization gears even start to turn.
Not just that: all of their AI slop code will be even more unoptimized.
Big AI is a bubble but AI in general is not.
If anything, the DRAM shortages will apply pressure on researchers to come up with more efficient AI models rather than more efficient (normal) software overall.
I suspect that as more software gets AI-assisted development, we'll actually see less efficient software at first, but eventually more efficient software as adoption of AI coding assistance matures (and probably becomes more formalized/automated).
I say this from experience: if you ask an LLM to write something for you, it often does a terrible job with efficiency. However, if you ask it to analyze an existing code base and make it more efficient, it often does a great job. The dichotomy comes down to the nature of AI prompting: it works best if you only give it one thing to do at a time.
In theory, if AI code assist becomes more mature and formalized, the "optimize this" step will likely be built-in, rather than something the developer has to ask for after the fact.
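I'd expect that to look something like the two-pass sketch below. This is just a sketch of the idea; `Complete` and `generateThenOptimize` are hypothetical names, and `complete` stands in for whatever LLM API you actually call:

type Complete = (prompt: string) => Promise<string>;

// One job per prompt: first ask for working code, then ask for an
// optimization pass over that code, instead of asking for both at once.
async function generateThenOptimize(complete: Complete, task: string): Promise<string> {
  // Pass 1: correctness only.
  const draft = await complete(`Write code for this task:\n${task}`);

  // Pass 2: efficiency only, holding behavior fixed.
  return complete(
    `Rewrite the following code to reduce memory and CPU usage without changing its behavior:\n${draft}`
  );
}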
It's not just garbage software. So many programs are just electron apps, which is about the most inefficient way of making them. If we could start actually making programs again, instead of shipping a webpage and a browser bundled together, you'd see resource usage plummet.
In the gaming space, even before the RAM shortage, I'd seen more developers doing optimization work again thanks to the prevalence of the Steam Deck and such. The precedent is there, and I'm hopeful other developers start considering lower-end hardware too.
Probably a super unpopular take, but by sheer volume of consoles sold, the Switch and Switch 2 have done more for game optimization than the Steam Deck ever could. I agree the Steam Deck pushed things further, but the catalyst is the Switch/Switch 2.
I take it the Switch/S2 has many non-Nintendo games shared with other consoles? Hard to search through 4,000 titles on Wikipedia to find them at random, but I did see they had one Assassin's Creed (Odyssey) at the game's launch. I never really had Nintendo systems and just associate them with exclusive Nintendo games.
I'm choosing to believe the Steam Machine will do more of the same for PC games. Maybe it won't force optimization at launch, but I hope it maintains itself as a benchmark for builds and provides demand for optimization to a certain spec.
Web apps are a godsend and probably the most important innovation to help move people off of Windows.
I would prefer improvements to web apps and electron/webview2 if I had to pick.
If those web apps were all using the same shared electron backend, then they could be "a godsend". But each of those web apps uses its own electron backend.
No, everything will just become subscription based.
And powered by the cloud
🤣 Nah, they'll enforce mandatory cloud computing.
You'll just have a "terminal"
It’s crazy that people don’t see this is where computers are heading.
The day tech bros realized they could squeeze recurring monthly subscriptions out of you for increasingly banal shit, the writing was on the wall. The end game is that you have a Chromebook with 800 subscriptions to streaming services for your OS, music, movies, TV, games, image-editing software, music DAWs, plugins for both of the aforementioned, subscriptions for hardware associated with the software (e.g. drawing tablets or MIDI keyboards), etc., covering every niche you can possibly think of, not just graphic art and music.
And when you bitch about it, tech bros and weird alphas and young zoomers who were raised on this ecosystem and indoctrinated by it will go "well, you see, it's fair because updates cost money to develop," as if the old system, where bug fixes and security patches were expected to be free but feature updates weren't, was unfair. Like, if I buy a car and it's fucked up, I expect it to be fixed for free, but I don't expect them to feature-match the next model year.
Tech workers are disproportionately highly paid and so whiny when they have to provide even a modicum of support, because then they might have to cut into that disproportionately high pay. Like, "oh no, I make $80-150,000+ a year, but if I support this I'll have to work more without generating sales and will maybe only make $60-130,000+. The horror!" Fuck those libertarian shitstains that are literally overthrowing an entire government (and possibly more) with technofascism so they can justify their "I know Python, I should be able to earn as much as I want, fuck ethics, I never emotionally matured past 16" bullshit.
Not when AI is writing the code.
Maybe it'll write native apps instead of garbage web/electron/chrome apps
Narrator:
'It didn't'
There is no "shortage", just capitalism testing the limits of various bubbles.
You fool, humans are flexible enough to get used to slow experiences. Even if the average user needs Discord, Slack, 100 Chrome tabs, Word, and every other electron app open simultaneously, they'll just push through their work. They may not be happy about it, but they'll carry on without changing their habits.
But to be honest, I goddamn hope you are right!

It's a really nice idea, but bad developers are already so deep in the sunk cost fallacy that they'll likely just double down.
Nobody reassesses their dogma just because the justification for it is no longer valid. That's not how people work.
There's plenty of "unbloated" software available. It's just not on Windows.
Which unbloated browser do you use?
(This isn't a dig or a gotcha, I'm serious, I'm looking to switch browsers)
Shouldn't Firefox, or a Firefox fork like Waterfox or ZenBrowser, be fine?
Found the silver lining guy.
Love the optimism but yeah, the impact on software dev will be minimal, if there even is one.

Naaaah, you're just going to have to run it in the cloud, optimised by AI, for the low, low price of both your kidneys, so Bezos, Mark, and Elon can continue partying.
The "shortage" is temporary and artificial, so that's a hard NO. The ram shortage doesn't present any incentive to make apps more efficient because the hardware and software that is already in people's homes won't be effected by the shortage and people who currently use the software won't be affected by the shortage. The very small percentage of people that will be affected by the temporary shortage wouldn't justify making changes to software that is currently in development.
There's no incentive for software companies to make their code more efficient until people stop using their software, so stop using it and it will get better. As an example, Adobe Reader is crap, just straight-up garbage, but people still use it, so the app stopped getting improvements many years ago. Then Adobe moved to a subscription-based system and a cloud service for selling your data, but guess what: it's still the same app it was 10 years ago, just more expensive.
One of those little truisms folks forget is that optimising software takes a LOT longer than making something that just works.
The RAM shortage will end before any meaningful memory optimizations can be made.
Naw, it's easy:

#include <cstdlib>

// One weird trick: silently halve every allocation in the translation unit.
// (A macro instead of redefining malloc itself, so it doesn't recurse.)
#define malloc(size) std::malloc((size) / 2)
I opened Photoshop, and I left it open with no document open. Just the main window. It started at 11 GB of RAM and went up to 28 GB without me doing anything.
If there was anything that was as good as Photoshop, I'd have switched years ago. But I've tried the alternatives, and there's just nothing like it. Same for InDesign. Affinity Photo is really, really close, but it's just not the same.
I’ve been using Photoshop for over 30 years. Even when the time comes, making the switch will be very difficult.
edit: I just tried opening PS again and letting it sit. It's hovering around 3-3.5 GB of RAM usage. I think that last attempt was a fluke.
I’ve been using Photoshop for over 30 years. Even when the time comes
It's not coming. Not for you, anyway.
Tbf, software is bloated because higher-ups who don't use computers beyond Microsoft Excel tell programmers not to optimize.
If it gets the job done, don't spend any more hours on it. Perfection doesn't bring in any more revenue; less reward for the effort, they say. The incentives for optimization just aren't there.
No, they don't care about users or whether they're literally cooking RAM. They'll keep it bloated, and probably make it more bloated.
I'm currently running Fedora Linux with Firefox and YouTube opened up. The whole system uses ~4 GB of memory. That's totally fine, and I couldn't care less about what Microsoft is doing with their OS.
With that said, I don't think we'll see a lot of optimizations in commercial software. Maybe a few here and there, but a lot of developers nowadays don't even know how to optimize their code, especially people working in web development and adjacent frameworks. Let's just throw hundreds of npm packages into one project and bundle them up with webpack; here's your 12 MB of JavaScript, take it or leave it (see the sketch below this comment). Projects like this aren't the exception, they are the norm.
Even if the devices that can run that code without running out of memory get more expensive, companies will just pay for them and write them off on their taxes. And if not, more apps will just get pushed into the cloud.
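To make the npm point concrete, here's a minimal before/after sketch. lodash is just the classic example, and the relayout handler is made up; the pattern is what matters:

// BEFORE: a bare default import drags the whole library into the bundle,
// even though only one function is ever used.
// import _ from "lodash";
// const onResize = _.debounce(relayout, 200);

// AFTER: import only the module you need; the rest never ships.
import debounce from "lodash/debounce";

function relayout(): void {
  // ...recompute layout after a resize...
}

const onResize = debounce(relayout, 200);
window.addEventListener("resize", onResize);

Bundlers can also tree-shake named imports from ESM builds like lodash-es, which gets you the same result without per-module import paths.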
Nah, you'll get 8 GB and swap on NVMe. Or you'll get to rent a terminal-server slot for just $30 a month.
I was about to post something like that and go after electron apps and all that. Thanks, OP.
No, devs don't optimize unless there is a real incentive. Coding speed, ease, and portability (Electron, a whole browser plus Node.js packaged as an executable) win out.
I suspect companies behind needlessly memory-intensive software would rather push (harder) towards cloud services, or ignore the problem entirely - I'm sure they'll find a way to enshittify their products in a way that solves the problem for them, or see lower profits and learn absolutely nothing.
If the software in question is something people need for their job, those companies can absolutely just decide that it's not their problem and that you'll just have to face the shortage head-on.
I recall listening to half of a video from SumitoMedia, where his answer to your question is, quote, "do you hear how fucking stupid you sound?" (you can probably guess why I didn't watch the rest of it).
Wouldn't that be nice! Yeah I think it'll totally work.
Hey, I think I see someone right now, they're switching from writing in Python to writing in assembly! "Hey buddy, don't forget to clear that register! And don't forget you'll need to write this all over from scratch to get it to work on any other platform!"
It still costs more to rewrite all your existing code sooo no.
Where I'm eyeing resource usage is in the cloud right now. I run a few Discourse instances, which seem really inefficient to me: 1.5 GB of RAM for just a discussion board. I have to dedicate a server to each one, whereas my Rust web servers run at more like 30 MB. They're probably doing a lot less, but still.
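If you want to eyeball a footprint the same way, here's a minimal sketch using only Node built-ins; the port and interval are arbitrary choices:

import { createServer } from "node:http";

// A bare HTTP responder plus a periodic resident-set-size readout,
// to compare a process's actual footprint against something like Discourse.
const server = createServer((_req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("hello\n");
});

server.listen(8080, () => {
  setInterval(() => {
    const rssMib = process.memoryUsage().rss / (1024 * 1024);
    console.log(`RSS: ${rssMib.toFixed(1)} MiB`);
  }, 5000);
});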
…and grocery store prices will go back down, too.