this post was submitted on 26 Dec 2025
372 points (98.2% liked)

Programming

[–] Ephera@lemmy.ml 68 points 1 week ago (15 children)

Yeah, gonna be interesting. Software companies working on consumer software often don't need to care, because:

  • They don't need to buy the RAM that they're filling up.
  • They're not the only culprit on your PC.
  • Consumers don't understand how RAM works nearly as well as they understand fuel.
  • And even when consumers understand that an application is using too much, they may not be able to switch to an alternative anyway; see, for example, the many chat applications written in Electron, none of which are interoperable.

I can see somewhat of a shift happening for software that companies develop for themselves, though. At $DAYJOB, we have an application written in Rust and you can practically see the dollar signs lighting up in the eyes of management when you tell them "just get the cheapest device to run it on" and "it's hardly going to incur cloud hosting costs".
Obviously this alone rarely leads to management deciding to rewrite an application/service in a more efficient language, but it certainly makes them more open to devs wanting to use these languages. And who knows what happens if the prices for Raspberry Pis, cloud hosting, and the like end up skyrocketing similarly.

[–] squaresinger@lemmy.world 29 points 1 week ago (4 children)

Add to the list: doing native development most often means doing it twice. Native apps are better in pretty much every metric, but rarely are they so much better that management decides it's worth doing the same work multiple times.

If you go native, you usually need a web version, an Android version, and an iOS version, and if you're lucky you can develop Windows/Linux/Mac only once and just account for the variation between them.

Do the same in Electron and a single reactive web version works for everything. It's hard to justify multiple app development teams when a single one suffices.

[–] nullpotential@lemmy.dbzer0.com 12 points 1 week ago (1 children)
[–] squaresinger@lemmy.world 3 points 1 week ago

It works well enough for everything a manager cares about.

[–] boonhet@sopuli.xyz 8 points 1 week ago* (last edited 1 week ago) (2 children)

At this rate I suspect the best solution is to cram everything but the UI into a cross-platform library (written in, say, Rust) and keep the UI code platform-specific, calling into the cross-platform library over FFI. If you're big enough to do that, at least.
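A minimal sketch of that split, with a hypothetical `core_sum_readings` function (the name and signature are illustrative, not from any real project): the core logic is compiled once as a C-ABI library, and each platform UI calls into it over FFI.

```rust
/// Shared core: business logic compiled once (as a cdylib/staticlib) and
/// exposed over the C ABI so Swift, Kotlin, or C# front-ends can all link
/// against the same library instead of reimplementing the logic per platform.
#[no_mangle]
pub extern "C" fn core_sum_readings(ptr: *const f64, len: usize) -> f64 {
    if ptr.is_null() {
        return 0.0; // defensive: a null pointer from the UI layer yields 0
    }
    // SAFETY: the caller (the platform UI) guarantees `ptr` points to `len` f64s.
    let readings = unsafe { std::slice::from_raw_parts(ptr, len) };
    readings.iter().sum()
}
```

The UI layers then stay thin: SwiftUI on iOS, Jetpack Compose on Android, and so on, each just marshalling data across its FFI bridge into the shared core.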

[–] footfaults@lemmygrad.ml 8 points 1 week ago

but the UI into a cross-platform library (written in, say, Rust)

Many have tried, none have succeeded. You can go allllll the way back to Java's Swing, as well as Qt. This isn't something that "just do it in Rust" is going to fix.

[–] SorteKanin@feddit.dk 2 points 1 week ago (1 children)

Or just use rust for everything with Dioxus. At least, that's what Dioxus is going for.

[–] boonhet@sopuli.xyz 3 points 1 week ago (1 children)

Are we gui yet?

I haven't really kept up with Rust UI frameworks (or Rust at all lately; nearly nobody wants to pay me to write Rust, they keep paying me to write everything else). Iced was the best-known framework last time I tried any UI work, with Tauri as a promising alternative (which unfortunately still requires web tech). That was just me playing around on the desktop.

Is Dioxus easy to get started with? I have like no mobile UI experience, and pretty much no UI experience in general. I prefer doing backend development. Buuuut there's an app I want to build and I really want it to be available for both iOS and Android... And while iOS development doesn't seem too horrible to me, Android has always been weird to me.


iOS

At my last job we had a stretch where we were maintaining four different iOS versions of our software: different versions for iPhones and iPads, and for each of those one version in Objective-C and one in Swift. If anyone thinks "wow, that was totally unnecessary", that should have been the name of my company.

[–] who@feddit.org 3 points 1 week ago* (last edited 1 week ago) (2 children)

This equation might change a bit as more software users learn how bloated apps affect their hardware upgrade frequency & costs over time. The RAM drought brings new incentive to teach and act on that knowledge.

Management might be a bit easier to convince when made to realize that efficiency translates to more customers, while bloat translates to fewer. In some cases, developing a native app might even mean gaining traction in a new market.

[–] squaresinger@lemmy.world 6 points 1 week ago

I'm not too optimistic on that one. Bloated software has been an issue for the last 20 or so years at least.

At the same time, upgrade cycles have become much slower. In the 90s you'd upgrade your PC every two years, and each upgrade would enable entire use cases that just weren't possible before. It was a similar story with smartphones until the mid-2010s.

Nowadays people use their PCs for upwards of 10 years and their smartphones until they drop them and crack the screen.

Devices have so much performance nowadays that you can really just run some Electron apps and not worry about it. It might lag a little at times, but nobody buys a new device just because their local supermarket's loyalty app is laggy.

I don't like Electron either, but tbh, most apps running on Electron are so lightweight that it doesn't matter much that they waste 10x the performance. If your device can handle a browser with 100 tabs, there's no issue running an Electron app either.

Lastly, most Electron/webview apps aren't really a matter of choice. If your company uses Teams, you will use Teams, no matter how badly it runs on your device. If you need your public transport operator's app, you will use it, whether it's Electron or not. Same with your bank, your mobile phone carrier, or any other service.

[–] who@feddit.org 18 points 1 week ago* (last edited 1 week ago) (1 children)

many chat applications written in Electron, none of which are interoperable.

This is one of my pet peeves, and a noteworthy example because chat applications tend to be left running all day long in order to notify of new messages, reducing a system's available RAM at all times. Bloated ones end up pushing users into upgrading their hardware sooner than should be needed, which is expensive, wasteful, and harmful to the environment.

Open chat services that support third party clients have an advantage here, since someone can develop a lightweight one, or even a featherweight message notifier (so that no full-featured client has to run all day long).

[–] olafurp@lemmy.world 4 points 1 week ago (4 children)

As a programmer myself I don't care about RAM usage, just startup time. If it takes 10s to load 150MB into memory it's a good case for putting in the work to reduce the RAM bloat.

[–] HugeNerd@lemmy.ca 4 points 1 week ago (1 children)

As a programmer myself I don’t care about RAM usage,

But you repeat yourself.

[–] cassandrafatigue@lemmy.dbzer0.com 65 points 1 week ago* (last edited 1 week ago) (3 children)

It's not running out. It's being hoarded for the entropy machine.

Edit: anyone know if entropy machine ram can be salvaged for human use? If they use the same sticks?

[–] mholiv@lemmy.world 7 points 1 week ago (3 children)

Yes, but you'll need special hardware. Enterprise systems use registered "RDIMM" modules that won't work in consumer systems. Even if your system supports ECC, that is just UDIMM, i.e. consumer grade with error correction.

This all being said, I would bet you could find some cheap Epyc or Xeon chips plus an appropriate board if/when the crash comes.

[–] TehPers@beehaw.org 3 points 1 week ago (1 children)

Server memory is probably reusable, though it's likely to be soldered and/or ECC modules. But a soldering iron and someone sufficiently smart can probably handle it (if it isn't directly usable).

[–] cassandrafatigue@lemmy.dbzer0.com 3 points 1 week ago (1 children)

So it's salvageable if they don't burn it out running everything at 500 °C

[–] TehPers@beehaw.org 3 points 1 week ago (1 children)

500°C would be way above the safe operating temps, but most likely yes.

[–] cassandrafatigue@lemmy.dbzer0.com 2 points 1 week ago (2 children)

You think the slop cultists care?

[–] Carnelian@lemmy.world 51 points 1 week ago

640k ought to be enough for anybody

[–] whelk@retrolemmy.com 26 points 1 week ago

TUI enthusiasts: "I've trained for this day."

P.S. Yes, I know a TUI program can still be bloated.

[–] favoredponcho@lemmy.zip 24 points 1 week ago

Just glad I invested in 64 GB when it only cost $200. The same RAM today is nearly $700.

[–] HugeNerd@lemmy.ca 17 points 1 week ago

But how can I get anything done with these meager 128 GB computers?

[–] footfaults@lemmygrad.ml 14 points 1 week ago* (last edited 1 week ago)

The tradeoff was always to use higher-level languages to increase development velocity, and then pay for it with larger and faster machines. Moore's law made it so that the software engineer's time and labor was the expensive thing.

Moore's law has been dying for a decade, if not more, and as a result I am definitely seeing people focus more on languages that are closer to the hardware. My concern is that management will, as always, not accept the tradeoffs that performance-oriented languages sometimes require and will still expect incredible levels of productivity from developers, especially with all of the nonsense around using LLMs to "increase code writing speed".

I still use Python very heavily, but I have been investigating Zig on the side (Rust didn't really scratch my itch) and I love the speed and performance, but you are absolutely taking a tradeoff when it comes to productivity. Things take longer to develop, but once you finish developing them the performance is incredible.

I just don't think the industry is prepared to take the productivity hit, and they're fooling themselves thinking there isn't a tradeoff.

[–] magic_lobster_party@fedia.io 14 points 1 week ago

Hah, wishful thinking

[–] who@feddit.org 9 points 1 week ago
[–] PoY@lemmygrad.ml 8 points 1 week ago (3 children)

That was always an option... but people don't know how to program for efficiency, because "RAM is cheap" was ALWAYS the answer to everything.

[–] crabsoft@gamerstavern.online 6 points 1 week ago

@PoY @who

It really can't be stated strongly enough, the damage this advice has done. When I first started seeing it, I rolled my eyes, thinking, "This is obviously a mistake; even new students won't ignore reality." 20 years later, the joke's on me: all the software is slow and bad.

[–] Feyd@programming.dev 3 points 1 week ago

It's not even that they didn't know how to program for efficiency; it's that they chose the most inefficient tools possible in order to share code between the web browser and the app.

[–] eemon@programming.dev 3 points 1 week ago

I think you nailed it there. Now that RAM is no longer cheap, devs had better start learning.

[–] Jankatarch@lemmy.world 6 points 1 week ago (1 children)

Software engineers and game designers should be allowed 4 GB of RAM.

[–] Xylight 4 points 1 week ago (1 children)

it might be time for me to learn GPUI, i wonder if it's any good.

[–] kerrigan778@lemmy.blahaj.zone 2 points 1 week ago

I'm not sure what there's less excuse for, the software bloat or the memory running out.
