this post was submitted on 12 Mar 2026

Programmer Humor

[–] boonhet@sopuli.xyz 5 points 1 day ago (2 children)

I mean, unused RAM is still wasted: you'd want all the things cached in RAM already so they're ready to go.

[–] Buddahriffic@lemmy.world 2 points 1 day ago (1 children)

I don't want my PC wasting resources trying to guess every possible next action I might take. Even I don't know for sure what games I'll play tonight.

[–] boonhet@sopuli.xyz 4 points 1 day ago (1 children)

Well, you'd want your OS to cache the start menu in the scenario you highlighted above. The game could also run better if it could cache assets not currently in use instead of waiting until the last moment to load them. Etc.
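To illustrate the asset point, here's a toy sketch in Python (not any real engine's or OS's API; all names are made up): warm a cache on a background thread so the slow disk read has already happened by the time the asset is needed.

```python
import threading

# Hypothetical in-memory asset cache; a real engine would manage this for you.
assets: dict[str, bytes] = {}

def load_from_disk(name: str) -> bytes:
    # Stand-in for a slow disk read.
    return f"contents of {name}".encode()

def prefetch(names: list[str]) -> threading.Thread:
    """Warm the cache in the background; returns the thread so callers can wait."""
    def worker() -> None:
        for n in names:
            assets.setdefault(n, load_from_disk(n))  # skip anything already cached
    t = threading.Thread(target=worker, daemon=True)
    t.start()
    return t

t = prefetch(["level2/tree.png", "level2/rock.png"])
t.join()  # a real game would keep rendering; joining here is just for the demo
print(sorted(assets))
```

The point is that the memory "used" by `assets` isn't waste: it's trading otherwise-idle RAM for not stuttering when level 2 loads.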

[–] Buddahriffic@lemmy.world 1 point 1 day ago (1 children)

Yeah, for things that will likely be used, caching is good. I just have a problem with the "memory is free, so find more stuff to cache to fill it" or "we have gigabytes of RAM, so it doesn't matter how memory-efficient any program I write is" attitudes.

[–] boonhet@sopuli.xyz 1 point 1 day ago

“memory is free, so find more stuff to cache to fill it”

As long as it's being used responsibly and freed when necessary, I don't have a problem with this.
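A minimal sketch of what "responsibly" can mean in practice: a cache with a hard size bound, so old entries get evicted instead of the cache growing forever. Python's standard-library `functools.lru_cache` does exactly this (the `load_asset` function and its cap of 2 are just for illustration):

```python
from functools import lru_cache

@lru_cache(maxsize=2)  # hard bound: the cache never holds more than 2 results
def load_asset(name: str) -> str:
    # Hypothetical expensive operation; imagine a disk read or a network fetch.
    return f"data for {name}"

load_asset("a")           # miss: computed and cached
load_asset("b")           # miss: computed and cached
load_asset("a")           # hit: served straight from the cache
load_asset("c")           # miss: bound hit, least-recently-used entry ("b") evicted
info = load_asset.cache_info()
print(info.hits, info.misses, info.currsize)
```

Bounded like this, the cache converts spare RAM into speed without ever hoarding more than its cap.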

“we have gigabytes of RAM so it doesn’t matter how memory-efficient any program I write is”

On anything running on the end user's hardware, this I DO have a problem with.

I have no problem with a simple backend REST API built on Spring Boot requiring a damn gigabyte just to provide a /status endpoint or whatever, because it usually runs on one or a few machines controlled by the company developing it.

When a simple desktop application uses over a gigabyte because of the shitty UI framework it was built on, I start having a problem with it, because that's a gigabyte used by every single end user. And end users are more numerous than servers, AND they expect their devices to do multiple things rather than run just one application.

[–] echodot@feddit.uk 1 point 1 day ago (1 children)

I mean, I have access to a computer with a terabyte of RAM. I'm gonna go ahead and say that most applications aren't going to need that much, and if they use that much, I'm gonna be cross.

[–] boonhet@sopuli.xyz 2 points 1 day ago (1 children)

Wellll

If you have a terabyte of RAM sitting around doing literally nothing, it's kinda being wasted. If you're actually using it for an application that can make good use of it, which I'm assuming is some heavy-duty scientific computation or running full-size AI models or something, then it's no longer being wasted.

And yes, if your calculator uses the entire terabyte, that's also memory being wasted, obviously.

[–] echodot@feddit.uk 1 point 17 hours ago

That's a different definition of wasted, though. The RAM isn't lost just because it isn't currently being utilised. It's sitting there waiting for me to open an intensive task.

What I am objecting to is programs using more RAM than they need simply because it's currently available. AKA Chromium.