NodeJS is worse. One dude just had to write a CLI-based JavaScript runtime, and holy hell, now entire backends run on the least performant runtime possible.
Yeah, and all because god forbid you give your (future) employees time to learn another language besides JavaScript. Nope, line must go up so programming must be further commodified.
You can bash the JavaScript language all you want, but don't come for its performance lol. Node.js was very fast across the board when it came out, and it still beats most scripting languages, and even some bigger runtimes, when it comes to IO.
Its performance as a backend server is abysmal compared to standard compiled languages.
It’s absolutely wasteful to use it.
The reality is that most backends don't use compiled languages, but stuff like PHP, Java and Python.
In that category, Node.js scores very high on performance, concurrency, and especially IO.
And calling it abysmal compared to compiled languages is not fair, but yes, there are much better alternatives.
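To make "good at IO" concrete, here's a minimal sketch of the event-loop model that gives Node its concurrency (the port and the 100 ms delay are made-up values, and it assumes a recent Node with the node: import prefix): a single thread can keep thousands of slow requests in flight because each wait is handed to the event loop instead of blocking.

```typescript
// Minimal sketch, not a benchmark: one thread, many concurrent IO waits.
import { createServer } from "node:http";
import { setTimeout as sleep } from "node:timers/promises";

const server = createServer(async (_req, res) => {
  // Pretend this is a 100 ms database or network call; while it's pending,
  // the event loop is free to accept and service other requests.
  await sleep(100);
  res.end("done\n");
});

// Port 3000 is arbitrary.
server.listen(3000, () => console.log("listening on :3000"));
```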
i use neovim btw
Linux wins again. Still runs on the same hardware as 10 years ago. :) No forced updates by any big corp.
Electron apps are on Linux too
In terms of performance, yeah. Though not every old device keeps working. You're still dependent on driver support for newer kernels. My old Thinkpad no longer functions properly because the Nvidia drivers are not compatible with newer kernels. I can either have an unsafe machine that runs fine or an up-to-date machine that can barely open a web browser.
Nvidia, yeah. The source of many, many Linux issues.
I struck lucky. Never had any issues with Nvidia on Linux in all my >2 decades using Linux.
Still prefer AMD though. Straight through.
I've had 1 faulty driver since 2011, so I just downgraded in 10 seconds and waited for the next patch
Damn, that's very lucky. Every device with Nvidia hardware that I've installed Linux on has at some point, during updates or whatever, gone to shit. However, I must say that it has become way better in recent years. My Thinkpad was the worst because it was my first Linux device, and it had an integrated Intel GPU and a dedicated Nvidia GPU, and getting it to work was a horror. In the end a friend of mine who was better at Linux just forced it to always use the Nvidia card, because then at least stuff worked reliably (tm).
But even then it pretty much always died during Ubuntu release updates. I've nuked my whole system once because the screen went black (due to GPU drivers presumably) during one and after an hour or so I forcefully turned off the laptop because I couldn't do anything anymore. After restarting into a tty my laptop was in some sort of limbo between 2 Ubuntu versions and I basically just had to reinstall.
Ever since I made Linux (Arch btw) my main OS for gaming at the start of this year, it has been quite stable though. I did switch to LTS kernels, and after that everything has been pretty chill.
Yeah, I got AMD graphics 3 years ago, and the same day all those weird issues with graphics artifacts or suspend bugs just went away.
If you didn't have any of those, you are indeed lucky. I had many of them through the years.
Spotify using several processes and gigabytes of memory just to play some music and browse a library is an abomination. WinAMP did most of that 20 years ago while using a fraction of the resources.
Discord similarly is an affront.
Spotify sucks at programming. When using the app offline, I can view and play songs and podcasts directly or from the queue, but the menu to add stuff to the queue doesn't load.
I hate Discord passionately. I miss the days of Mumble + IRC
IRC was great, but it failed to progress meaningfully.
If you have Spotify Premium, try a third-party client. Even GUI clients like spotify-qt are light on memory (though not at feature parity), while terminal clients like ncspot and spotify-player use a tenth of it. The latter even supports Spotify Connect.
Meanwhile my Linux system still boots in 1 GB, and Emacs is looking pretty good right now lol
... Now I've got the idea of doing Lemmy through Emacs.
There's probably a mode for that.
... ah, there's one.. https://codeberg.org/martianh/lem.el
Looking pretty good right now. ;D
Hey, I'm posting this via lem.el! I've been using it for a few months now.
Could still use some work, but it's far better than using a web browser.
Atom was kinda revolutionary in its plugin support and everything IIRC.
Well, now that Atom has been replaced by VSCode, which is also an Electron app, the original Atom devs, or at least some of them, are creating Zed. Zed's written in Rust and uses a lot less memory.
Of course it's not yet as mature, and they're trying to earn money by integrating AI and selling that as a service. BUT the AI is optional, and even if you do want to use it, you don't have to pay: there's a free tier, and you can literally run your own model in Ollama.
It's not perfect, but I love how little RAM it uses compared to VSCode and (shudders) the JetBrains suite (which I normally love, but whose RAM and CPU usage can slow my computer to a crawl).
That explains a lot. I have both PyCharm and RustRover open as I ~~steal~~ convert stuff from a project I found. Anywho, I was typing in Discord faster than it could render, and I thought that was strange.
I've had PyCharm max out 3 or 4 CPU cores out of the 6 I have :/ I do have several million lines of code indexed by it though
Still have the patch they sent to people who published packages. I made a theme no one but me used, but still! Pre-Microsoft GitHub was cool.
Got that patch still in its brown envelope somewhere in a drawer, for doing a syntax highlighting plugin.
They were indeed cool
If there's any upside to the entire situation, it's that perhaps, maybe, developers will again start paying more attention to optimization instead of just throwing more powerful hardware at it.
Some of the greatest games ever developed for consoles were great because the developers had to get extremely creative with the limited resources at their disposal. This led to some incredibly optimized games that could do a whole lot with those very limited resources.
> If there’s any upside to the entire situation, it’s that perhaps, maybe, developers will again start paying more attention to optimization instead of just throwing more powerful hardware at it.
Amen.
It has long irked me that so many developers fail at the mathematics of the situation.
If hardware multiplies its resources 1000x, that does not mean you can make your program use 1000x the resources. With thousands of other developers failing at that math too, bloat radically outpaces Moore's Law.
If hardware multiplies its resources 1000x, developers should keep their software just as tight, lean, and fast, and users should end up with 1000x more resources available to do more with.
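To put toy numbers on that (purely illustrative, not measurements): the gain a user actually feels is the hardware growth divided by the bloat growth, so matching bloat to hardware cancels decades of progress.

```typescript
// Toy arithmetic, assumed numbers only: what the user actually feels is
// hardware growth divided by software bloat growth over the same period.
const hardwareGain = 1000; // hypothetical hardware resource growth
const bloatGrowth = 1000;  // hypothetical per-program footprint growth
console.log(hardwareGain / bloatGrowth); // 1 -> zero felt improvement
```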
*Dreamer*
I think pretty much every dev understands the issue but they are limited in what they can do about it. Quitting a job because they won't let you optimize is noble but unrealistic for the vast majority of devs.
I would love for optimizations to start being prioritized. More specifically, I would love to see vendors place limits on resource use in apps. For example, Steam could reject any game over 50 GB. I do not believe for a moment that any game we currently have needs more than 50 GB, except maybe an MMO with 20 years of content. Or Microsoft could reject apps that use more than X amount of RAM. They won't ever do that, but without an outright rejection, this won't be fixed.