this post was submitted on 26 Mar 2025
546 points (97.1% liked)

Programmer Humor


Welcome to Programmer Humor!

This is a place where you can post jokes, memes, humor, etc. related to programming!

For sharing awful code there's also Programming Horror.

[–] lime@feddit.nu 146 points 1 month ago (4 children)

all programs are single threaded unless otherwise specified.

[–] firelizzard@programming.dev 49 points 1 month ago (3 children)

It’s safe to assume that any non-trivial program written in Go is multithreaded

[–] Scoopta@programming.dev 18 points 1 month ago (1 children)

But it's still not a guarantee

[–] kbotc@lemmy.world 16 points 1 month ago (1 children)

And yet: You’ll still be limited to two simultaneous calls to your REST API because the default HTTP client was built in the dumbest way possible.

[–] Opisek@lemmy.world 7 points 1 month ago (4 children)

I absolutely love how easy Go makes multithreading and communication between threads. Easily one of its biggest selling points.

[–] Successful_Try543@feddit.org 23 points 1 month ago (3 children)

Does Python have a way to specify loops that should be executed in parallel, as Matlab does with parfor instead of for?

[–] lime@feddit.nu 51 points 1 month ago (2 children)

python has way too many ways to do that: asyncio, concurrent.futures, threading, multiprocessing...
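
As a rough illustration of the parfor question above, the closest stdlib equivalent is mapping the loop body over a process pool. A minimal sketch (work is just a hypothetical stand-in for an expensive, independent loop body):

```python
# Minimal parfor-style parallel loop using the stdlib; `work` is a placeholder.
from multiprocessing import Pool

def work(i):
    return i * i

if __name__ == "__main__":
    with Pool() as pool:  # defaults to one worker process per CPU core
        results = pool.map(work, range(100))
    print(results[:5])
```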

[–] WolfLink@sh.itjust.works 42 points 1 month ago (1 children)

Of the ways you listed, the only one that will actually take advantage of a multi-core CPU is multiprocessing.

[–] lime@feddit.nu 11 points 1 month ago (1 children)

yup, that's true. most meaningful tasks are io-bound, so "parallel" basically means "whatever allows multiple threads of execution to keep going". if you're doing number crunching in python without a proper library like pandas that can parallelize your calculations, you're doing it wrong.
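
For the io-bound case, threads (or asyncio) do overlap the waiting even with the GIL, since the lock is released while blocked on the network. A rough stdlib-only sketch with placeholder URLs:

```python
# Rough sketch: io-bound fetches overlap in threads because the GIL is
# released during blocking socket reads. The URLs are placeholders.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

urls = ["https://example.com"] * 10  # placeholder URLs

def fetch(url):
    with urlopen(url) as resp:
        return len(resp.read())

with ThreadPoolExecutor(max_workers=10) as pool:
    sizes = list(pool.map(fetch, urls))
print(sizes)
```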

[–] WolfLink@sh.itjust.works 8 points 1 month ago* (last edited 1 month ago) (1 children)

I’ve used multiprocessing to squeeze more performance out of numpy and scipy. But yeah, resorting to multiprocessing is a sign that you should be dropping into something like Rust or a C variant.

[–] danhab99@programming.dev 9 points 1 month ago (1 children)

I've always hated object-oriented multithreading. Goroutines (green threads) are just the best way 90% of the time. If I need to control where threads go I'll write it in Rust.

[–] lime@feddit.nu 7 points 1 month ago (4 children)

nothing about any of those libraries dictates an OO approach.

[–] Midnitte@beehaw.org 9 points 1 month ago (1 children)
[–] enemenemu@lemm.ee 8 points 1 month ago (2 children)

Are you still using matlab? Why? Seriously

[–] Successful_Try543@feddit.org 18 points 1 month ago (1 children)

No, I'm not at university anymore.

[–] enemenemu@lemm.ee 5 points 1 month ago (1 children)
[–] Successful_Try543@feddit.org 5 points 1 month ago* (last edited 1 month ago) (1 children)

We weren't doing any resource-intensive computations with Matlab; we mainly used it for teaching FEM, since we had an extensive collection of scripts for that purpose, plus pre- and some post-processing.

[–] Panties@lemmy.ca 7 points 1 month ago (1 children)

I was telling a colleague about how my department started using Rust for some parts of our projects lately. (normally Python was good enough for almost everything but we wanted to try it out)

They asked me why we're not using MATLAB. They were not joking. So, I can at least tell you their reasoning. It was their first programming language in university, it's safer and faster than Python, and it's quite challenging to use.

[–] twice_hatch@midwest.social 4 points 1 month ago

"Just use MATLAB" - Someone with a kind heart who has never deployed anything to anything

[–] AndrasKrigare@beehaw.org 14 points 1 month ago (8 children)

I think OP is making a joke about Python's GIL, which means that even if you are explicitly multithreading, only one thread executes Python bytecode at a time, which can defeat the point in some circumstances.
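
A quick way to see the effect described above: on a stock (GIL) CPython build, a CPU-bound function run across several threads takes roughly as long as running it sequentially. A minimal sketch:

```python
# Minimal sketch: on a standard (GIL) CPython build, the threaded version
# takes roughly as long as the sequential one for CPU-bound work.
import time
from threading import Thread

def spin(n=10_000_000):
    while n:
        n -= 1

start = time.perf_counter()
for _ in range(4):
    spin()
print("sequential:", time.perf_counter() - start)

start = time.perf_counter()
threads = [Thread(target=spin) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("threaded:  ", time.perf_counter() - start)
```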

[–] groknull@programming.dev 5 points 4 weeks ago

I initially read this as “all programmers are single-threaded” and thought to myself, “yeah, that tracks”

[–] nickwitha_k@lemmy.sdf.org 26 points 1 month ago (1 children)
[–] lena@gregtech.eu 4 points 1 month ago (2 children)

Oooooh this is really cool, thanks for sharing. How could I install it on Linux (Ubuntu)? I assume I would have to compile CPython. Also, would the source of the programs I run need any modifications?

[–] computergeek125@lemmy.world 5 points 1 month ago (1 children)

From memory I can only answer one of those: the way I understand it (and I could be wrong), your programs theoretically should only need modifications if they have a concurrency-related bug. The global interpreter lock is designed to take a sledgehammer approach to "fixing" concurrency data races. If you have a bug that the GIL fixed, you'll need to solve that data race using a different control structure once free threading is enabled.

I know it's kind of a vague answer, but every program that supports true concurrency will do it slightly differently. Your average script with just a few libraries may not benefit, unless a library itself uses threads. Some libraries that use native compiled components may already be able to utilize the full power of your computer even on standard Python builds, because threads spawned directly in the native code are less beholden to the GIL (depending on how often they need to communicate with interpreted Python code).
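
A hypothetical sketch of the kind of concurrency bug described above: an unsynchronized read-modify-write on shared state. Even under the GIL, counter += 1 is not atomic; free threading just makes lost updates far more likely, and the fix is the same explicit lock either way:

```python
# Sketch: += on shared state is a read-modify-write and can lose updates.
# An explicit Lock fixes the race on both GIL and free-threaded builds.
from threading import Thread, Lock

counter = 0
lock = Lock()

def increment(n=100_000):
    global counter
    for _ in range(n):
        with lock:  # drop the lock and the final count may come up short
            counter += 1

threads = [Thread(target=increment) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 with the lock in place
```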

[–] nickwitha_k@lemmy.sdf.org 5 points 4 weeks ago

In this case, it's a feature of the language that enables developers to implement greater amounts of parallelism. So, the developers of the Python-based application will need to refactor to take advantage of it.

[–] SaharaMaleikuhm@feddit.org 24 points 1 month ago (5 children)

Oh wow, a programming language that is not supposed to be used for every single piece of software in the world. Unlike JavaScript, for example, which should absolutely be used for making everything (horrible). Node.js was a mistake.

[–] twice_hatch@midwest.social 17 points 1 month ago (2 children)

don't worry it'll use all the RAM anyway

[–] SatouKazuma@programming.dev 9 points 1 month ago (1 children)

I paid for all the memory. I'll use all the memory.

[–] goodbible@lemm.ee 2 points 4 weeks ago

JG Memoryworth

[–] lena@gregtech.eu 8 points 1 month ago

No RAM gets wasted!

[–] kSPvhmTOlwvMd7Y7E@programming.dev 16 points 1 month ago (1 children)

let's be honest here, he actually means 0.01 core performance

[–] dan@upvote.au 13 points 1 month ago (4 children)

Do you mean Synapse the Matrix server? In my experience, Conduit is much more efficient.

[–] jimmy90@lemmy.world 5 points 1 month ago

i wish they would switch the reference implementation to conduit

there are core components on the client side in rust so maybe that's the way for the future

[–] lena@gregtech.eu 4 points 1 month ago

Yep, I mean as in Matrix. There is currently no way to migrate to conduit/conduwuit. Btw, from what I've seen, conduwuit is more full-featured.

[–] driving_crooner@lemmy.eco.br 12 points 1 month ago

I thought this was about Excel and was like, yeah haha!

But it's about Python, so I'm officially offended.

[–] tetris11@lemmy.ml 11 points 1 month ago* (last edited 1 month ago)

I prefer this default. I'm sick of having to rein in Numba cores or OpenBLAS threads or other out-of-control software that immediately tries to bottleneck my stack.

cgroups (Docker/LXC) are the obvious solution, but they shouldn't have to be.
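
For reference, the usual way to rein those in without cgroups is the libraries' own environment knobs, set before anything imports them. A sketch (the limit of 1 is only an example, and it assumes numpy is installed):

```python
# Sketch: cap native thread pools before numpy/numba get imported.
# OPENBLAS_NUM_THREADS, OMP_NUM_THREADS and NUMBA_NUM_THREADS are honored
# by OpenBLAS, OpenMP runtimes and Numba respectively.
import os
os.environ.setdefault("OPENBLAS_NUM_THREADS", "1")
os.environ.setdefault("OMP_NUM_THREADS", "1")
os.environ.setdefault("NUMBA_NUM_THREADS", "1")

import numpy as np  # imported after the limits so its BLAS picks them up

print(np.ones((500, 500)).dot(np.ones((500, 500))).sum())
```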

[–] h4x0r@lemmy.dbzer0.com 10 points 1 month ago
[–] TropicalDingdong@lemmy.world 9 points 1 month ago

Python

..so.. so you made it single threaded?

[–] Gonzako@lemmy.world 7 points 1 month ago

I'll be honest, this only matters when running single services that are very expensive. It's fine if your program can't be parallelized, as long as the OS does its job and spreads the love around the CPUs.

[–] alcasa@lemmy.sdf.org 5 points 1 month ago

It only took us how many years?
