top 16 comments

[-] doughless@lemmy.world 42 points 3 weeks ago

A comment on the YouTube video makes a good point that we already have a better word for the concept of dealing with multiple things at once: multitasking. Using a word that literally means "things happening at the same time" just adds to the confusion, since people already have a difficult time understanding the distinction between multitasking and concurrency.

[-] FizzyOrange@programming.dev 21 points 3 weeks ago

Yeah it always bothered me that they're saying "concurrency is not concurrency".

I'm going to start using "multitasking" instead. That's so much better. Who's with me?

[-] doughless@lemmy.world 7 points 3 weeks ago

I will typically use the terms asynchronous and parallel when discussing the concepts, but I hadn't thought about using multitasking until I saw that comment. I mean, even C# calls them "tasks".

[-] lysdexic@programming.dev 3 points 3 weeks ago

A comment on the YouTube video makes a good point that we already have a better word for the concept of dealing with multiple things at once: multitasking.

I don't think that's a good comment at all. In fact, it ignores fundamental traits that separate the two concepts. For example, multitasking is tied to single-threaded task switching, whereas concurrency has a much broader meaning: it covers multithreaded and multiprocess execution of many tasks that may or may not yield, or be assigned to different cores, processors, or even nodes.

In other words, concurrency goes well beyond "doing many things at once"; it also covers parallelism and asynchronous programming.
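
To make that distinction concrete, here's a rough Python sketch (the function names and workloads are made up purely for illustration): the same batch of tasks run with single-threaded task switching, with threads, and with separate processes. Only the first matches the narrow "multitasking" sense, yet all three are forms of concurrency.

    import asyncio
    from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor


    def work(n: int) -> int:
        # Stand-in for a unit of work; the workload is arbitrary.
        return sum(i * i for i in range(n))


    async def async_work(n: int) -> int:
        # Cooperative, single-threaded task switching: tasks interleave
        # on one thread, yielding control at await points.
        await asyncio.sleep(0)
        return work(n)


    async def run_switched(jobs: list[int]) -> list[int]:
        return await asyncio.gather(*(async_work(n) for n in jobs))


    if __name__ == "__main__":
        jobs = [100_000, 200_000, 300_000]

        # 1. Single-threaded task switching -- the narrow "multitasking" sense.
        asyncio.run(run_switched(jobs))

        # 2. The same tasks spread across threads.
        with ThreadPoolExecutor() as pool:
            list(pool.map(work, jobs))

        # 3. The same tasks spread across processes, which the OS may
        #    schedule onto different cores.
        with ProcessPoolExecutor() as pool:
            list(pool.map(work, jobs))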

[-] FrostyPolicy@suppo.fi -4 points 3 weeks ago* (last edited 3 weeks ago)

A CPU (core) can only do one thing at a time. When you have multiple cores, you can do multiple things at the same time. Multitasking, in the programming sense, is a bad term; it's more a term for the masses.

A bit simplified (see the sketch below):

  • concurrency: you seem to be doing multiple things at the same time. In reality they are run little by little, one after another. It doesn't really speed things up.
  • parallelism: you actually run multiple things at the same time (multiple CPUs/CPU cores required). If the code scales properly, or is designed to truly run in parallel, the speed-up is proportional to the number of CPUs available.
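
A rough sketch of those two bullets in Python (the workload sizes are arbitrary and the timing only shows the trend; this isn't a benchmark):

    import time
    from concurrent.futures import ProcessPoolExecutor


    def cpu_bound(n: int) -> int:
        # Purely CPU-bound work; interleaving it on one core can't make it faster.
        return sum(i * i for i in range(n))


    if __name__ == "__main__":
        jobs = [2_000_000] * 4

        # One core: whether the jobs run one after another or are interleaved
        # little by little, the same total work runs on that core, so no speed-up.
        t0 = time.perf_counter()
        for n in jobs:
            cpu_bound(n)
        print("one core:", time.perf_counter() - t0)

        # Parallelism: the jobs run on separate cores at the same time, so
        # wall-clock time drops roughly with the number of cores used.
        t0 = time.perf_counter()
        with ProcessPoolExecutor() as pool:
            list(pool.map(cpu_bound, jobs))
        print("parallel:", time.perf_counter() - t0)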

Edit: It's a much more complex subject than I've presented here.

[-] FizzyOrange@programming.dev 21 points 3 weeks ago

You missed the point. He understands all these things you tried to explain. The point is that your definition of the word "concurrency" is objectively wrong.

You:

you seem to be doing multiple things at the same time. In reality they are run little by little one after another

The actual meaning of the word "concurrency":

The property or an instance of being concurrent; something that happens at the same time as something else.

Wiktionary actually disagrees with your pedantic definition, even in computing!

(computer science, by extension) A property of systems where several processes execute at the same time.

I suspect that concurrency and parallelism were actually used interchangeably until multicore became common, and then someone noticed the distinction (which is usually irrelevant) and said "aha! I'm going to decide that the words have this precise meaning", and nerds love pedantic "ackshewally"s, so it became popular.

[-] nous@programming.dev 12 points 3 weeks ago

I suspect that concurrency was used back when there were only single-threaded CPUs, when process scheduling became a thing, to talk about the difference between running one process after another vs. interleaving the processes so they appear to be concurrent. Then, once true multithreaded programs became a thing, they needed a new word to describe things happening at the exact same time instead of only appearing to.

[-] FrostyPolicy@suppo.fi 2 points 3 weeks ago

Wikipedia puts it nicely:

"The concept of concurrent computing is frequently confused with the related but distinct concept of parallel computing,[3][4] although both can be described as "multiple processes executing during the same period of time". In parallel computing, execution occurs at the same physical instant: for example, on separate processors of a multi-processor machine, with the goal of speeding up computations—parallel computing is impossible on a (one-core) single processor, as only one computation can occur at any instant (during any single clock cycle).[a] By contrast, concurrent computing consists of process lifetimes overlapping, but execution does not happen at the same instant. "

[-] FizzyOrange@programming.dev 3 points 3 weeks ago

You're still missing the point. We all understand that definition. We're just saying that it is incorrect use of the word "concurrent". Does that make sense? The word "concurrent" means things happening at the same time. It's stupid for programmers to redefine it to mean things not happening at the same time.

[-] bear@lemmynsfw.com 1 points 3 weeks ago

execution = if concurrency then parallelism else multitasking

Simple. Easy. But it doesn't confuse my boss or make everyone angry.

concurrency = if concurrency then concurrency else concurrency 

Now this I can work with.

[-] BB_C@programming.dev 4 points 3 weeks ago

With hyper-threading and preemption in mind, maybe it's concurrency all the way down 😎 . But we should definitely keep this on the down low. Don't want the pesky masses getting a whiff of this.

[-] platypus_plumba@lemmy.world 3 points 3 weeks ago

Do we really need a video about this in 2024? Shouldn't this be already a core part of our education as software engineers?

[-] lysdexic@programming.dev 5 points 3 weeks ago

Do we really need a video about this in 2024? Shouldn’t this be already a core part of our education as software engineers?

I'm not sure what point you tried to make.

Even if you believe some concept should be a core part of the education of every single software engineer who ever lived, I have yet to meet a single engineer with encyclopedic knowledge of each and every topic covered as a core part of their education. In fact, every single engineer I ever met retained only a small subset of their whole curriculum.

So exactly what is your expectation?

[-] platypus_plumba@lemmy.world 1 points 3 weeks ago* (last edited 3 weeks ago)

My expectation is that this is something core that programmers should be aware of all the time. Forgetting about this is like forgetting what an interface is; it's at the core of what we do. At least I think so; maybe I'm wrong in assuming this is something every programmer should be aware of all the time.

[-] Mad_Punda@feddit.org 4 points 3 weeks ago

Well, the linked article has a date in 2013 at the top.

[-] NostraDavid@programming.dev 2 points 3 weeks ago* (last edited 3 weeks ago)

Shouldn't it? Yes, just like the ability to unit test, but that doesn't stop schools from skipping over them either.

this post was submitted on 13 Oct 2024