It can't do the one thing an operating system is literally supposed to do: manage applications and their windows in a sensible way.
I want to know what application is running.
Sure, it's in the Dock!
I want to find a specific application window.
Go fuck yourself right to hell.
Wait, the taskbar doesn't show the running windows, like it does on every other OS? It's at least discreet, right?
It discreetly takes up 1.5cm of the bottom of the screen at all times. It's so discreet it doesn't even need to use the corners.
Uh, alright, well that's all the system space you need right?
Yeah, of course, just that bottom inch or so... and a top-of-screen, system-level menu bar to display what Windows does in the bottom corners.
/sigh/ ok, fine, I just want to be able to full screen a window and still see what else is open.
Burn in hell and die.
I want to be able to easily switch left and right between open windows.
Go full screen or I will shoot you.
I want to move an open window into the other monitor.
You can't because you're full screen, dumbass.
I want to let a window present a popup like it normally would.
You can't because you're full screen, dumbass. Why would you be full screen?
I want an application like Slack to be able to pop up and remove notifications when appropriate.
Choose to have every single notification persist on screen until you manually remove it, or miss all your notifications.
Can't we trouble you for something in between, where we trust an application and let it manage them in a way that makes sense based on its context?
You can trouble me for something in between these cheeks, shit stain.
Like honestly, I fucking hate what an advertising- and AI-filled mess Windows is, but it can actually manage your windows and virtual desktops in a way that makes a modicum of sense.
It feels like a single Apple product manager decided that the way they use their computer (a single application at a time, no windows to manage) is the only way anyone does, so who cares if the full-screen paradigm they implemented is nonsensical, as long as it makes one tiny niche edge case slightly simpler.

Ignoring your speculation about the source of my expectations: the expectation that when an application is not doing something I asked it to do (e.g. making draw calls to my window manager, processing data, polling for updates or notifications) it doesn't run at all is not unreasonable.
To continue running when I've told it I'm done and have no further work for it is a violation of my intent in interacting with the machine I own. On Windows that violation is up to the app developer, and most apps that implement such keep-alive behavior have a settings option to disable it. On macOS, Apple has decided what I want on my behalf, and, at least in my case, decided wrong. On Linux I have extremely fine-grained control over whatever the heck my computer is doing, and it works how I like it. Linux is a good OS.
But that's the thing: you haven't instructed it to quit, you've only instructed it to close a window. That's what that button does, and its function shouldn't change based on whether it's the last window in the application. There are plenty of uses for running a program headless without it tying up resources keeping a window drawn (though that's certainly less of an issue now than it used to be).
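Worth noting that on macOS this is an app-level choice, not an OS mandate: AppKit asks the app delegate whether closing the last window should also quit. A minimal sketch in Swift (the delegate class here is my own illustration; the callback itself is standard AppKit):

    import AppKit

    final class AppDelegate: NSObject, NSApplicationDelegate {
        // AppKit calls this when the user closes the last open window.
        // Return true for Windows-style "last window closed == quit";
        // return false to keep the classic Mac behavior of staying alive.
        func applicationShouldTerminateAfterLastWindowClosed(_ sender: NSApplication) -> Bool {
            return true
        }
    }

Some single-window utilities opt in and quit when their window closes; document-based apps typically return false.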
Dunno, when it comes to basic UI I like granular control rather than functions that change based on context. If I want to quit a program, I quit it. If I want to close a window, I hit the UI element that does that, and only that.
But this split goes back to the late 80s: Microsoft was late to the multi-window paradigm, and their first implementation was pretty much one program, one window. If a program needed multiple windows or panes, they were all drawn inside a parent window, and closing that parent window closed the program. They caught up, I think, with Windows 3.1 (and not fully until Win 95, though my memory is fuzzy; it's been 31 years!), but they kept the program-window coupling because their users were used to it, and it stuck. Linux desktop environments were built toward the Windows paradigm so as not to confuse the largest source of new users, so that's stuck, too.
It’s quite easy to explicitly tell an application to stop running: Quit (Command-Q). The Mac has worked this way since 1984. If you have unsaved documents you’ll be prompted to save them (though most modern apps have used the OS’s built-in autosave support for years now), and then all windows will be closed before the app quits.
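You can even do it from code without touching the app’s UI; AppKit exposes the list of running apps and a polite quit request. A quick sketch (the Preview bundle ID is just an example target):

    import AppKit

    // Find every running instance of Preview and ask each to quit.
    // terminate() asks the app to quit normally, much like Cmd-Q, so
    // it still gets a chance to prompt about unsaved documents.
    let targets = NSRunningApplication.runningApplications(
        withBundleIdentifier: "com.apple.Preview"
    )
    for app in targets {
        app.terminate()
    }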
Closing the last open window of an application is not an instruction to close the application, it’s an instruction of the form “I am done working with this document now.” No more, no less.
This dates to a time when a computer could reasonably be expected to work on a single document that consumed all available memory, such that the user had to close the current document before opening a new one. Furthermore, in those days the application might reside on a different floppy disk from the document. Forcing the application to close along with its last document would force the user to swap floppies to restart the application, and then swap floppies again to open another document.
I digress. The floppy swapping issue is clearly no longer relevant but the metaphor remains: the Mac was conceived as a virtual desktop where users would work on their documents using applications (tools). If I’m cutting a piece of paper with a pair of scissors and then I put away the piece of paper, I don’t expect the scissors to put themselves away at the same time. I took out the scissors deliberately and I will put them away when I decide I’m finished with them.