There's a very long history of extremely effective labor-saving tools in software.
Writing in C rather than Assembly, especially for more than one platform.
Standard libraries. Unix itself. More recently, developing games in Unity or Unreal instead of rolling your own engine.
And what happened when any of these tools came on the scene was a mad gold rush to develop products that weren't feasible before. Not layoffs, not "we don't need to hire junior developers any more".
Rank-and-file vibe coders seem to perceive Claude Code (for some reason, mostly just Claude Code) as something akin to the advantage of using C rather than Assembly. They are legit excited to code new things they couldn't code before.
Boiling the rivers to give them an occasional morale boost with "You are absolutely right!" is completely fucked up and I dread the day I'll have to deal with AI-contaminated codebases, but apart from that, they have something positive going for them, at least in this brief moment. They seem to be sincerely enthusiastic. I almost don't want to shit on their parade.
The AI enthusiast bigwigs, on the other hand, are firing people, closing projects, talking about not hiring juniors any more, and have gotten the media to report on it as AI layoffs. They just gleefully go on about how being 30% more productive means they can fire a bunch of people.
The standard answer is that they hate having employees. But they always hated having employees. And there were always labor saving technologies.
So I have a thesis here, or a synthesis perhaps.
The bigwigs who tout AI (while acknowledging that it needs humans for now) don't see AI as ultimately useful, in the way the C compiler was useful. Even if it's useful in some context, they still don't see it that way. They don't believe it can be useful. They see it as more powerfully useless. Each new version is meant to be a bit more like AM or (the clearly AM-inspired, but more familiar) GLaDOS, something that will get rid of all the employees once and for all.
Well, is A* useful? But that's not a fair example, and I can actually tell a story that is more specific to your setup. So, let's go back to the 60s and the birth of UNIX.
You're right that we don't want assembly. We want the one true high-level language to end all discussions and let us get back to work: Fortran (1956). It was arguably IBM's best offering at the time; who wants to write COBOL or order the special keyboard for APL? So the folks who would write UNIX plotted to implement Fortran. But no, that was just too hard, because the Fortran compiler needed to be written in assembly too. So instead they ported Tmg (WP, Esolangs) (1963), a compiler-compiler that could implement languages from an abstract specification. However, when they tried to write Fortran in Tmg for UNIX, they ran out of memory! They tried implementing another language, BCPL (1967), but it was also too big. So they simplified BCPL to B (1969) which evolved to C by 1973 or so. C is a hack because Fortran was too big and Tmg was too elegant.
I suppose that I have two points. First, there is precisely one tech leader who knows this story intimately, Eric Schmidt, because he was one of the original authors of lex in 1975, although he's quite the bastard and shouldn't be trusted or relied upon. Second, ChatGPT should be considered a popular hack rather than a quality product, by analogy to C and Fortran.

Very interesting! I didn't realize there was this historical division between Fortran and C. I thought C was just "better" because it came later.
Oh, not at all. It would be very rude of me to describe C as a pathogen transmitted through the vector of Unix, so I won't, even if it's mostly accurate to say so.
Many high-level systems programming languages predate C, like the aforementioned Fortran, Pascal, PL/I, and the ALGOL family. The main advantage C had over them in the early 1970s was its relatively lightweight implementation. The older, bigger languages were generally considered superior to C for actual practical use on systems that could implement them, i.e. not a tiny cute little PDP-7.
Since then C has grown some more features and a horrible standard filled to the brim with lawyerly weasel words that let compilers optimize code in strange and terrifying ways, allowing it to exist as something of a lingua franca of systems programming. But at the time of its birth, C wouldn't have been seen as anything particularly revolutionary.
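To make the "strange and terrifying" part concrete, here's a minimal sketch of the kind of optimization those weasel words permit (the function name and exact output are illustrative; behavior depends on the compiler and flags, assuming something like gcc or clang at -O2):

    #include <limits.h>
    #include <stdio.h>

    /* Signed integer overflow is undefined behavior, so a compiler is
       allowed to assume x + 1 > x always holds and compile this whole
       function down to "return 1". */
    static int check_no_overflow(int x) {
        return x + 1 > x;
    }

    int main(void) {
        /* At -O2 many compilers print 1 here, even though INT_MAX + 1
           would wrap to a negative value on the actual hardware. */
        printf("%d\n", check_no_overflow(INT_MAX));
        return 0;
    }

With optimizations off, or with a flag like gcc's -fwrapv, the same program typically prints 0 instead; that gap between what the hardware does and what the standard permits is the lingua franca's fine print.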
Posting 'cause I'm afraid people will miss this. This is what a PDP-7 looked like: https://en.wikipedia.org/wiki/PDP-7
@Soyweiser @bitofhope Is that a built-in oscilloscope in the center section?
So cool ... but WHY
It's used as a display that does 1024x1024 bit raster graphics.
https://vintagetek.org/pdp7/