this post was submitted on 09 Feb 2026

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] CinnasVerses@awful.systems 5 points 2 days ago (3 children)

News story from 2015:

(Some people might have been concerned to read that) almost 3,000 “researchers, experts and entrepreneurs” have signed an open letter calling for a ban on developing artificial intelligence (AI) for “lethal autonomous weapons systems” (LAWS), or military robots for short. Instead, I yawned. Heavy artillery fire is much more terrifying than the Terminator.

The people who signed the letter included celebrities of the science and high-tech worlds like Tesla’s Elon Musk, Apple co-founder Steve Wozniak, cosmologist Stephen Hawking, Skype co-founder Jaan Tallinn, Demis Hassabis, chief executive of Google DeepMind and, of course, Noam Chomsky. They presented their letter in late July to the International Joint Conference on Artificial Intelligence, meeting this year in Buenos Aires.

They were quite clear about what worried them: “The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.”

“Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populations, warlords wishing to perpetrate ethnic cleansing, etc.”

The letter was issued by the Future of Life Institute, which is now Max Tegmark and Toby Walsh's organization.

People have worked on the general pop culture that inspired TESCREAL, and on the current hype, but less on earlier attempts to present machine minds as a clear and present danger. This one has the 'arms race' narrative and the proposed 'research ban' solution, but focuses on smaller dangers.

[–] fiat_lux@lemmy.world 3 points 1 day ago

Oh hey. I remember this. I was confused at the time by how it seemed to come almost out of left field, and by how some of those names ended up on the same letter.

Now I recognise all those names from the Epstein files, although some were only mentions rather than direct participants.

[–] Amoeba_Girl@awful.systems 5 points 1 day ago (1 children)

and, of course, Noam Chomsky

lmao the shade

[–] CinnasVerses@awful.systems 7 points 1 day ago (1 children)

shade

If you follow world politics, it has been obvious that Noam Chomsky is a useful idiot since the 1990s and probably the 1970s. I wish he had learned from the Khmer Rouge that not everyone who the NYT says is a bad guy is a good guy!

[–] Amoeba_Girl@awful.systems 5 points 1 day ago

Oh absolutely. It's frankly shocking how wrong he's been about so many things for so so long. He's also managed to pen the most astonishingly holocaust-denial-coded diatribe I've ever read from (ostensibly) a non-holocaust denier. I guess his overdeveloped genocide-denial muscle was twitching!

[–] YourNetworkIsHaunted@awful.systems 8 points 1 day ago (1 children)

The point about heavy artillery is actually pretty salient, though a more thorough examination would also note that "Lethal Autonomous Weapons Systems" is a category that includes goddamn land mines. Of course this would serve to ground the discussion in reality and is thus far less interesting to people who start organizations like the Future of Life Institute.

[–] jaschop@awful.systems 8 points 1 day ago

I'm pretty sure LAWS exist right now, even without counting landmines. Automatic human targeting and friend/foe distinction aren't exactly cutting-edge technologies.

The biggest joke to me is the idea that these systems are anywhere near as cost-efficient as a Kalashnikov. Ukraine is investing heavily in all kinds of drones, but that is because they're trying to be casualty-efficient. And it's all operator-based. No one wants the €2M tracked land-drone to randomly open fire on a barn and expose its position to a circling €5k kamikaze drone.