
Shut down all the large GPU clusters (the large computer farms where the most powerful AIs are refined). Shut down all the large training runs. Put a ceiling on how much computing power anyone is allowed to use in training an AI system, and move it downward over the coming years to compensate for more efficient training algorithms. No exceptions for governments and militaries. Make immediate multinational agreements to prevent the prohibited activities from moving elsewhere. Track all GPUs sold. If intelligence says that a country outside the agreement is building a GPU cluster, be less scared of a shooting conflict between nations than of the moratorium being violated; be willing to destroy a rogue datacenter by airstrike.

Frame nothing as a conflict between national interests, have it clear that anyone talking of arms races is a fool. That we all live or die as one, in this, is not a policy but a fact of nature. Make it explicit in international diplomacy that preventing AI extinction scenarios is considered a priority above preventing a full nuclear exchange, and that allied nuclear countries are willing to run some risk of nuclear exchange if that’s what it takes to reduce the risk of large AI training runs.

top 35 comments
[-] itappearsthat@hexbear.net 53 points 6 months ago

my man this world order cannot even take climate change seriously with all its evidence and real-world consequences people can see right before their eyes, and you want them to nuke anybody who makes a chat bot stop telling lies

[-] BelieveRevolt@hexbear.net 47 points 6 months ago

Yudkowsky is a decision theorist from the U.S. and leads research at the Machine Intelligence Research Institute.

He founded that institute, how can someone be considered an expert just because they "lead research" at their own institute? They used to be called the Singularity Institute for Artificial Intelligence too, which just tells you how unserious they are.

Who funds them anyway?

https://intelligence.org/transparency/

An anonymous Ethereum cryptocurrency investor in 2018 and 2021.

Open Philanthropy, a joint initiative with the philanthropic foundation Good Ventures.

Vitalik Buterin, the inventor and co-founder of Ethereum.

The Thiel Foundation, a private foundation funded by Paypal co-founder and venture capitalist Peter Thiel.

A different anonymous Ethereum cryptocurrency investor in 2017.

Oh...

[-] FourteenEyes@hexbear.net 22 points 6 months ago

Founding the FourteenEyes Giant Hog Foundation to research my own hog (accepting donations)

[-] GarbageShoot@hexbear.net 9 points 6 months ago

FourteenEyes Giant Hog

These Yugioh monster retrains have really gone downhill

[-] EnsignRedshirt@hexbear.net 20 points 6 months ago

Yudkowsky is hilarious because he has zero education or professional experience. Like, he didn’t attend high school or college, has never had anything like a job, and has never attempted to produce anything. He could have been a normal nepo baby and gotten into tech investing or entertainment or just fucked off and lived quietly, but for some reason his deepest desire was to dedicate his life to fan-fiction blogging about AI in the style of academic writing. A truly unique individual who could only be produced by the stagnant cesspit that is the Silicon Valley ecosystem.

[-] Nachorella@lemmy.sdf.org 11 points 6 months ago

I remember seeing a remarkably stupid quote by him and looking into who the hell he was and the fact anyone listens to him is baffling. He just yammers on about stuff he knows nothing about and then apparently there's enough oblivious people who believe him to keep the whole thing going in some perpetual shit eating machine type situation.

[-] EnsignRedshirt@hexbear.net 9 points 6 months ago

Baffling is the right word. I don’t really know where he fits in the ecosystem. If he’s a grifter, he could be leveraging his position much better. If he’s a true believer, I don’t get what purpose he serves to his funders. Shit eating machine, indeed.

[-] VernetheJules@hexbear.net 12 points 6 months ago* (last edited 6 months ago)

how can someone be considered an expert just because they "lead research" at their own institute?

yeah but like do you even grift bro

[-] replaceable@hexbear.net 32 points 6 months ago

And when the world needed @UlyssesT@hexbear.net the most, he vanished

[-] EmmaGoldman@hexbear.net 25 points 6 months ago

He did what he had to do and took Kissinger with him.

[-] BeamBrain@hexbear.net 20 points 6 months ago
[-] Outdoor_Catgirl@hexbear.net 12 points 6 months ago

You're not nearly annoying enough

[-] VILenin@hexbear.net 11 points 6 months ago

More like UselessT these days, smdh

[-] Comp4@hexbear.net 29 points 6 months ago

I would rather have a machine apocalypse than a nuclear one

[-] goose@hexbear.net 25 points 6 months ago

This except it’s about fossil fuel extraction

[-] JohnBrownNote@hexbear.net 23 points 6 months ago

what if we just nuke the boardrooms of "ai" companies instead

[-] Magician@hexbear.net 22 points 6 months ago

It's easier to imagine the end of the world than the end of capitalism.

[-] FourteenEyes@hexbear.net 22 points 6 months ago

Why does anyone listen to this fucking moron? Every time he speaks he makes it clear he has no idea what he's talking about

[-] BeamBrain@hexbear.net 20 points 6 months ago

Capitalists shovel money at him and give him a platform because he tells them what they want to hear.

[-] nat_turner_overdrive@hexbear.net 20 points 6 months ago

heartbreaking

can we expand this policy to crypto mines?

[-] BeamBrain@hexbear.net 24 points 6 months ago

I would not consider "we should do a nuclear holocaust so Roko's Basilisk can't get us" an especially good point

[-] nat_turner_overdrive@hexbear.net 16 points 6 months ago

I just want to destroy ai bros

[-] BeamBrain@hexbear.net 14 points 6 months ago

Entirely understandable

[-] Llituro@hexbear.net 20 points 6 months ago

ooh my plug is getting into learning about the effective altruism type losers. she's gonna love how insane the yud is big-yud

[-] WhyEssEff@hexbear.net 20 points 6 months ago
[-] Owl@hexbear.net 15 points 6 months ago

Most AI research is in the US, so uh... yeah go for it dude.

[-] CliffordBigRedDog@hexbear.net 8 points 6 months ago
[-] Tankiedesantski@hexbear.net 7 points 6 months ago

What he's saying is xi-plz

[-] POKEMONGOTOTHEGULAG@hexbear.net 15 points 6 months ago

If China researches AI their copywriters will be unstoppable. We cannot lose the marketing war to China

[-] barrbaric@hexbear.net 15 points 6 months ago

Someone remind me, this guy has 0 actual education or expertise when it comes to programming, right? He just got famous for writing a harry potter fanfic?

[-] Mardoniush@hexbear.net 15 points 6 months ago* (last edited 6 months ago)

I mean...he can program somewhat. He's been extremely online since he was single digits, which means he has USENET transhumanist brainworms with a heavy dose of gifted kid syndrome.

And I mean, same. But I didn't go off the deep end.

I think if you want to understand Big Yud you have to understand his early work (which is exactly what he tells you not to do), both his early AI enthusiast work (Staring into the Singularity, Shock Levels) and his magnum opus, Levels of Organisation in General Intelligence.

These works are out of date but explain why he thinks AI is so important.

More importantly, they show why his final (frankly obvious to everyone else) realisation that "maybe a really smart thing might not only not map easily onto human internal states but also might not automatically find a super nice objective morality" drove him entirely off the deep end and into the arms and bank accounts of Thiel and his ilk.

Finally, they show why he's a very smart boy who is skilled enough at nerd rhetoric to even fool himself, but not smart enough to doubt himself after he fucked up the foundations of his entire worldview.

[-] duderium@hexbear.net 12 points 6 months ago

“This technology is incredibly dangerous and must be stopped!” — investors in said technology who are definitely not pumping and dumping

[-] SkingradGuard@hexbear.net 10 points 6 months ago
[-] drhead@hexbear.net 7 points 6 months ago* (last edited 6 months ago)

I think you forgot to include the part where he thinks this needs to be done so that we can, essentially, kill all of the dumb people who would get tricked by a rising superintelligent AI.

There are so many cranks in "AI safety" stuff to the point where it is legitimately difficult to talk about what should be done that isn't very obviously slanted for some industry's benefit. You've got people like this, you've also got people like Gladstone that are LITERALLY EX-PENTAGON PEOPLE SPONSORED BY LOCKHEED MARTIN (who I am sure are very concerned about AI safety -- the only way I could be more convinced is if it was Boeing), who have suspicious demands that the publication of open-source models should be made illegal (probably out of concerns about China, as if half of the papers I read on new developments aren't already from them or the Noah's Ark lab in Moscow). There is no well that is unpoisoned here.

[-] FuckyWucky@hexbear.net 6 points 6 months ago

AI for me, not for thee

this post was submitted on 13 Apr 2024
96 points (100.0% liked)

the_dunk_tank


It's the dunk tank.

This is where you come to post big-brained hot takes by chuds, libs, or even fellow leftists, and tear them to itty-bitty pieces with precision dunkstrikes.

Rule 1: All posts must include links to the subject matter, and no identifying information should be redacted.

Rule 2: If your source is a reactionary website, please use archive.is instead of linking directly.

Rule 3: No sectarianism.

Rule 4: TERF/SWERFs Not Welcome

Rule 5: No ableism of any kind (that includes stuff like libt*rd)

Rule 6: Do not post fellow hexbears.

Rule 7: Do not individually target other instances' admins or moderators.

Rule 8: The subject of a post cannot be low hanging fruit, that is comments/posts made by a private person that have low amount of upvotes/likes/views. Comments/Posts made on other instances that are accessible from hexbear are an exception to this. Posts that do not meet this requirement can be posted to !shitreactionariessay@lemmygrad.ml

Rule 9: if you post ironic rage bait im going to make a personal visit to your house to make sure you never make this mistake again

founded 4 years ago