Soyweiser

joined 1 year ago
[–] Soyweiser@awful.systems 1 points 30 minutes ago

Let the Wookiee win.

[–] Soyweiser@awful.systems 2 points 12 hours ago

So, you want to replace the guys with beer-tap backpacks with robots? Because this already exists without robots.

[–] Soyweiser@awful.systems 3 points 12 hours ago

The power of these people is that they project a field in which normal reality doesn't seem to hold and they can do things that seem to distort reality. Like a clown car. The Great Clown theory of history.

[–] Soyweiser@awful.systems 10 points 1 day ago (3 children)

That is a lot of mental hoops to jump through to keep holding on to the idea that IQ is useful. High IQ is a force multiplier for being dumb. The horseshoe theory of IQ.

[–] Soyweiser@awful.systems 6 points 1 day ago (1 children)

Not something you should admit on the internet, but I actually haven't watched that much of The Simpsons; it just wasn't on our TVs that much. Bundy was, however.

[–] Soyweiser@awful.systems 11 points 1 day ago* (last edited 1 day ago) (2 children)

Oh wait, you weren't asking me to explain what I meant, you were asking me to defend the correctness of the professors in genetics against a crackpot, at an event I wasn't at, on a subject I'm not qualified in (I'm not a professor of genetics, and neither is Yud), right after I mentioned that I don't think unqualified people should be talking about this. So you were trying the Socratic method? How is that working out for you?

The text you’ve just quoted

Yes, and I'm quoting the LW crackpot. They are not saying the professors are unwittingly wrong; they are hinting that the professors are intentionally wrong. And they do this using some very dodgy analogies (no, making a chicken bigger is not like creating a 14-foot human; that is a crazy comparison because biology works differently at different scales, see also the strength of ants, but it is powerful hype language) and unscientific shit (the random asspulled graphs).

Also note that the whole article uses their fears of AI to promote doing more eugenics, with the weirdest logic imaginable: we should take care not to make mistakes and go slow with AI, so we need to do eugenics fast. And it claims the professors are wrong/holding things back. This is just what I can come up with after quickly skimming parts of the article; I don't have the time/energy/expertise to do more anyway. Imagine if I had to look up all the literature they reference and check whether it is correct, all 5 of them. You did notice that there were only ~5 links to actual scientific articles, right? Not an amount of backing I would want to base my political actions on (you also noticed that, right?). It also hits classic crankery levels: not only are the professors missing/suppressing something, this thing is also revolutionary and could save humanity. (Also note he admits that the technology for editing babies on even one gene is not solved yet, but they are 'close', which should make you wonder why they are so dismissive of the 'ethical issues'.)

It also doesn't help that your reactions pattern-match the 'I'm just curious, could you explain yourself?' kind of person we used to get on r/sneerclub, who 90% of the time wasn't curious at all but was just very pro race science, or an annoying contrarian debatebro with YouTube-induced brain damage (which got them banned very quickly, so word of warning).

E: and oh, you did notice that people in the comments are trying to say they should bring the guy who was recently famous for being able to keep his arm down into this, right? (Fucking Ents, pretending that the rest of the world doesn't affect them.)

[–] Soyweiser@awful.systems 13 points 1 day ago

You don't understand, it is important to look at all diverse viewpoints (no, not those); there might be some good ideas in there.

[–] Soyweiser@awful.systems 11 points 1 day ago* (last edited 1 day ago) (5 children)

That there is a secret group of scientists who know something is up and are suppressing this technology.

watching prominent tenured professors in the field of genetics take turns misrepresenting their own data

[–] Soyweiser@awful.systems 3 points 1 day ago

On the subject of AI agents: I saw a baffling commercial (or, well, half-saw it several times) where they were trying to sell, I think, AI-powered phones. The revolutionary AI-agent idea was that you randomly feel like meeting up with friends, so you just go to your friends and tell your phone to reschedule everything you had planned to a day later. Which baffled me in various ways, because saying this to your phone doesn't accomplish anything; you can't reschedule without input from other people. Sure, I might not want to go to my doctor's appointment today, but I can't just tell my phone 'hey, tell my doc I will do it tomorrow, not today'. And this is ignoring the fact that, with AI reliability being what it is, you need to actually check that it did it correctly; it might have gotten the order of the days wrong, given that this is a technology that has already failed at the most basic tasks. Just a very strange commercial, disconnected from how modern calendars and time work.

[–] Soyweiser@awful.systems 11 points 1 day ago* (last edited 1 day ago) (2 children)

Not really a sneer, nor directly related to techbro stuff, but I noticed that the profile of Chris Kluwe (who got himself arrested protesting against MAGA) has 'warcraft' in his profile name, and judging by his avatar he probably paints miniatures. Another stab at the nerd vs jock theory.

[–] Soyweiser@awful.systems 6 points 1 day ago

Some manager is going to see the metrics on that article, vaguely think about the word 'viral', and draw the absolutely wrong conclusions.


Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of Community Notes, a project which predates the takeover of Twitter by a couple of years (see the join date: https://twitter.com/CommunityNotes).

In reaction, Musk admits he never read HPMOR and suggests a watered-down Turing test involving HPMOR.

Eliezer invents HPMOR wireheads in reaction to this.
