Soyweiser

joined 2 years ago
[–] Soyweiser@awful.systems 11 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

The implication here that it isn't methodologically flawed is quite something.

E: and I don't have the inclination to do the math, but 97% accuracy seems to be on the unusable side considering the base rate of 'criminals' vs not-criminals in the population. (Yeah, see also 'wtf even is a criminal').
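E2: fine, a quick back-of-the-envelope after all. A sketch in Python, with my own made-up numbers (the article gives none): assuming the 97% is both sensitivity and specificity, and a 1% base rate of 'criminals' in the scanned population.

```python
# Base-rate check on a "97% accurate" criminality classifier.
# All three numbers below are illustrative assumptions, not from any paper.
sensitivity = 0.97   # P(flagged | criminal)
specificity = 0.97   # P(not flagged | not criminal)
prevalence = 0.01    # assumed share of actual 'criminals' in population

true_pos = sensitivity * prevalence
false_pos = (1 - specificity) * (1 - prevalence)

# Positive predictive value via Bayes: P(actually criminal | flagged)
ppv = true_pos / (true_pos + false_pos)
print(f"P(criminal | flagged) = {ppv:.1%}")
```

With these numbers the PPV comes out around 25%, i.e. roughly three out of four people the thing flags are innocent. Drop the prevalence lower and it gets even worse.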

[–] Soyweiser@awful.systems 17 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

The war on weird-looking people continues. The false positive/negative rate of this bs is immense. Wait, a 69% success rate? Oh god, the false positives on that are going to be immense. Even worse: the model performs worse than random chance on an online-game dataset, and the statistical uselessness of 69% given the low base rate of pedos in the general public isn't even mentioned in the conclusions. Toss this where it belongs, in the dustbin of history.

[–] Soyweiser@awful.systems 13 points 3 weeks ago (2 children)

wikipedia talk pages: what is wrong with you people

Sorry this remark is a WP:NAS, WP:SDHJS, WP:NNNNNNANNNANNAA and WP:ASDF violation.

[–] Soyweiser@awful.systems 6 points 3 weeks ago (3 children)

This kind of stuff, which seems to hit a lot harder than the anti-Trump stuff, makes me feel that a Vance presidency would implode quite quickly due to other MAGA toadies trying to backstab toadkid here.

[–] Soyweiser@awful.systems 8 points 3 weeks ago

For me it feels like this is pre AI/cryptocurrency bubble pop. With luck, the MAGA government's infusions into both will fail and actually quicken the downfall (Musk/Trump like it, so it must be iffy). Sadly it will not be like the downfall of Enron, as this is all very distributed, so I fear how much will be pulled under.

[–] Soyweiser@awful.systems 6 points 3 weeks ago

I'd assume that is very intentional; nominative determinism is one of those things a lot of LW-style people like. (Scott Alexander being a big one, which has some really iffy implications (which I fully think is a coincidence btw)).

[–] Soyweiser@awful.systems 3 points 3 weeks ago* (last edited 3 weeks ago)

It wasn't really done that much during the era when Scott A was called the new leader of LessWrong, so not sure if it has increased again. I assume a lot still do, as I assume a lot also pretend to have read it. Never looked into any stats, or whether those stats are public. I know they put them all on a specific site in 2015 (https://www.readthesequences.com/). The bibliography is a treat (esp as it starts with pop sci books and an SSC blog post, but also: "Banks, Iain. The Player of Games. Orbit, 1989.", and not one but three of the Doc E.E. Smith Lensman books).

[–] Soyweiser@awful.systems 4 points 3 weeks ago* (last edited 3 weeks ago)

I did a quick search on Ribbonfarm (I couldn't quickly recall what his blog was called) myself. And I see how much I had forgotten: it should have been called meta-rationality, and yes, insight porn, that was the term. (Linking to two posts where Ribbonfarm/this stuff was discussed.)

E: Sad feels when you click on a name in the sub from years ago and see them now being a full blast AI bro.

[–] Soyweiser@awful.systems 3 points 3 weeks ago* (last edited 3 weeks ago)

The CAPTCHA failed to load properly for me at first, and then was mega slow. Quality custom implementation of a (wrapper around a) CAPTCHA, millions of EA money well spent.

[–] Soyweiser@awful.systems 6 points 3 weeks ago (2 children)

We talked about that on r/sneerclub in the past, can't recall the specific consensus. Seems post-rational; its innovation on rationalism is going from the binary 'object vs meta' to 2x2 grids.

[–] Soyweiser@awful.systems 9 points 3 weeks ago* (last edited 3 weeks ago) (3 children)

Instead of increasing the capabilities of LLMs, a lot of work is being done in the field of downplaying human capabilities to make LLMs look better in comparison. You would assume that the 'be aware of biases, and learn to think rationally' place would notice this trap. But nope, nobody reads the Sequences anymore. (E: for the people not in the know, the Sequences are the Rationalist bible written by Yud (extremely verbose; the new bits are not good and the good bits are not new), used here as a joke; reading them (and saying you should) used to be part of the cultic milieu of LW).

[–] Soyweiser@awful.systems 6 points 3 weeks ago

AI is going to wreck the world without even being asked to turn things into paperclips. Just by giving all coders out-of-the-loop performance problems.
