this post was submitted on 02 Mar 2026
14 points (88.9% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Want to wade into the snowy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)


I'd say that the great problems that last for decades don't fall to random bullshit alone; they require serious advances in new concepts and understanding. But even then, the romanticized warrior-culture view is inaccurate. It's not like some big-brain genius declares "I'm gonna solve this problem" and comes up with big-brain ideas that solve it. Instead, a big problem gets solved after people make tons of incremental progress by trying random bullshit, until someone realizes the tools are now good enough to crack it. A better analogy than the Good Will Hunting genius is picking fruit: you wait until it's ripe.

But math/CS research is not just about random bullshit go. The truly valuable part is theory and understanding, which comes from critically evaluating the results of whatever random bullshit one tries. Why did idea X work well with Y but not so well with Z, and where else could it work? So random bullshit go is a necessary part of the process, but I'd say research has value (and prestige) because of the theory that comes from people thinking about it critically. Needless to say, LLMs are useless at this. (In the Knuth example, the AI didn't even prove that its construction worked.)

I think intelligence is overrated for research; the most important quality is giving a shit. Solving big problems is mostly a question of having the right perspective and tools, and raw intelligence isn't much use without them. Developing that perspective means taking the time to form opinions and feelings about the strengths and weaknesses of the various tools.

Of course, every rule has exceptions, and there have been long-standing problems that were solved only when someone had the chutzpah to apply far more random bullshit than anyone had dared to try before.