this post was submitted on 13 May 2026
30 points (84.1% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


as seen here and here, some instances are feeding posts wholesale to prompts, for what seem like extremely unsound reasons to me

any of you run into this shit yet?

snooggums@piefed.world 4 points 9 hours ago

Having an LLM confirm a decision is the same thing as having the LLM make the decision and then checking whether the mod agrees with it. If the mod would have ruled differently based on the LLM's output, then the LLM was part of the decision-making process. The order does not matter.

Any step where the LLM outputs something that implies a determination automatically makes it part of the process.