this post was submitted on 01 Apr 2026
24 points (80.0% liked)
Showerthoughts
41406 readers
808 users here now
A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The most popular seem to be lighthearted clever little truths, hidden in daily life.
Here are some examples to inspire your own showerthoughts:
- Both “200” and “160” are 2 minutes in microwave math
- When you’re a kid, you don’t realize you’re also watching your mom and dad grow up.
- More dreams have been destroyed by alarm clocks than anything else
Rules
- All posts must be showerthoughts
- The entire showerthought must be in the title
- No politics
- If your topic is in a grey area, please phrase it to emphasize the fascinating aspects, not the dramatic aspects. You can do this by avoiding overly politicized terms such as "capitalism" and "communism". If you must make comparisons, you can say something is different without saying something is better/worse.
- A good place for politics is c/politicaldiscussion
- Posts must be original/unique
- Adhere to Lemmy's Code of Conduct and the TOS
If you made it this far, showerthoughts is accepting new mods. This community is generally tame so it's not a lot of work, but having a few more mods would help reports get addressed a little sooner.
What's it like to be a mod? Reports just show up as messages in your Lemmy inbox, and if a different mod has already addressed the report, the message goes away and you never have to worry about it.
founded 2 years ago
MODERATORS
you are viewing a single comment's thread
The word 'fault' is commonly used interchangeably with 'responsible'. Following the definition you described, I agree that LLMs are faulty, and that they can be at 'fault'.
I invoked the ethical dilemma because it's almost universally understood as a scenario forcing an individual to make a decision. I've never before heard someone blame the trolley. The brakes are broken? Come now, if we're going to get that deep into semantics, a human should have regularly inspected the brakes and had them repaired.
I appreciate your explanation of your viewpoint. Cheers.
I use "fault" and the idea of blame to ask "what can we change to prevent the bad thing from happening again?"
Since we can prevent the bad thing by banning LLMs, and there are no significant downsides to doing so, I blame the LLMs. They're evil.
Bonus: banning LLMs hurts Sam Altman
I see what you're getting at, but we have differing views as to what constitutes 'evil'. Cyanide isn't evil because it can kill a person, just as an oven isn't evil when it burns the roast dinner.
Without a level of consciousness akin to ours, good and evil aren't really possible. An inanimate object can't make a decision to produce a negative outcome - nor a positive one, for that matter. It can only be used by us to bring about an outcome. LLMs have been built to be ingratiating, and as a result they are somewhat addictive for those who use them.
ChatGPT's fundamental design was crafted by humans, under instruction by humans, under the leadership of Musk and Altman. They carry a good part of the responsibility and blame for that kid's death. The machine was only doing what it was created for - just as the rope was.
I don't think any human can be trusted with LLM technology. It's not possible to control it and use it for good.