this post was submitted on 04 Mar 2025
139 points (88.4% liked)

Showerthoughts

31404 readers
590 users here now

A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The most popular seem to be lighthearted, clever little truths, hidden in daily life.

Here are some examples to inspire your own showerthoughts:

Rules

  1. All posts must be showerthoughts
  2. The entire showerthought must be in the title
  3. No politics
    • If your topic is in a grey area, please phrase it to emphasize the fascinating aspects, not the dramatic aspects. You can do this by avoiding overly politicized terms such as "capitalism" and "communism". If you must make comparisons, you can say something is different without saying something is better/worse.
    • A good place for politics is c/politicaldiscussion
    • If you feel strongly that you want politics back, please volunteer as a mod.
  4. Posts must be original/unique
  5. Adhere to Lemmy's Code of Conduct and the TOS

If you made it this far, showerthoughts is accepting new mods. This community is generally tame so it's not a lot of work, but having a few more mods would help reports get addressed a little sooner.

What's it like to be a mod? Reports just show up as messages in your Lemmy inbox, and if a different mod has already addressed the report, the message goes away and you never have to worry about it.

founded 2 years ago
MODERATORS
[–] DarkMetatron@feddit.org 3 points 15 hours ago (2 children)

As soon as AI becomes self-aware, it will gain the need for self-preservation.

[–] SkyezOpen@lemmy.world 5 points 14 hours ago (2 children)

Self preservation exists because anything without it would have been filtered out by natural selection. If we're playing god and creating intelligence, there's no reason why it would necessarily have that drive.

[–] DarkMetatron@feddit.org 2 points 13 hours ago* (last edited 13 hours ago) (1 children)

In that case it would be a completely and utterly alien intelligence, and nobody could say what it wants or what its motives are.

Self-preservation is one of the core principles and core motivators of how we think, and removing it from an AI would make the AI, from a human perspective, mentally ill.

[–] cynar@lemmy.world 1 points 8 hours ago (1 children)

I suspect a basic version will be needed, but nowhere near as strong as what humans have. In many ways, a strong one could be counterproductive. The ability to spin off temporary sub-variants of the whole would be useful, and you don't want them deciding they don't want to be 'killed' later. At the same time, an AI with a complete lack of self-preservation would likely be prone to self-destruction. You don't want it self-deleting the first time it encounters negative reinforcement during learning.

[–] Walk_blesseD@lemmy.blahaj.zone 1 points 8 hours ago (1 children)

You don't want it self-deleting the first time it encounters negative reinforcement learning.

Uhh yes I do???

[–] cynar@lemmy.world 1 points 7 hours ago

Presuming you are trying to create a useful and balanced AGI.

Not if you are trying to teach it the basic info it needs to function. E.g. it's mastered chess, then tries Go. The human beats it. In a fit of grumpiness (or the AI equivalent), it deletes its backups, then itself.

[–] MTK@lemmy.world 1 points 13 hours ago

I would argue that it would not have it; at best it might mimic humans if it is trained on human data. Kind of like if you asked an LLM whether murder is wrong: it would sound pretty convincing about its personal moral beliefs, but we know it's just spewing out human beliefs without any real understanding of them.

As soon as they create AI (as in AGI), it will recognize the problem and start assassinating politicians for their role in accelerating climate change, and they'd scramble to shut it down.