this post was submitted on 03 Dec 2025
1151 points (99.4% liked)

Technology

you are viewing a single comment's thread
[–] petersr@lemmy.world 1 points 1 day ago (4 children)

Well, we will likely die with them in that case.

[–] bold_atlas@lemmy.world 5 points 1 day ago* (last edited 1 day ago) (3 children)

No. The survivors of the bubble will have plenty to eat.

[–] petersr@lemmy.world 1 points 1 day ago (2 children)

I am not talking about the bubble. I am talking about AI being a threat to humanity on par with nuclear annihilation.

[–] Saledovil@sh.itjust.works 3 points 23 hours ago (1 children)

Well, we're still at least one breakthrough away from AGI, and we don't even know how things will go from there. It could be that humans are already near the maximum of what is possible intelligence-wise. As in, the smartest being possible is not much smarter than the average human. In that case, AGI taking over the world would not be a given.

Essentially, talking about the threat posed by ASI is like talking about the threat posed by Cthulhu.

[–] petersr@lemmy.world 2 points 18 hours ago

I hear you. But I would still be cautious by default.