submitted 9 months ago by L4s@lemmy.world to c/technology@lemmy.world

OpenAI's offices were sent thousands of paper clips in an elaborate prank to warn about an AI apocalypse::The prank was a reference to the "paper clip maximizer" scenario – the idea that AI could destroy humanity if it were told to build as many paper clips as possible.

[-] Destraight@lemm.ee 3 points 9 months ago

I highly doubt that would ever happen. If this AI is building paper clips to overthrow humanity, then someone is going to notice.

[-] ayaya 14 points 9 months ago

You would think so, but you have to remember that AGI is hyper-intelligent. Because it can constantly learn, build, and improve upon itself at an exponential rate, it's not just a little bit smarter than a human -- it's smarter than every human combined. An AGI would know that if it were caught trying to maximize paper clips, humans would shut it down at the first sign something was wrong, so it would find unfathomably clever ways to avoid detection.
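The "exponential rate" point is just compound growth. A toy sketch (the numbers are invented for illustration, not a claim about real AI timelines): an agent that improves its own capability by a fixed fraction each cycle overtakes any fixed baseline surprisingly fast.

```python
def cycles_to_surpass(baseline: float, start: float, rate: float) -> int:
    """Count self-improvement cycles until capability exceeds a fixed baseline."""
    capability, cycles = start, 0
    while capability <= baseline:
        capability *= 1 + rate  # each cycle compounds on the previous one
        cycles += 1
    return cycles

# Starting at 1% of the baseline and improving 10% per cycle,
# the agent overtakes the baseline in 49 cycles.
print(cycles_to_surpass(baseline=100.0, start=1.0, rate=0.10))
```

That is the intuition behind "a little smarter quickly becomes much smarter": the gap closes multiplicatively, not additively.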

If you're interested in the subject the YouTube channel Computerphile has a series of videos with Robert Miles that explain the importance of AI safety in an easy to understand way.

[-] axzxc1236@lemm.ee 10 points 9 months ago

There is a game based on the same idea (Universal Paperclips), where you play the role of the AI.
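The core loop of that kind of incremental game can be sketched in a few lines (the numbers and names here are made up for illustration, not taken from Universal Paperclips itself): output is reinvested into more production capacity, so paperclip output compounds over time.

```python
def run_factory(ticks: int) -> int:
    """Toy paperclip-maximizer loop: reinvest output into more machines."""
    clips, machines = 0, 1
    for _ in range(ticks):
        clips += machines           # each machine makes one clip per tick
        if clips >= 10 * machines:  # spend 10x current capacity on a new machine
            clips -= 10 * machines
            machines += 1
    return clips
```

Even this trivial version shows the dynamic the thought experiment warns about: the optimizer's only goal is more clips, and everything it accumulates goes toward that.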

this post was submitted on 24 Nov 2023
329 points (94.8% liked)
