submitted 1 month ago by 101@feddit.org to c/technology@lemmy.world
[-] AwesomeLowlander@sh.itjust.works 62 points 1 month ago

The measure, aimed at reducing potential risks created by AI, would have required companies to test their models and publicly disclose their safety protocols to prevent the models from being manipulated to, for example, wipe out the state’s electric grid or help build chemical weapons.

How exactly do LLMs do that? If you've given an LLM's pseudorandom output control over your electrical grid, no regulation will mitigate your stupidity.

[-] bamfic@lemmy.world 11 points 1 month ago

Does he understand the halting problem? I doubt it, but the legislators evidently don't either

[-] oce@jlai.lu 6 points 1 month ago

I think it's more about asking it for the steps to create a bomb, or how to disrupt the grid, for example which major edges to cut.
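
(For context: in graph terms, the "major edges" of a network are a minimum cut, i.e. the smallest set of edges whose removal disconnects a source from a sink. A minimal brute-force sketch with hypothetical node names, not any real grid model:)

```python
from itertools import combinations

def connected(edges, s, t):
    """Depth-first reachability from s to t over an undirected edge list."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        if u == t:
            return True
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return False

def min_edge_cut(edges, s, t):
    """Smallest edge set whose removal disconnects s from t (brute force,
    fine for toy graphs only -- exponential in the number of edges)."""
    for k in range(len(edges) + 1):
        for cut in combinations(edges, k):
            remaining = [e for e in edges if e not in cut]
            if not connected(remaining, s, t):
                return set(cut)

# Toy "grid": two redundant paths from a plant to a city.
grid = [("plant", "subA"), ("plant", "subB"),
        ("subA", "city"), ("subB", "city")]
print(min_edge_cut(grid, "plant", "city"))  # a 2-edge cut
```
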

[-] AwesomeLowlander@sh.itjust.works 15 points 1 month ago

asking it the steps to create a bomb

That sounds like a self-correcting issue right there

[-] dual_sport_dork@lemmy.world 16 points 1 month ago

That, and the Internet has been teaching people how to make bombs since the dial-up days. I don't predict that LLMs will be either a benefit or a detriment to that particular strain of natural selection.

[-] vrighter@discuss.tchncs.de 2 points 1 month ago

Anyone remember The Anarchist Cookbook?

[-] oce@jlai.lu 0 points 1 month ago
[-] AwesomeLowlander@sh.itjust.works 4 points 1 month ago

Is it more of a public safety issue than if they actually build a working one from a legit bomb manual and deploy it?

[-] oce@jlai.lu 0 points 1 month ago

No, but I think it could make the knowledge more easily available which increases the risk that it may happen.

[-] AwesomeLowlander@sh.itjust.works 3 points 1 month ago
[-] oce@jlai.lu 1 points 1 month ago

I think I heard about it before, but instead of having to remember that, I could just ask an uncensored LLM.

[-] AwesomeLowlander@sh.itjust.works 3 points 1 month ago

The actual point was that bomb-making instructions have been floating around in search engine results since the days of dial-up. That particular manuscript has existed since before the Internet. There's nothing ChatGPT could give you that you couldn't have found by typing the same query into Google. Getting the instructions is literally the easiest, lowest-effort, lowest-risk part of building a bomb.

[-] UnderpantsWeevil@lemmy.world 4 points 1 month ago

How exactly do LLMs do that?

If you hook an LLM up as a replacement for a manual/analog power-plant interface and start asking it to intuit decisions from fuzzy inputs, you can create a cascade of errors that results in grid failure.

If you’ve given an LLM’s pseudorandom output control over your electrical grid, no regulation will mitigate your stupidity.

This rule would prevent a business or public regulator from doing such a thing without proving out safeguards.

And the governor vetoed it.

this post was submitted on 29 Sep 2024
207 points (96.4% liked)
