submitted 11 months ago by L4s@lemmy.world to c/technology@lemmy.world

Data poisoning: how artists are sabotaging AI to take revenge on image generators::As AI developers indiscriminately suck up online content to train their models, artists are seeking ways to fight back.

[-] HejMedDig@feddit.dk 1 point 11 months ago

The Nightshade poisoning attack claims that it can corrupt a Stable Diffusion model with fewer than 100 samples. Probably not to NSFW level. How easy it is to manufacture those 100 samples is not mentioned in the abstract.
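(For intuition: Nightshade's actual attack uses optimized, near-imperceptible image perturbations, but the underlying idea is poisoning the image-caption association a text-to-image model learns from scraped data. A loose toy stand-in for that idea, with hypothetical names and a literal caption swap instead of perturbations, might look like:)

```python
# Toy illustration of concept-mislabeling poisoning, NOT Nightshade's
# actual optimization-based attack: associate images of one concept
# with captions for another, so a model trained on the scraped pairs
# learns the wrong mapping. `poison_dataset` and the file names are
# made up for illustration.

def poison_dataset(samples, source_concept, target_concept, budget=100):
    """Relabel up to `budget` captions mentioning `source_concept`
    so they describe `target_concept` instead."""
    poisoned = []
    used = 0
    for image_id, caption in samples:
        if used < budget and source_concept in caption:
            caption = caption.replace(source_concept, target_concept)
            used += 1
        poisoned.append((image_id, caption))
    return poisoned, used

# 150 clean image-caption pairs; only the first 100 get poisoned.
clean = [(f"img_{i:03d}.png", "a photo of a dog") for i in range(150)]
poisoned, n = poison_dataset(clean, "dog", "cat", budget=100)
print(n)  # budget of 100 poisoned samples, matching the paper's claim
```

Whether ~100 such samples actually dominate a concept during training is exactly the claim under dispute in this thread.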

[-] AVincentInSpace@pawb.social 2 points 11 months ago* (last edited 11 months ago)

yeah the operative word in that sentence is "claims"

I'd love nothing more than to be wrong, but after seeing how quickly Glaze got defeated (not only did it make the images nauseating for a human to look at despite claiming to be imperceptible, but within 48 hours of the official launch someone had trained a neural network to reverse its effects automatically with something like 95% accuracy), suffice to say my hopes aren't high.

[-] HejMedDig@feddit.dk 1 point 11 months ago

You seem to have more knowledge on this than me, I just read the article 🙂

this post was submitted on 18 Dec 2023
264 points (93.7% liked)
