this post was submitted on 13 Apr 2024
409 points (98.6% liked)

Technology

83728 readers
1024 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
 

When Adobe Inc. released its Firefly image-generating software last year, the company said the artificial intelligence model was trained mainly on Adobe Stock, its database of hundreds of millions of licensed images. Firefly, Adobe said, was a “commercially safe” alternative to competitors like Midjourney, which learned by scraping pictures from across the internet.

But behind the scenes, Adobe also was relying in part on AI-generated content to train Firefly, including from those same AI rivals. In numerous presentations and public posts about how Firefly is safer than the competition due to its training data, Adobe never made clear that its model actually used images from some of these same competitors.

top 50 comments
[–] seaQueue@lemmy.world 149 points 2 years ago (2 children)

Oh hey, look. The cycle of AI ingesting garbage output from another AI model has begun. This can't possibly impact quality or reliability in any way /s

[–] balder1991@lemmy.world 16 points 2 years ago

Time to save the models we have now, ’cause they’ll never be quite the same.

[–] h_ramus@lemm.ee 10 points 2 years ago

The AI centipede era has begun

[–] Mereo@lemmy.ca 54 points 2 years ago* (last edited 2 years ago) (2 children)
  • Garbage in -> Garbage out (x2)
  • Garbage in (x2) -> Garbage out (x4)
  • Garbage in (x4) -> Garbage out (x8)
  • Garbage in (x8) -> Garbage out (x16)
  • ...
[–] Beetschnapps@lemmy.world 14 points 2 years ago

Yea! Can you believe how long it took us to make garbage before all this?

Y'all never heard of recycling?

[–] alexdeathway@programming.dev 37 points 2 years ago (15 children)

why would they do this? doesn't that reduce the quality of the training dataset?

[–] cynar@lemmy.world 13 points 2 years ago

Depends how it's done.

Full generative images would definitely start creating a copying error type problem.

However it's not quite that simple. An AI system can be used to distort an image. The derivatives force the learning AI to notice different things. This can vastly extend the pool of data to learn from, and so improve the end AI.

Adobe obviously decided that the copying errors were worth the extended datasets.
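The distortion-based augmentation the comment describes can be sketched in a few lines. This is a generic illustration of the idea (random flips, brightness changes, and noise applied to a source image), not Adobe's actual pipeline; the function name and parameters are made up for the example.

```python
import numpy as np

def augment(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Apply simple random distortions (flip, brightness, noise) to one image."""
    out = image.astype(np.float32)
    if rng.random() < 0.5:                        # random horizontal flip
        out = out[:, ::-1]
    out = out * rng.uniform(0.8, 1.2)             # random brightness scaling
    out = out + rng.normal(0.0, 5.0, out.shape)   # mild pixel noise
    return np.clip(out, 0.0, 255.0)

# One source image yields many distinct training variants,
# which is how distortion extends the pool of data to learn from.
rng = np.random.default_rng(0)
base = rng.integers(0, 256, size=(64, 64, 3)).astype(np.float32)
variants = [augment(base, rng) for _ in range(4)]
```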

[–] Even_Adder@lemmy.dbzer0.com 8 points 2 years ago (2 children)

Supplementary synthetic data increases the quality of the model.

[–] SomeGuy69@lemmy.world 9 points 2 years ago

Correct. To a certain extent one can feed AI data back into AI; too much and you add noise, making the result worse, like a copy of a copy.
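The copy-of-a-copy effect can be demonstrated with a toy experiment: a "model" that just fits a Gaussian to its training data, then generates the next generation's training set from itself. This is a minimal sketch of the compounding-error idea, not a claim about how any real image model degrades.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=0.0, scale=1.0, size=200)  # generation 0: "real" data

stats = []
for generation in range(20):
    mu, sigma = data.mean(), data.std()
    stats.append((mu, sigma))
    # Each new generation trains only on the previous model's output,
    # so estimation error compounds like a copy of a copy.
    data = rng.normal(loc=mu, scale=sigma, size=200)
```

With each pass, the fitted parameters drift further from the original distribution's true mean and variance, because every generation inherits and re-samples the previous generation's sampling error.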

[–] General_Effort@lemmy.world 3 points 2 years ago (1 children)

Yes, though that's not what they're doing. They train on images uploaded to their marketplace and, of course, some of these are AI generated.

[–] Even_Adder@lemmy.dbzer0.com 3 points 2 years ago (6 children)

It's fine as long as it's not the majority.

[–] CosmoNova@lemmy.world 23 points 2 years ago (1 children)

I said it around 2 years ago, when the term "ethical" was first coined by media when talking about AI. Ethical in this context just means those who own data centers and made a huge effort to extract and process user data (Facebook, Google, Amazon, etc.) have all the cards. Never mind the technology being so new that users couldn't possibly consent to it years ago. They just update their TOS and get that consent retroactively while lawmakers are absent, happily watching their stocks go up.

[–] Grimy@lemmy.world 12 points 2 years ago

It's really frustrating to see people get riled up and manipulated into thinking that legislating to make anything "unethical" illegal is in their interest.

It's a fantasy to think individual creators will get a slice of the pie and not just the data brokers. It's also a convenient way to destroy the competition.

People are getting emotional, and they are going to use that to build one of the grossest monopolies ever seen.

[–] jimmydoreisalefty@lemmy.world 17 points 2 years ago

Adobe said a relatively small amount — about 5% — of the images used to train its AI tool was generated by other AI platforms. “Every image submitted to Adobe Stock, including a very small subset of images generated with AI, goes through a rigorous moderation process to ensure it does not include IP, trademarks, recognizable characters or logos, or reference artists’ names,” a company spokesperson said.

Adobe Stock’s library has boomed since it began formally accepting AI content in late 2022. Today, there are about 57 million images, or about 14% of the total, tagged as AI-generated images. Artists who submit AI images must specify that the work was created using the technology, though they don’t need to say which tool they used. To feed its AI training set, Adobe has also offered to pay for contributors to submit a mass amount of photos for AI training — such as images of bananas or flags.
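A quick sanity check on the article's figures: if roughly 57 million AI-tagged images are about 14% of Adobe Stock, the implied size of the whole library is consistent with the "hundreds of millions" of licensed images mentioned earlier.

```python
# Figures from the article; the arithmetic is just a rough consistency check.
ai_tagged = 57_000_000            # images tagged as AI-generated
ai_share = 0.14                   # stated share of the total library
implied_total = ai_tagged / ai_share
print(f"Implied library size: {implied_total / 1e6:.0f} million images")
# prints: Implied library size: 407 million images
```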

[–] TheBat@lemmy.world 11 points 2 years ago
[–] Zink@programming.dev 11 points 2 years ago (1 children)

We always thought the singularity would be when our technology takes off advancing without us.

Maybe that moment when it decides it doesn’t need us will be a rapid disintegration by machine circle jerk.

[–] uriel238@lemmy.blahaj.zone 3 points 2 years ago

They [the Golgafrincham] sent the B ship off first, but of course, the other two-thirds of the population stayed on the planet and lived full, rich and happy lives until they were all wiped out by a virulent disease contracted from a dirty telephone.

[–] airrow@hilariouschaos.com 7 points 2 years ago

the problem is "intellectual property" existing at all, just get rid of it entirely and make everything public domain

[–] 0nekoneko7@lemmy.world 7 points 2 years ago

AI daisy chain. One AI's output is another AI's input.

[–] SeaJ@lemm.ee 4 points 2 years ago (1 children)

I've seen Multiplicity enough times to know how this turns out.

[–] GamingChairModel@lemmy.world 2 points 2 years ago

You've been watching the original movie multiple times? I just watch the most recent recording of myself describing the movie, and then record a new description over that, with each successive generation.
