this post was submitted on 03 Jan 2024
623 points (93.4% liked)
Technology
Why would they not? There’s no way for such a system to know an image is AI-generated unless there’s some metadata that makes it obvious. And even if it were, who’s to say the user wouldn’t want to see those images in the results?
This is a nothing issue. It’s not like the image is being generated in response to a search; it’s something that already existed being returned as a result because there is presumably something that links it to the search.
To put it bluntly: this is kind of like complaining a pencil drawing on a napkin showed up in the results.
I agree with your comment but just want to point out that AI-generated images actually often do contain metadata, usually describing the model and prompt used.
By the time a user has shared them, 99% of the time all superfluous metadata has been stripped, for better or worse.
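To illustrate what that metadata looks like: many generators write the prompt into a PNG `tEXt` chunk. Here’s a rough, stdlib-only sketch of how you could pull those chunks out yourself. The `parameters` keyword is just a convention some tools (e.g. the Stable Diffusion web UI) happen to use, not any kind of standard, and real-world files may use `iTXt` or EXIF instead:

```python
import struct

def png_text_chunks(data: bytes) -> dict:
    """Extract tEXt chunks (keyword -> value) from raw PNG bytes.

    Some AI image generators store the prompt under a keyword like
    'parameters' -- a tool-specific convention, not part of the PNG spec.
    """
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG file")
    out = {}
    pos = 8
    while pos + 8 <= len(data):
        # Each chunk: 4-byte length, 4-byte type, payload, 4-byte CRC
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        body = data[pos + 8:pos + 8 + length]
        if ctype == b"tEXt":
            # tEXt payload is keyword, NUL separator, then the value
            key, _, val = body.partition(b"\x00")
            out[key.decode("latin-1")] = val.decode("latin-1")
        pos += 8 + length + 4  # advance past payload and CRC
        if ctype == b"IEND":
            break
    return out
```

Run that on a freshly generated image and you’ll often see the full prompt; run it on the same image after it’s been re-encoded by a social platform and the dict usually comes back empty, which is the stripping described above.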
That's fine for looking up cat pictures or porn, but many people are searching for information contained in images, and that is a problem. What if you were looking for a graph, a map, a blueprint, etc.? How do you discern the real from the fake? What if you click through and the image seems to come from a legit source, but that source is itself AI-generated?
You’re missing the point: How would a search engine discern the real from the fake?