I don't get how they're allowed to do this in the first place.
People are using LLMs instead of visiting websites, reading books, or watching videos. And lots of people are PAYING AI companies for this.
It's such a clear case of copyright infringement, and it's leading to countless losses for creators.
I think part of the issue is that it's relatively new; new things don't have laws written about them and haven't been tried in court. So until one of the copyright holders wants to push the issue, it's sort of like "well, maybe it's illegal, maybe it's not."
And of course the copyright holders just make deals so that they get paid and move on with life (Disney).
Our society is structured in a way that leads anyone who could disallow it to not do so.
No. It is not clear. I read books and train myself from them, and then teach others for money. That's legal... Obviously computers are not humans, but the parallel is there. So it's not clear what the law is or ought to be.
Interesting point. But you didn't steal all those books. I think that's the problem. If you couldn't access a book you might pirate it, but these guys pirated them all: all the video, all the books, all the art, all the photos, all the conversations. They used copyrighted works to train their models.
This seems like the key point. A teacher buying a textbook to share its contents with others is using the content as intended. There's clearly no theft there. If the creators of all this content had genuinely intended it to be used this way, then there would be no problem. But the vast majority of artists/authors/creators seem to be against the use of their work like this (perhaps given appropriate compensation they could have been brought on side?)
This logic appeals to me, but I'm curious how it could work legally, as well as what the potential side effects would be. It seems likely that legal arguments would ensue over the intended use of content, and it doesn't seem like it should be illegal to use a created work in a new or unintended manner.
I think the overall goals are to encourage creative and academic work (which requires funding creators), discourage centralization of knowledge (to prevent leverage over and manipulation of the populace), encourage distrust of LLM output that lacks source references, and discourage overuse of generative AI. I'm sure there are more, but that's what comes to mind.