Then let it be over then.
Good. I hope this is what happens.
- LLM algorithms can be maintained and sold to corpos so they can scrape their own data and use the models for in-house tools, or re-sell them to their own clients.
- Open source LLMs can be made available for end users to do the same with their own data, or to scrape what's available in the public domain for whatever they want, so long as they don't re-sell it.
- Altman can go fuck himself
But if you stop me from criming, how will I get better at crime!?!
No amigo, it's not fair if you're profiting from it in the long run.
This is basically a veiled admission that OpenAI are falling behind in the very arms race they started. Good, fuck Altman. We need less ultra-corpo tech bro bullshit in prevailing technology.
To be fair, copyright is a disease. But then so are billionaires, capitalism, business, etc.
I mean, if there's a war, and you shoot somebody, does that make you bad?
Yes and no.
These fuckers are the first ones to send tons of lawyers whenever you republish or use any of their IP. Fuck these idiots.
Good.
I have conflicting feelings about this whole thing. If you are selling the result of training like OpenAI does (and every other company), then I feel like it’s absolutely and clearly not fair use. It’s just theft with extra steps.
On the other hand, what about open source projects and individuals who aren’t selling or competing with the owners of the training material? I feel like that would be fair use.
What keeps me up at night is if training is never fair use, then the natural result is that AI becomes monopolized by big companies with deep pockets who can pay for an infinite amount of random content licensing, and then we are all forever at their mercy for this entire branch of technology.
The practical, socioeconomic, and ethical considerations are really complex, but all I ever see discussed are these hard-line binary stances that would only have awful corporate-empowering consequences, either because they can steal content freely or because they are the only ones that will have the resources to control the technology.
Fuck these psychos. They should pay for the copyrighted work they stole with the billions they already made. Governments should protect people, MDF
TLDR: "we should be able to steal other people's work, or we'll go crying to daddy Trump. But DeepSeek shouldn't be able to steal from the stuff we stole, because China and open source"
If giant megacorporations can benefit by ignoring copyright, us mortals should be able to as well.
Until then, you have the public domain to train on. If you don't want AI to talk like the 1920s, you shouldn't have extended copyright and robbed society of a robust public domain.
Either we can now have full authority to do anything we want with copyright, or the companies have to abide by the same rules the plebs and serfs do and only take from media from a century ago, or stuff that fell through the cracks like Night of the Living Dead.
Copyright has always been a farce and a lie for the corporations, so it's nothing new that its "Do as I say, not as I do."
I think the answer is there: just do what DeepSeek did.
At the end of the day, the fact that OpenAI lost their collective shit when a Chinese company used their data and model to make their own more efficient model is all the proof I need that they don't care about being fair or equitable. They get mad at people doing the exact thing they did, and would aggressively oppose others using their work to advance their own.
Good. If I ever published anything, I would absolutely not want it to be pirated by AI so some asshole can plagiarize it later down the line and not even cite their sources.
Oh no, not the plagiarizing machine! How are rich hacks going to feign talent now? Pay an artist for it?! Crazy!
Sounds fair, shut it down.
Why is training OpenAI on literally millions of copyrighted works fair use, but me downloading one episode of a series that isn't available on any platform means years of prison?
I don't wanna be mean, but I always thought this guy had a weird face.
Good. Fuck AI
If I had to pay tuition for my education (buying textbooks, paying for classes and stuff), then you have to pay me to train your stupid AI on my materials.
Fuck OpenAI for stealing the hard work of millions of people
“The plagiarism machine will break without more things to plagiarize.”
Do you promise?!?!
As an artist, kindly get fucked, asshole. I'd like compensation for all the work of mine you stole.
But I can't pirate copyrighted materials to "train" my own real intelligence.
Vote pirate party.
Maybe as a consumer product, but governments will still want it.
"We can't succeed without breaking the law. We can't succeed without operating unethically."
I'm so sick of this bullshit. They pretend to love a free market until it's not in their favor and then they ask us to bend over backwards for them.
Too many people think they're superior. Which is ironic, because they're also the ones asking for handouts and rule bending. If you were superior, you wouldn't need all the unethical things that you're asking for.
That's a good litmus test. If asking/paying artists to train your AI destroys your business model, maybe you're the arsehole. ;)
I wonder if there's some validity to what OpenAI is saying though (but I certainly don't completely agree with them).
If the US makes it too costly to train AI models, then maybe China will relax its copyright laws so that Chinese AI models can be trained quickly and cheaply. This might result in China developing better AI models than the US.
Maybe the US should require AI companies to pay a large chunk of their profits to copyright holders. So copyright holders would be compensated, but an AI company would only have to pay if they generate profits.
Maybe someone more knowledgeable in this field will tell me I'm totally wrong.
Good, end this AI bullshit, it has few upsides and a metric fuckton of downsides for the common man.
The only way this would be OK is if OpenAI were actually open. Make the entire damn thing free and open source, and most of the complaints will go away.