this post was submitted on 05 Mar 2026
54 points (96.6% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.


For those not familiar with Mark Pilgrim: he is/was a prolific author, blogger, and hacker who abruptly disappeared from the internet in 2011.

cross-posted from: https://lemmy.bestiver.se/post/968527

HN comments

top 8 comments
[–] theunknownmuncher@lemmy.world 28 points 4 days ago (1 children)

US courts have already ruled that AI-generated works cannot be copyrighted, so yeah, if you vibecode something, it is not yours to license.

[–] hendrik@palaver.p3x.de 5 points 4 days ago (1 children)

Yeah. Relicensing it to MIT is really weird. It should either be public domain or LGPL 😁

[–] TootGuitar@sh.itjust.works 12 points 4 days ago (1 children)

It can’t be public domain by definition. The original work was licensed, and its license (LGPL) requires derivative works to be licensed as LGPL. There is no other choice.

[–] hendrik@palaver.p3x.de 1 point 4 days ago

Yes. I just followed their argument to the logical conclusion, if it were true. Which it probably isn't.

[–] flamingos@feddit.uk 2 points 4 days ago

Trust the Flask guy to be in favour of this.

[–] s38b35M5@lemmy.world 2 points 4 days ago

That was a fun scroll. Will be interesting to watch this play out.

[–] Grimy@lemmy.world -5 points 4 days ago (1 children)

I think we may be mixing two different questions here.

The legal question is not whether an AI model might have seen the original code during training. The relevant question is whether the new implementation contains protected expression from the LGPL codebase.

At the moment, the available evidence points the other way:

reported similarity is extremely low (~0.04% average / ~1.29% max)

module structure and APIs differ

the detection pipeline appears to have been reimplemented

Solving the same problem (encoding detection) does not by itself make a work derivative.
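Similarity percentages like the ones quoted above are typically produced by diffing the two codebases file by file. A minimal sketch of the idea using Python's standard `difflib` (the two snippets being compared are invented examples, not code from either project):

```python
# Hypothetical sketch of how a code-similarity percentage might be computed.
# The two snippets below are made-up stand-ins, not code from either repo.
from difflib import SequenceMatcher

original = """def detect(data):
    if data.startswith(BOM_UTF8):
        return 'utf-8-sig'
    return 'utf-8'
"""

rewrite = """def guess_encoding(payload):
    # restructured implementation with a different API
    bom = payload[:3]
    if bom == BOM_UTF8:
        return 'utf-8-sig'
    return None
"""

# ratio() returns a value in [0, 1]; multiply by 100 for a percentage.
ratio = SequenceMatcher(None, original, rewrite).ratio()
print(f"similarity: {ratio:.2%}")
```

Averaging such per-file ratios across a whole repository would give an aggregate figure like the ~0.04% cited, though the actual tool used to produce that number isn't named in the thread.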

Seems kind of fair. Coding has always been a bit of a wild west; are we going to start copyrighting concepts? The original repo wasn't used during the rewrite either.
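The shared problem here is byte-level encoding detection, and any two authors tackling it independently will converge on the same obvious heuristics (check for a BOM, try strict decoding, fall back). A toy sketch of that convergence, hypothetical and not taken from either project:

```python
# Toy encoding detector: two independent implementations of this problem
# would naturally share this shape without copying each other's code.
def naive_detect(data: bytes) -> str:
    """Guess a text encoding from raw bytes using only obvious heuristics."""
    if data.startswith(b"\xef\xbb\xbf"):  # UTF-8 byte-order mark
        return "utf-8-sig"
    try:
        data.decode("utf-8")  # strict decode: fails on invalid sequences
        return "utf-8"
    except UnicodeDecodeError:
        # Latin-1 maps every possible byte, so it always works as a fallback.
        return "latin-1"

print(naive_detect("héllo".encode("utf-8")))    # utf-8
print(naive_detect("héllo".encode("latin-1")))  # latin-1
```

Real detectors add statistical frequency models on top, but the point stands: the problem itself, not the expression, dictates much of the structure.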

I also prefer MIT to copyleft in any case, and I don't think reverse engineering something and rewriting it is bad either. I don't dig the constant copyright bootlicking.

[–] dogs0n@sh.itjust.works 4 points 3 days ago

If you truly enjoy free software and the right to modify it, then you enjoy copyleft.

MIT is how you have your work "stolen" by people who hate freedoms and want to lock everything down.

I really think everyone using copyleft licenses is the correct future. MIT licensing has become very popular, but I don't think it should be.