this post was submitted on 23 Oct 2025
64 points (98.5% liked)
technology
24062 readers
196 users here now
On the road to fully automated luxury gay space communism.
Spreading Linux propaganda since 2020
- Ways to run Microsoft/Adobe and more on Linux
- The Ultimate FOSS Guide For Android
- Great libre software on Windows
- Hey you, the lib still using Chrome. Read this post!
Rules:
- 1. Obviously abide by the sitewide code of conduct. Bigotry will be met with an immediate ban
- 2. This community is about technology. Offtopic is permitted as long as it is kept in the comment sections
- 3. Although this is not /c/libre, FOSS related posting is tolerated, and even welcome in the case of effort posts
- 4. We believe technology should be liberating. As such, avoid promoting proprietary and/or bourgeois technology
- 5. Explanatory posts to correct the potential mistakes a comrade made in a post of their own are allowed, as long as they remain respectful
- 6. No crypto (Bitcoin, NFT, etc.) speculation, unless it is purely informative and not too cringe
- 7. Absolutely no tech bro shit. If you have a good opinion of Silicon Valley billionaires please manifest yourself so we can ban you.
founded 5 years ago
you are viewing a single comment's thread
view the rest of the comments
very silly to be upset about this policy
the code is going to be held to the same standards as always, so it's not like they're going to be blindly adding slop
you can't stop people from using LLMs anyway, and how would you even know the difference? so formalizing the process allows for better accountability
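For anyone wondering what "formalizing the process" could look like in practice, a minimal sketch is a disclosure trailer on the commit itself. The trailer name `Assisted-by:` and the tool name here are purely illustrative assumptions, not the project's actual required tag:

```shell
# Hypothetical sketch: disclosing LLM assistance as a commit trailer.
# "Assisted-by:" and "ExampleLLM" are illustrative, not a real policy.
git commit -s \
  -m "fix: guard against NULL path in config parser" \
  -m "Assisted-by: ExampleLLM (output reviewed and tested by submitter)"

# Reviewers or auditors could then filter disclosed commits:
git log --grep="Assisted-by:" --oneline
```

The upside of a trailer over an honor-system rule is that it is machine-searchable, so accountability survives after the fact.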
Yeah, AI-enthusiastic coders are out there, and you can't just pretend that nobody is ever going to submit AI-generated code and try to pass it off as their own. This seems like a pretty reasonable way of ensuring accountability and, hopefully, disclosure.
Obviously I'd prefer that nobody submits vibe code but good luck stopping them all.
steamed hams
a good time was had by all
I think having a policy that forces disclosure of LLM code is important. It's also important to establish that AI code should only ever be allowed to exist in userland/ring 3. If you can't hold the author accountable, the code should not have any permissions or be packaged with the OS.
I can maybe see using an LLM for basic triaging of issues, but I also fear that adding that system will lead to people placing more trust in it than they should have.
but you can hold the author accountable, that's what OP's quoted text is about.
I know, that was me just directly voicing that opinion. I do still think that AI code should not be allowed in anything that even remotely needs security.
Even if they can still be held accountable, I don't think it's a good idea to allow something that is known to hallucinate believable code to write important code. Just makes everything a nightmare to debug.
The AI-hate crowd is getting increasingly nonsensical. I see it like any other software: the more open it is, the better.
Me, getting links to AI-generated wikis where nearly all the information is wrong. But of course I'm overreacting, because AI is only used by real professionals who are already experts in their domain. I just need to wait 5 years and it'll probably be only half wrong.
It goes through the same validation process; criticize it case by case.
Sure this isn't true, but go off.
🙄 I see y'all's posts. I'm old, not insensible.
Old and wrong is possible.
Sure, it's just software. It's useless software that makes everything it touches worse, and it's being shoved into everything to prop up the massive bubble that all the tech companies have shoveled their money into, desperate for any actual use case to justify all their terrible 'investments.'
I'm still working on a colleague's AI generated module that used 2,000 lines of code to do something that could've been done in 500. Much productivity is being had by all.
If you wrote 4x the code it must be 4x as good!
But you think it's okay that the reviewers should waste their time reviewing slop?
I've had to spend so much time these last few months reviewing and refactoring garbage code, all because corporate whitelisted some LLMs. This group I've worked with for many years used to be really competent developers, but they've all gone blind to it. It's a tragedy.
Maybe you can't, but it's very obvious in many cases.
fantasy world where if they simply make rules where you're Not Allowed to submit LLM code nobody will
so not all cases? don't waste my time
Oh yeah, I forgot that it's smart to get rid of all rules and laws and such, simply because some folk will disregard it.
The thing is, it'll probably be fine for the end product, beyond the wave of codeslop that will hit the project once the shitty vibe coders hear the news. That's just more work for the volunteers, but you're right that it isn't really that different a policy in practice.
seems sensible.