this post was submitted on 24 Feb 2025
433 points (99.5% liked)
Gaming
The Lemmy.zip Gaming Community
By their definition of Gen AI, it's unclear to me if the label says anything about code. I'm not sure I would consider it "writing."
This might be a little off-topic, but I've noticed what seems to be a trend of anti-AI discourse ignoring programmers. Protect artists, writers, animators, actors, voice-actors... programmers, who? No idea if it's because they're partly to blame, or people are simply unaware code is also stolen by AI companies—still waiting on that GitHub Copilot lawsuit—but the end result appears to be a general lack of care about GenAI in coding.
I think it's because most programmers use and appreciate the tool. This might change once programmers start to blame gen AI for not having a job anymore.
And programmers retain complete control of the output - it's just a bit of text that you can adapt as needed. Same as looking up snippets from Stack Overflow. Programmers are used to finding some snippet, checking if it actually works, and then adapting it to the rest of their code, so it doesn't feel like introducing media they didn't create, but like a faster version of what everyone was already doing.
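To make that concrete, here's a toy sketch of the check-then-adapt workflow described above (the snippet and its name are hypothetical, just the kind of thing you might pull from Stack Overflow or a code assistant):

```python
# Hypothetical found/generated snippet: deduplicate a list
# while preserving the original order of first appearance.
def dedupe(items):
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

# The "check it actually works" step: quick sanity tests
# before adapting the snippet into the surrounding codebase.
assert dedupe([3, 1, 3, 2, 1]) == [3, 1, 2]
assert dedupe([]) == []
```

The point being: the snippet is never trusted as-is; it gets verified and reshaped, same as any copy-pasted code.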
I noticed a bad trend with my colleagues who use Copilot, ChatGPT, etc. They not only use them to write code, but also trust them with design decisions, which are often poor ones.
Another thing is that those people also hate working on existing code, claiming it is complicated and offering to write their own (which also ends up complicated) version of it. I suspect it's because Copilot doesn't help as much when code is more mature.
There remains a significant enclave that rejects it, but yeah, it's definitely smaller than equivalent groups in other mentioned professions. Hopefully things won't get that far. I think the tech is amazing, but it's an immense shame that so many of my/our peers don't give a flying fuck about ethics.
Reporting in.
Yup. Very much agreed here. There are some uses that are acceptable, but it's a bit hard to say that any are ethical, given the ethically bankrupt foundations of its training data.