It’s a machine. It doesn’t understand the concept of consent or anything else for that matter.
It still has to observe rules it doesn't understand. Like everybody else.
It can’t; it’s software that needs a governing body to dictate the rules.
The rules are in its code. It was not designed with ethics in mind; it was designed to steal IP, fool people into thinking it's AI, and be profitable for its creators. They wrote the rules, and they do not care about right or wrong unless it impacts their bottom line.
The issue is more that there aren't rules. Given that billions of parameters define how these models work, there isn't really a way to ensure they can't produce unwanted content.
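The closest thing you get is a filter bolted on outside the model. A rough sketch of what that looks like, purely illustrative, with a hypothetical `generate` standing in for a real model call and a made-up blocklist:

```python
# Illustrative only: a post-hoc guardrail that screens model output
# before it reaches the user. The "rule" lives entirely outside the
# model, written and maintained by a human.

BLOCKED_TOPICS = ["example_banned_topic"]  # curated by humans, hypothetical

def generate(prompt: str) -> str:
    # Stand-in for an actual model call; not a real API.
    return "model output for: " + prompt

def guarded_generate(prompt: str) -> str:
    reply = generate(prompt)
    # Crude keyword check: easy to evade, which is exactly the point.
    if any(topic in reply.lower() for topic in BLOCKED_TOPICS):
        return "[response withheld by policy]"
    return reply

print(guarded_generate("hello"))
```

The model itself guarantees nothing; the filter does all the work, and a filter this crude misses anything phrased differently.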
Then they should be banned and made illegal. If someone wants to run an LLM locally on their own machine, fine; they're paying the electric bill.
But these things should not be running remotely on the internet, where they do nothing but destroy our planet and our collective sanity.
That’s the point: there has to be a human in the loop who sets explicit guardrails.
No, the point is the humans are there, but they're the wrong kind of humans who make the wrong kind of guardrails.
...and since when is that an excuse?
It’s not an excuse; it doesn’t think or reason.
Unless the software owner sets the governing guardrails, it cannot act, present, or redact the way a human can.
Then the software owner needs to be put in prison.