this post was submitted on 07 Mar 2026
375 points (98.7% liked)

Technology

[–] TropicalDingdong@lemmy.world 0 points 8 hours ago (1 children)

Wikipedia, Google, chatgpt etc are not legal authorities or legal professionals.

Yes. And neither are LLMs or their derivatives.

The reason it's dangerous to get legal or health information from a chatbot is the same reason you wouldn't want to randomly trust reddit.

And yet people do, and we accept that as a necessary consequence of maintaining free speech as a principle.

The exact arguments being accepted in this thread are the same which led directly to crackdowns in Hungary, China, and Russia.

If you are okay with limiting and regulating LLMs as a form of speech, I promise it's your speech which will end up limited, and a very small number of companies will control all speech on the internet. You should stop.

[–] deliriousdreams@fedia.io 1 points 8 hours ago (1 children)

Whose speech is being limited by limiting LLMs? An LLM's speech cannot be infringed, because an LLM doesn't have basic rights the way that a human does.

So what you're saying is that you don't want these companies to be held to any legal standard for the information they output (which is different from reddit, because under Section 230 the companies can't be held responsible in the US for what their users write).

The chatbot's output is the product of the company's data set, and somehow you're saying the company can't be held responsible for that output, even if it's dangerous, because that would curtail free speech?

That's such an interesting take.

[–] TropicalDingdong@lemmy.world 1 points 7 hours ago (1 children)

I'm gaming out the realistic consequences of what a law like this will mean. Whether you approve of these companies or not has nothing whatsoever to do with understanding the consequences if a law like this passes. You don't get to pick and choose whether it's the speech of an LLM, a company, or an individual that gets limited. There is no difference from a legal perspective.

And this law, and this approach of limiting speech to "protect people" from the stupid consequences of their own actions, aren't new. We already know the consequences: large corporate entities will just get around them or pay an inconsequential fine, and individuals will have their rights curtailed as a result.

The entire thread here is falling for an incredibly obvious astroturfing campaign, because people associate LLMs with big bad corporations and the real consequences these bad companies have wreaked. But limiting free speech on the internet won't stop them; what it will stop is our ability to communicate and resist them.

[–] deliriousdreams@fedia.io 0 points 4 hours ago

You appear to have gone completely around the twist.

You haven't shown a logical progression of anything you claim. You don't point to any current legal precedent, clearly aren't paying attention to the actual wording being used to draft this bill/law proposal, and are spreading what amounts to FUD.

About the only truthful, logical statement you've made is that it's not about whether you like or dislike these companies.

Companies are considered lawful entities with rights. The Supreme Court literally just ruled that LLMs do not count as the same kind of legal entity, because if they did, they'd be able to copyright their "work". So I really do question how you think we go from that to "nobody has free speech because the LLM can't give legal advice".

Speech that causes harm has pretty much never been a protected form of speech in the US, even if I were to humor you and assume that an LLM could have the rights to it.

And you mean the "bad these companies have wrought".