this post was submitted on 23 Jan 2026
492 points (97.9% liked)
[–] voracitude@lemmy.world 79 points 2 days ago* (last edited 2 days ago) (13 children)

Limited liability companies are "limited" in the sense that there are limits on what the members of the corporation can be held responsible for. The CEO can't automatically be held personally liable for the actions of the company, for example; their underlings could have been responsible and kept the leader in the dark.

However, there's this interesting legal doctrine whereby it is possible to "pierce the corporate veil" and hold corporate leadership personally liable for illegal actions their company took, if you can show that by all reasonable standards they must have known, or should have known, about the illegal activity.

Anyway Elon has been elbow-deep in the inner workings of Xitter for years now, by his own admission, right? Really getting in there to tinker and build new stuff, like Grok and its image generation tools. Seems like he knows an awful lot about how that works. An awful lot.

[–] bluGill@fedia.io 4 points 2 days ago

That is a tricky question. It isn't just whether the CEO knew, but whether the CEO should have known. If you make a machine that injures people, the courts ask whether you should have expected that.

The first time someone used a lawnmower to cut a hedge and got hurt, the company could say "we never expected someone to be that stupid" - but we now know people do such stupid things, so if you make a lawn mower and someone uses it to cut a hedge, the courts will ask why you didn't stop them. The response then is "we can't think of a way to stop them, but look at the warnings we put on it".

When Grok was first used to make porn, X could get by with "we didn't think of that". However, this is now a known problem, so they need to do more to stop it. There are a number of options. Best is to fix Grok so it can't do that; they could also collect enough information on users that when it happens the police can arrest the person who instructed Grok. There are a number of other options; whether a court accepts them depends on whether the tool is otherwise useful and whether whatever they do reduces the amount of porn (or whatever evil) that gets through - perfection isn't needed, but it needs to get close.
