this post was submitted on 25 Feb 2026
108 points (99.1% liked)

news


A lightweight news hub to help decentralize the fediverse load: mirror and discuss headlines here so the giant instance communities aren’t a single choke-point.

Rules:

  1. Recent news articles only (past 30 days).
  2. Title must match the headline or neutrally describe the content.
  3. Avoid duplicates & spam (search before posting; batch minor updates).
  4. Be civil; no hate or personal attacks.
  5. No link shorteners.
  6. No entire article in the post body.

founded 7 months ago
[–] EndlessNightmare@reddthat.com 8 points 1 month ago (1 children)

Malicious compliance: don't prohibit its use, just make it shitty and useless for the task

[–] pivot_root@lemmy.world 2 points 1 month ago (1 children)

Maliciously complying by sabotaging a tool that will be used for military purposes, under an overly zealous admin that loves to accuse people of treason. A bold strategy.

[–] EndlessNightmare@reddthat.com 1 points 1 month ago* (last edited 1 month ago) (1 children)

They aren't marketing or selling it for the task. If someone intentionally chooses to misuse a tool after being told not to use it that way, the results are not always favorable.

They can even clearly state that the tool isn't designed for that / doesn't meet the specifications, while not outright prohibiting its use. A disclaimer to the effect of "this tool is not certified or tested for such use, and we cannot guarantee results." That could even be the reason they want to bar it from such use: they don't want the liability.

Edit: It is very common for companies to remove features or limit capabilities to prevent misuse, including the legal risks that come with a product being used beyond its design specifications.

[–] pivot_root@lemmy.world 1 points 1 month ago (1 children)

I get what you're saying, but I think that's giving them too much credit.

The logic that "an AI not trained on a thing won't be good at doing that thing" is obvious. But the people who care more about feelings than facts would see it as "the AI works fine for everyone else but not for me" and react emotionally instead.

[–] EndlessNightmare@reddthat.com 1 points 1 month ago* (last edited 1 month ago)

I mean that's already true for a whole bunch of things.

The military has stringent specifications due to functional and security requirements. They aren't buying consumer-grade vehicles, tools, electronics, weapons, etc. There was recently a huge stink about Signal (a consumer-grade chat app) being used.

This would be putting a lot of demands on the company and imposing significant liability.

If the military wants a technology, there is already a process for developing it. This is a major departure from that process.