this post was submitted on 25 Feb 2026
108 points (99.1% liked)

news


A lightweight news hub to help decentralize the fediverse load: mirror and discuss headlines here so the giant instance communities aren’t a single choke-point.

Rules:

  1. Recent news articles only (past 30 days)
  2. Title must match the headline or neutrally describe the content
  3. Avoid duplicates and spam (search before posting; batch minor updates)
  4. Be civil; no hate or personal attacks
  5. No link shorteners
  6. No entire article in the post body

[–] pivot_root@lemmy.world 1 points 1 month ago (1 children)

I get what you're saying, but I think that's giving them too much credit.

The logic that "an AI not trained on a thing isn't good at doing that thing" is obvious. But people who care more about feelings than facts would instead see it as "AI works fine for everyone else but not for me" and react emotionally.

[–] EndlessNightmare@reddthat.com 1 points 1 month ago* (last edited 1 month ago)

I mean, that's already true for a whole bunch of things.

The military has stringent specifications due to functional and security requirements. It isn't buying consumer-grade vehicles, tools, electronics, weapons, etc. There was recently a huge stink about Signal (a consumer-grade chat app) being used.

This would put a lot of demands on the company and expose it to significant liability.

If the military wants a technology, there is already a process for developing it. This would be a major departure from that process.