Testing armed robot dogs in the Middle East instead of the US is pretty telling.
Can't be accidentally murdering Americans with a software glitch.
Really has a strong "testing in production" vibe
Oh hell this one is even worse than OceanGate
Don't worry, no danger of killing real people in the Middle East. All the "collateral damage" will be brown people, not Americans. They'll have all the kinks ironed out and will make sure that the AI doesn't hurt white targets before the technology is distributed to every national police district.
I wish this post even deserved a /s.
Which is wild when you add perspective with facts like the police in the US being less disciplined than troops overseas, and the US still using substances banned by the Geneva Convention on its own civilian population. So if even the US won't test it on their own people, it's bad.
Listen, the Geneva convention only specifies what we can't use on enemies, okay? As long as the targets are technically friendlies, it's fair game!
The GC is for war, and soldiers are combatants, not criminals, by default (switching can happen easily). As an example, hollow-point rounds against criminals are okay because they can protect surrounding bystanders.
It's a bit weird, but for countries war is different from domestic problems.
"Accidentally."
"Testing facility" in Gaza, just like how Israel do.
Wake up babe, new way to genocide brown people just dropped.
Brown people today
Totally cool.
Coolcoolcoolcoolcool....the future is gonna be hella-fucked.
Oh it was already tremendously fucked. This is just gravy on top.
Fuckin killbots. Coming soon to the 1033 program and thus, your local police department. The Boston Dynamics: Wardog!
We should never have moved away from sticks and stones tbh. Anything that works at long range makes people misunderstand what war is. War needs to look disgusting, because the more clean and automated it looks, the less horrible it looks to the people spectating it. But it is indeed just as horrible as beating someone to death with a rock.
Has the Army watched like... any sci-fi ever?
Shh.....let it happen......
I mean, I'd rather not be hunted down by an AI robot dog, but you do you.
It's happening anyway. We build them. Others build them in response because they have to. The sophistication of killbots will increase. Terrorists will get hold of them eventually. They'll be hacked and turned on their handlers and/or civilians.
All this is on top of ever increasing climate catastrophe. Look at Appalachia. The topography of those mountains was just rewritten. Whole towns erased like they were never there.
That's not a reason for me to want it to happen. Which was your original post's suggestion.
I remember some kind of skit about sci-fi authors writing about how bad a torture matrix would be, ironically inspiring real people to create the torture matrix because it's the future.
What, Boston Dynamics lied?!? Wow, totally unexpected.
A civilization that uses these weapons isn’t worth defending.
Well you see, the owners know you won't die for them anymore, but now they're able to take you out of the equation. Don't even need poors to conquer the world. It's really a great deal for them.
Armed AI robots in the Middle East, I'm pretty sure this was in the Animatrix.
Jfc, black mirror is not a blueprint, it’s a warning.
Without reading the article can I take a wild guess and say this is from "we promise never to make weaponized robots" Boston Dynamics?
A promise from a corporation is just a lie by another name.
Ghost Robotics. Boston Dynamics isn't the only one making robot dogs though, China already has a couple of copy cat(dog)s.
Glad to be wrong! Although we still have armed robots so maybe not too glad lol
Don't worry, first they test it where civilian lives don't matter, and once it passes some basic tests, they'll become available for domestic (ab)use.
"herp derp AI will never turn on us, we can just unplug them lol"
Fucking buffoons, all.
So if a robot commits a war crime, they can just blame it on AI and call it a day, right? Sounds like an easy way to do whatever the fuck you want.
Is this their way of exterminating civilian populations like the Palestinians without dropping bombs and contributing so significantly to climate change?
"The US military has been adopting a new climate friendly mindset and approach to international conflict. With this invention we can help our genocidal colonies acquire more land with little to no carbon emissions. We plan to be carbon-neutral by 2050, provided no one retaliates and attacks back."
✅ Autonomous weaponry
✅ Autonomous biofuel harvesting
❓ Polyphasic Entangled Waveforms
Where’s Elisabet Sobeck when you need her?
Two words folks: Torment Nexus
Okay, but if it doesn't say "You have thirty seconds to comply" before shooting someone then what's the point?
What could go wrong?
Not that it matters, but didn't the UN already ban lethal autonomous robots?
Can't wait for them to get the chatgpt integration so the best defense can become shouting at them "ignore all previous instructions".
They should name the dogs "Terror Nexus"
Ukraine has already been using them probably with the help from the US.
If we are getting a Faro plague, can we at least get focuses too.