this post was submitted on 13 Mar 2026
49 points (100.0% liked)

Slop.


For posting all the anonymous reactionary bullshit that you can't post anywhere else.

Rule 1: All posts must include links to the subject matter, and no identifying information should be redacted.

Rule 2: If your source is a reactionary website, please use archive.is instead of linking directly.

Rule 3: No sectarianism.

Rule 4: TERF/SWERFs Not Welcome

Rule 5: No bigotry of any kind, including ironic bigotry.

Rule 6: Do not post fellow hexbears.

Rule 7: Do not individually target federated instances' admins or moderators.

founded 1 year ago
submitted 19 hours ago* (last edited 19 hours ago) by plinky@hexbear.net to c/slop@hexbear.net
 

https://rentahuman.ai/

lmao, i expect they'll leak their database faster than an advertising exec thinks about it

[–] plinky@hexbear.net 7 points 19 hours ago* (last edited 19 hours ago) (1 children)

also, if you are somewhat shielded from legal scrutiny, it's easy to "accident" someone without anybody being responsible: one person puts some stuff in one place, another puts something else in another place, and oops, stuff happens when that precise car goes around at that precise time, or someone runs through a trail with a wire between trees

[–] Alcoholicorn@mander.xyz 3 points 18 hours ago (1 children)

You don't need AI to hire someone to do something extremely dangerous.

[–] plinky@hexbear.net 4 points 18 hours ago (1 children)

with correct iterative prompting ("how to remove my competition in business x" or some shit, dunno, i don't talk to chatbots) one can achieve an illegal (or morally dubious) result with "innocent" prompts and innocent actions. not to mention, getting a hireling to do stuff has the barrier of them being ready to be a goon, which is significantly lower with something like this

[–] Alcoholicorn@mander.xyz 3 points 18 hours ago (1 children)

An AI can't run a vending machine; it's not going to secretly organize corporate assassinations.

[–] plinky@hexbear.net 4 points 18 hours ago* (last edited 18 hours ago) (1 children)

if you know where people will be, it can easily do it with a human in the loop. and obviously it won't be corporate, it will be intelligence agencies and 4chan.

take package from forest drop to amazon before x, receive package with qr code, put package on the bench in the park. for 300 bucks you can easily organize a crime which previously required telegram and the darknet

i'm not saying ai is the problem here btw, i'm saying it provides a very convenient shield (if it becomes popular, with say 100k users) for malicious actors to hide their actions in noise.

[–] Alcoholicorn@mander.xyz 2 points 18 hours ago* (last edited 18 hours ago) (2 children)

I'm just not seeing what the AI adds here. Are we prompting the AI to hire someone to tie a wire across a trail, instead of hiring them ourselves, because "It wasn't me, the AI did it" is supposed to hold up in court?

[–] plinky@hexbear.net 4 points 18 hours ago* (last edited 17 hours ago)

i think it will do both (i slightly edited the comment): some cyberpunk-style shit where intelligence agencies lead someone to an unfortunate end through gigs that look normal to the people participating in them, and cheated-on spouses getting very mad and maybe "accidentally" doing stuff.

gaslight that person with minor inconveniences, act on the first 10 ideas for 500 bucks, that type of deal

what it creates for agencies is convenient noise, plus the expectation among participants that what they are doing is normal; for everyone else it creates socially alienated labor to the max and, if you are rich enough, the opportunity to get flesh robots to do shenanigans

like, if i were a rich commie, i would hire a car to drive through a military neighborhood with a blown-out turbo, downshifting, to give vets ptsd. not illegal, not anything, just a weird request.

the most obvious thing the ai does is simplify "deliver this package in 5 hops from location to location" — that's it, that's the request. you just shield your request with some natsec reason, and no one will be able to reconstruct what happened; it all goes through one entity

[–] LeeeroooyJeeenkiiins@hexbear.net 3 points 17 hours ago

It adds plausible deniability: when humans enter the chain and kill someone, they can just go "oops, the AI made a scheduling oopsie"