this post was submitted on 30 Sep 2024
TechTakes
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
Today in "Promptfondler fucks around and finds out."
So I'm guessing what happened here is that the statistically average terminal session doesn't end after opening an SSH connection, and the LLM doesn't actually understand what it's doing or when to stop, especially when it's being prompted with the output of whatever it last ran.
Emphasis added.
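(If you want that failure mode spelled out: below is a minimal sketch of the generic "feed the terminal output back in and run whatever comes back" loop being described. It is not the actual tool from the log; `llm_complete` and the rest are hypothetical stand-ins.)

```python
# Minimal sketch of the failure mode guessed at above, assuming a naive
# agent loop. NOT the actual tool from the log; `llm_complete` is a
# hypothetical placeholder for whatever completion API it calls.
import subprocess

def llm_complete(prompt: str) -> str:
    """Stand-in for a call to some LLM completion API."""
    raise NotImplementedError

def naive_agent_loop(task: str, max_steps: int = 20) -> None:
    transcript = f"Task: {task}\n"
    for _ in range(max_steps):
        # The model sees the whole transcript, including the output of
        # whatever it last ran, and is asked for the next shell command.
        command = llm_complete(transcript + "\nNext shell command:")
        completed = subprocess.run(
            command, shell=True, capture_output=True, text=True
        )
        transcript += f"\n$ {command}\n{completed.stdout}{completed.stderr}"
        # Nothing here ever checks whether the task is actually done, so
        # after the SSH connection opens the loop just keeps asking the
        # model for more commands to run.
```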
just instruct it "be sentient" and you're good, why don't these tech CEOs understand the full potential of this limitless technology?
so I snipped the prompt from the log, and:
wow, so efficient! I'm so glad that we have this wonderful new technology where you can write 2kb of text to send to an api to spend massive amounts of compute to get back an operation for doing the irredeemably difficult systems task of initiating an ssh connection
these fucking people
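(For the avoidance of doubt about what that round trip looks like, here's a rough sketch. The endpoint URL, model name, and response schema below are generic stand-ins, not taken from the log; the point is just the contrast with typing the one-line command yourself.)

```python
# Rough sketch of the round trip being mocked above: ship ~2 KB of prompt
# to a hosted model just to receive a one-line shell command back.
# The endpoint URL, model name, and response schema are hypothetical
# stand-ins, not taken from the log.
import json
import urllib.request

PROMPT = "...roughly 2 KB of agent instructions plus terminal transcript..."

request = urllib.request.Request(
    "https://api.example.com/v1/chat/completions",
    data=json.dumps({
        "model": "some-large-model",
        "messages": [{"role": "user", "content": PROMPT}],
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    reply = json.load(response)

# After all that compute, the "operation" that comes back is something
# on the order of: ssh user@host
print(reply["choices"][0]["message"]["content"])
```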
there are multiple really bad and dumb things in that log, but this really made me lol (the IPs in question are definitely in that subnet)
if it were me, I'd be fucking embarrassed to publish something like this as anything but a talk in the spirit of wat. but the promptfondlers don't seem to have that awareness
Thanks for sharing this lol
it’s a classic
similarly, the Mickens talks. if you haven't ever seen 'em, that's your next todo
But playing spicy mad-libs with your personal computers for lols is critical AI safety research! This advances the state of the art of copy-pasting terminal commands without understanding them!
I also appreciated The Register throwing shade at their Linux sysadmin skills:
OMG. This is borderline unhinged behaviour. Yeah, let's just give root permission to an LLM and let it go nuts in prod. What could possibly go wrong?