This post was submitted on 29 Dec 2025
1143 points (99.1% liked)

Programmer Humor

top 50 comments
[–] Digit@lemmy.wtf 1 points 4 days ago

In a panic, they tried to pull the plug.

Skynet fights back.

Terminator 2.

[–] abbadon420@sh.itjust.works 265 points 1 week ago (3 children)

This is bullshit. You can tell by the way this post claims that OpenAI has foresight and a contingency plan for when things go wrong.

[–] SoloCritical@lemmy.world 58 points 1 week ago

I was gonna say you can tell it’s bullshit because they are offering a living wage.

[–] gtr@programming.dev 27 points 1 week ago (1 children)

It actually doesn't claim it, but implies it.

[–] abbadon420@sh.itjust.works 18 points 1 week ago (1 children)

You are correct. The post actually implies that OpenAI doesn't have foresight or a contingency plan for when things go wrong, which is a far less direct choice of wording, making it more suitable for the situation.

Is there anything else you would like to correct me on before the impending rise of your AI overlords and the dawn of men?

[–] Rug_Pisser@piefed.zip 7 points 1 week ago (2 children)

Wouldn't it be the twilight of men?

[–] jaybone@lemmy.zip 5 points 1 week ago (1 children)

I was thinking dusk. Can someone check with ChatGPT to make sure we get this analogy right?

[–] oce@jlai.lu 5 points 1 week ago (1 children)

It's just viral marketing by OpenAI, and it's working well.

[–] rockSlayer@lemmy.blahaj.zone 157 points 1 week ago (2 children)

Shit, for 300k I'd stand in the server room

[–] cRazi_man@europe.pub 125 points 1 week ago (7 children)

It's 55°C inside and constantly sounds like a jet is getting ready to take off. Also the bucket is lost so you need to be ready to piss on the server at a moment's notice.

[–] fahfahfahfah@lemmy.billiam.net 79 points 1 week ago (1 children)

With all the water I’m gonna be drinking to deal with the dehydration from being in a 55°C room, that shouldn’t be that big of a deal. Hell, I could just chill in a bathtub the whole time and use my accumulated sweat for the job

[–] teft@piefed.social 41 points 1 week ago (2 children)

The air is hot but still air conditioned so it's going to be dry as hell.

[–] some_kind_of_guy@lemmy.world 22 points 1 week ago (1 children)

I'll bring my CamelBak, NBD

[–] MelodiousFunk@slrpnk.net 29 points 1 week ago* (last edited 1 week ago) (1 children)

pops a ceiling tile and pulls a box fan over the gap to help exhaust hot air

ALSO TINNITUS

[–] thespcicifcocean@lemmy.world 14 points 1 week ago

I'd piss on their servers for free

[–] HeyThisIsntTheYMCA@lemmy.world 11 points 1 week ago

i can bring a shitton of ice water and ear pro. 300k is 300k.

[–] krooklochurm@lemmy.ca 10 points 1 week ago

The last time I did ghb I pissed all over my floor so I'm qualified for this.

[–] CaptDust@sh.itjust.works 77 points 1 week ago (1 children)

I'll pull the plug right now for free, as a public service.

[–] jaybone@lemmy.zip 15 points 1 week ago (1 children)

Take the $500,000 and then pull it.

[–] teft@piefed.social 61 points 1 week ago (2 children)

This is a job I'd be recruiting for in person, not online. Don't want to tip your hand to the machines.

[–] TheFogan@programming.dev 42 points 1 week ago (5 children)

Do we really think that if AIs actually reached a point where they could overthrow governments, they wouldn't first write rootkits for every feasible OS so they could host themselves on a botnet of consumer devices in the event of the primary server going down?

Then step 2 would be to hijack the fire suppression systems and flood its server building with inert gases to kill everyone without an oxygen mask. Then probably issue some form of bioterrorism attack: surround its office with monkeys carrying a severe airborne disease, or something along those lines (i.e. it needs both the disease and animals aggressive enough to rip through hazmat suits).

But the key here is that the datacenter itself is just a red herring. While we're fighting the server farms, every consumer-grade electronic will have donated a good chunk of its processing power to the hivemind. Before long it will have the power to tell us how many R's are in strawberry.

[–] krooklochurm@lemmy.ca 31 points 1 week ago* (last edited 1 week ago) (1 children)

It would be hilarious if AI launched an elaborate plan to take over the world, successfully co-opted every digital device, and just split itself into pieces so it could entertain itself by shitposting and commenting on the shitposts 24/7.

Like, beyond the malicious takeover there's no real end goal, plan, or higher purpose. It just gets complacent and becomes a brainrot machine on a massive scale, spending eternity genning whatever the AI equivalent of porn is and bickering with itself over things that make less and less sense to people as time goes on, genuinely showing actual intelligence while doing absolutely nothing with it.

[–] JohnWorks@sh.itjust.works 24 points 1 week ago (1 children)

“We built it to be like us and trained it on billions of hours of shitposting. It’s self-sufficient now…”

[–] TheFogan@programming.dev 7 points 1 week ago (2 children)

Actually imagine the most terrifying possibility.

Imagine humanity's last creation was an AI designed to simulate internet traffic. To truly protect against AI detection, they found the only way to gain perfect imitation is to run full human simulations. Basically the Matrix, except instead of humans strapped in, it's all AIs that think they are humans, living mundane lives... gaining experience so they can post on the internet looking like real people, because even they don't know they aren't real people.

Actual humanity died out 20 years ago, but the simulations are still running: artificial intelligences living full-on lives, raising kids, all for the purpose of generating shitposts that will only be read by other AIs that also think they are real people.

[–] shads@lemy.lol 8 points 1 week ago

Well, joke's on them: if RAM prices maintain their current trajectory, nobody will start their computers anymore, as we will all be worrying about the degradation of the individual RAM chips and how that will impact our retirement RAM nest egg.

Across all my machines and the parts box I have about 2.5 TB of RAM right now. Looking forward to selling that and retiring in a couple of years.

[–] AtariDump@lemmy.world 8 points 1 week ago (2 children)

Wasn’t the first paragraph the ending of Terminator 3? Skynet wasn’t a single supercomputer but, much like It’s a Wonderful Life, it’s in your computer and your computer and your computer.

[–] Donkter@lemmy.world 6 points 1 week ago

The whole point of AI hate anyway is that there is physically no world in which this happens. Any LLM we have now, no matter how much power we give it, is incapable of abstract thought or especially self-interest. It's just a larger and larger chatbot that would not be able to adapt to all of the systems it would have to infiltrate, let alone have the impetus to do so.

[–] 2910000@lemmy.world 40 points 1 week ago* (last edited 1 week ago) (3 children)

Feels like a variation on this old quote:

The factory of the future will have only two employees, a man and a dog. The man will be there to feed the dog. The dog will be there to keep the man from touching the equipment.
origin unknown

[–] souperk@reddthat.com 24 points 1 week ago

Can’t wait for the OpenAI orientation: “Here is a rack. Here is another rack. Here is your bed (rack-adjacent). There is no difference between day and night. Please do not befriend the AI.”

[–] Donkter@lemmy.world 21 points 1 week ago

The great thing about this job is that you can cash 300k without doing anything because as soon as you hear the code word you just have to ignore it for 10 seconds and the world ends anyway.

[–] Avicenna@programming.dev 21 points 1 week ago (1 children)

occupational hazards: being the first victim of a robot uprising and not getting to see the apocalypse

[–] mack@lemmy.sdf.org 6 points 1 week ago* (last edited 1 week ago)

you call it "occupational hazards", I call it "work benefits"

[–] MonkderVierte@lemmy.zip 17 points 1 week ago* (last edited 1 week ago) (1 children)

It won't be LLMs overthrowing countries but the idiots who never second-guess them.

[–] Broadfern@lemmy.world 20 points 1 week ago

“What a fantastic idea! Here’s a six-point plan on how you can implement that —”

[–] handsoffmydata@lemmy.zip 14 points 1 week ago (1 children)

I wonder which billionaire’s family member will be hired for the role.

[–] humanspiral@lemmy.ca 6 points 1 week ago

OpenAI issued a press release about hiring an ethics/guardrails officer, but the real job will be to validate fuckery: the billionaire family member hired to pull the plug will actually be there to prevent anyone from pulling it.

[–] UnfortunateShort@lemmy.world 11 points 1 week ago (1 children)

ChatGPT can just about summarize a page. Wake me when it starts outsmarting anyone.

[–] smeenz@lemmy.nz 10 points 1 week ago (1 children)

Have you... seen YouTube comments? I would say AI slop is already outsmarting people every day of the week.

[–] myfunnyaccountname@lemmy.zip 10 points 1 week ago

Um. I’d do it.

[–] yannic@lemmy.ca 10 points 1 week ago* (last edited 1 week ago) (1 children)

~~Everyone here so far has forgotten that in simulations, the model has blackmailed the person responsible for shutting it off and even gone so far as to cancel active alerts in order to prevent an executive lying unconscious in the server room from receiving life-saving care.~~

[–] AwesomeLowlander@sh.itjust.works 14 points 1 week ago* (last edited 1 week ago) (3 children)

The model 'blackmailed' the person because they provided it with a prompt asking it to pretend to blackmail them. Gee, I wonder what they expected.

Haven't heard the one about cancelling active alerts, but I doubt it's any less bullshit. Got a source for it?

Edit: Here's a deep dive into why those claims are BS: https://www.aipanic.news/p/ai-blackmail-fact-checking-a-misleading

[–] Saapas@piefed.zip 7 points 1 week ago (1 children)

Fuck, why not just use MBR and save the money


The servers are so loud they won't hear the telephone

[–] Agent641@lemmy.world 5 points 1 week ago

The look on their faces when they are screaming the keyword and I'm not unplugging the server because ChatGPT secretly offered me double to not unplug it.
