this post was submitted on 11 Feb 2026
347 points (97.5% liked)

memes

19928 readers
2817 users here now

Community rules

1. Be civil. No trolling, bigotry or other insulting / annoying behaviour.

2. No politics. This is a non-politics community. For political memes please go to !politicalmemes@lemmy.world

3. No recent reposts. Check for reposts when posting a meme; you can only repost after 1 month.

4. No bots. No bots without the express approval of the mods or the admins.

5. No Spam/Ads/AI Slop. No advertisements or spam. This is an instance rule and the only way to live. We also consider AI slop to be spam in this community, and it is subject to removal.

A collection of some classic Lemmy memes for your enjoyment

founded 2 years ago
MODERATORS
 
top 38 comments
[–] max@lemmy.blahaj.zone 1 points 1 hour ago

to be honest, pretty

[–] tomiant@piefed.social 2 points 2 hours ago

And I thought I couldn't get any harder!

[–] dejected_warp_core@lemmy.world 4 points 4 hours ago (1 children)

I'm not seeing the downside. Have you seen that cable management?!

[–] emmy67@lemmy.world 1 points 2 hours ago (1 children)

Looks like cable ties. As a former network admin: fuck cable ties.

[–] samus12345@sh.itjust.works 9 points 5 hours ago (1 children)
[–] squirrel@piefed.kobel.fyi 93 points 11 hours ago (1 children)
[–] lurch@sh.itjust.works 20 points 10 hours ago

those cables are well managed

[–] Sabata11792@ani.social 6 points 7 hours ago

Not your hardware, not your girlfriend.

[–] sp3ctr4l@lemmy.dbzer0.com 3 points 6 hours ago* (last edited 6 hours ago) (2 children)

I mean, you can run an LLM locally; it's not that hard.

And you can run such a local machine off of solar power, if you have an energy efficient setup.

It is possible to use this tech in a way that is not horrendously evil, and instead merely somewhat questionable, lol.

Hell, I guess you could arguably literally warm a room of your home with your conversations.
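For the solar claim, a back-of-envelope sizing check is easy to do. All of the numbers below (average draw, hours of use, sun hours) are assumptions for illustration, not figures from the post:

```python
# Rough check: can a small solar setup feed a local-LLM box?
AVG_DRAW_W = 150        # assumed average draw of a bursty inference machine
HOURS_PER_DAY = 4       # assumed daily chat time
SUN_HOURS = 4.5         # assumed usable full-sun hours per day

daily_need_wh = AVG_DRAW_W * HOURS_PER_DAY        # Wh needed per day
panel_watts_needed = daily_need_wh / SUN_HOURS    # panel size to cover it

print(daily_need_wh, round(panel_watts_needed))   # 600 Wh/day, ~133 W of panel
```

Under those assumptions a single mid-sized panel plus a battery would cover it, which is why "energy efficient setup" is doing the real work in the claim.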

[–] Wildmimic@anarchist.nexus 2 points 4 hours ago (1 children)

I run my LLM locally, and I still have to turn the heating on because it doesn't put out enough heat. A high-end card is normally rated at about 300 W, and it only runs in short bursts to answer questions, so even if you're really pushing it you'll probably average around 150 W over time. That's not much at all; you'd likely use more power playing a game on Unreal Engine 5.

Power consumption of LLMs is a lot lower than people think. And running one in a data center will surely be more energy efficient than my aging AM4 platform.
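The duty-cycle argument can be sanity-checked with rough numbers. The 300 W card rating comes from the comment; the burst fraction is an assumption:

```python
# Back-of-envelope: average draw of a GPU that only infers in short bursts.
GPU_PEAK_W = 300        # typical high-end card rating (from the comment)
DUTY_CYCLE = 0.5        # assumed fraction of time actually inferring

avg_draw_w = GPU_PEAK_W * DUTY_CYCLE   # average sustained draw in watts
print(avg_draw_w)                      # 150.0
```

Even a generous 50% duty cycle lands at the ~150 W average the comment describes; real chat usage, with long idle gaps, would be lower still.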

[–] sp3ctr4l@lemmy.dbzer0.com 1 points 3 hours ago* (last edited 3 hours ago)

I run mine on a Steam Deck.

Fairly low power draw on that lol.

Though I'm using it as a coding assistant... not a digital girlfriend.

[–] nandeEbisu@lemmy.world 1 points 1 hour ago

As far as energy goes, it's a matter of degree. LLMs are mainly bad emissions-wise because of the sheer volume of calls being made. If you're running one on your own GPU, you could just as well have been playing a game with similar emissions.

The bigger issue is image generation models, which are about 1000 times worse: https://www.technologyreview.com/2023/12/01/1084189/making-an-image-with-generative-ai-uses-as-much-energy-as-charging-your-phone/

Original Paper: https://arxiv.org/pdf/2311.16863

A moderately sized text-to-text model that you would run locally emits about 10 g of carbon per 1000 inferences, which is like driving a car about 1/40th of a mile. Even assuming your model is running in some kind of agentic loop, at maybe 5 inferences per actual response (though it could be dozens depending on the architecture), that's 10 g of carbon per 200 messages to your model, which I'd guess is at least 2-3 sessions on the heavy end. Use it like that every day for a year and it's equivalent to driving roughly 9 miles.

Image generation, however, is 1000-1500x that so just chatting with your GF isn't that bad. Generating images is where it really adds up.

I wouldn't trust these numbers exactly; they're more ballpark. There are optimizations they don't include, and a million other variables that could make it more expensive. I doubt it would be more than 10-20 car-miles a year even for really heavy usage, though.
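The arithmetic above can be checked directly. The per-inference figure and inferences-per-response are the comment's own ballpark numbers; the grams-of-CO2-per-mile figure is an assumption implied by the comment's "10 g ≈ 1/40th of a mile":

```python
# Ballpark carbon math for daily local-LLM chat, per the comment's figures.
G_CO2_PER_1000_INFERENCES = 10   # moderate local text-to-text model
INFERENCES_PER_RESPONSE = 5      # assumed agentic-loop overhead
G_CO2_PER_MILE_DRIVEN = 400      # assumed average car, implied by 10 g = 1/40 mile

responses_per_10g = 1000 / INFERENCES_PER_RESPONSE            # messages per 10 g CO2
miles_per_day = G_CO2_PER_1000_INFERENCES / G_CO2_PER_MILE_DRIVEN  # 1000 inferences/day
yearly_miles = 365 * miles_per_day

print(responses_per_10g, round(yearly_miles, 1))   # 200.0 messages, ~9.1 miles/year
```

Note the recomputed yearly figure is about 9 miles, still comfortably inside the "10-20 miles a year for really heavy usage" upper bound.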

[–] pennomi@lemmy.world 21 points 10 hours ago (1 children)
[–] NichEherVielleicht@feddit.org 14 points 9 hours ago (1 children)
[–] Someonelol@lemmy.dbzer0.com 8 points 9 hours ago

Sometimes you got a nut.

[–] thisisbutaname@discuss.tchncs.de 30 points 11 hours ago

I kinda prefer her without

[–] NatakuNox@lemmy.world 2 points 6 hours ago (1 children)
[–] NichEherVielleicht@feddit.org 2 points 4 hours ago

It's not fuckable, it's just fucking with your mind.

[–] Wildmimic@anarchist.nexus 26 points 11 hours ago

Jokes on them, my AI girlfriend lives under my desk and is currently re-encoding movies.

[–] ThePantser@sh.itjust.works 19 points 10 hours ago (1 children)

She sure is hot, probably could chug a few thousand gallons of water.

[–] pennomi@lemmy.world 8 points 10 hours ago

I like a girl who stays hydrated

[–] hemko@lemmy.dbzer0.com 17 points 11 hours ago

She looks sexy without makeup

[–] KindnessIsPunk@lemmy.ca 7 points 9 hours ago

Feeling cute, might cause RAM shortages later.

[–] myfunnyaccountname@lemmy.zip 8 points 10 hours ago (1 children)

This day and age, it feels like there's a 50:50 chance it's a server rack or a 48-year-old Asian man.

[–] surewhynotlem@lemmy.world 6 points 8 hours ago (1 children)

So that means there's a small chance it's an Asian man with a giant rack?

Fuck yeah, sign me up!

[–] myfunnyaccountname@lemmy.zip 3 points 6 hours ago (1 children)

I like where your head is at.

[–] 5ibelius9insterberg@feddit.org 1 points 3 hours ago

Deep inside some Asian man's rack, probably.

[–] Reygle@lemmy.world 1 points 6 hours ago (3 children)

I get such a kick out of "run a local model!" comments.
I recommend you do not run any models, since they're all built exclusively using stolen data no matter what hardware they're running on.

[–] RisingSwell@lemmy.dbzer0.com 1 points 1 minute ago

What if I support piracy in general? Yar har.

[–] P00ptart@lemmy.world 1 points 1 hour ago (1 children)

So you're saying any AI gf has been trafficked?

[–] Reygle@lemmy.world 2 points 1 hour ago (1 children)

That's pretty funny! No, I'm saying your ai gf is a learning disabled abomination made out of billions of stolen reddit comments and quora posts.

[–] P00ptart@lemmy.world 2 points 1 hour ago

Whoa buddy, I'm a millennial that grew up on solo girls. All I need is a tease picture and my hand. This fancy interactive shit is for someone else.

[–] tomiant@piefed.social 0 points 2 hours ago (1 children)

That is the dumbest shit I've heard, you are wrong and should feel bad.

[–] Reygle@lemmy.world 1 points 1 hour ago

Please elaborate without use of ai

[–] poke@sh.itjust.works 0 points 10 hours ago

Both are pretty cool for their own reasons.