this post was submitted on 16 Mar 2026
186 points (97.9% liked)

Fuck AI

6380 readers
1328 users here now

"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.

founded 2 years ago

cross-posted from: https://lemmy.world/post/44340504

Our actions and voices do make a difference! Keep AI out of games and reward original creative work.

top 49 comments
[–] Kazumara@discuss.tchncs.de 3 points 12 hours ago

Someone should have heckled him off stage at that moment. We are all shocked and sad that a birdbrain like him holds any power in games publishing.

[–] bitjunkie@lemmy.world 4 points 15 hours ago

Because some other dipshit sold them on the idea that they'd be able to continue to make games people wanted to pay for without paying other people to make them. Cry me a river and I'll piss you a puddle.

[–] Aceticon@lemmy.dbzer0.com 4 points 19 hours ago

Greedy fucker "investors" selling their book is literally one of the greatest informational problems in the modern age. They'll do everything in their power to mislead others: plain old lying, appeals to emotion, buying traditional news media and turning them into propaganda outlets, and funding projects and even institutions to spread misinformation, purely to push up the profits of their "investments".

[–] MagnyusG@lemmy.world 105 points 1 day ago (2 children)

Major investor is 'shocked and sad' they're not seeing returns on AI.

[–] SaveTheTuaHawk@lemmy.ca 13 points 1 day ago (1 children)

Shocked and sad he is going to be broke.

[–] sukhmel@programming.dev 3 points 1 day ago

That is not very likely, I guess

[–] Ghostie@lemmy.zip 2 points 1 day ago

They are really disappointed in us.

[–] Dojan@pawb.social 19 points 1 day ago

Mm, few things get me as excited as investors being sad. Cry harder, baby.

[–] ShaggySnacks@lemmy.myserv.one 48 points 1 day ago (1 children)

Yeah, I'd say that's one of the reasons they don't like it! Others include the use of artists' work without consent, environmental issues, the quality of AI output, and the feeling that automating culture production can only result in what is now commonly called "AI slop".

That sums up perfectly why people hate AI in culture. AI can be very useful in science, medicine, engineering, and similar professions when it's built on a very specific data set. But there's no conscious reasoning behind why the AI did what it did when it makes art.

Generative AI is just slop. It takes previous works and repackages them however the code says. When people make art, there are hundreds of micro decisions that people make. Those micro decisions are gone when AI makes it. Gabi Belle did a great video on why they hate AI art. https://youtu.be/QtZDkgzjmQI

[–] INeedANewUserName@piefed.social 9 points 1 day ago (3 children)

AI is generally only considered useful in professions people aren't actually familiar with. In other words, in its current form, it isn't useful to actual experts in anything.

[–] LwL@lemmy.world 2 points 17 hours ago

I don't think they were talking about GenAI with that, and AI (aka ML models) built on specific data sets for a specific purpose can be quite useful. Expecting an LLM to do anything other than language processing well, on the other hand, is insanity.

[–] webadict@lemmy.world 18 points 1 day ago (4 children)

"Generative AI is great at doing everything I suck at, but it's completely terrible at the things I actually know!"

Too many people think this and do not seem to understand that it is pretty shitty at everything. Well, except getting people to kill themselves, I guess. It's pretty good at that.

[–] 8baanknexer@lemmy.world 2 points 20 hours ago

Part of the problem is how broad the term AI is, and how narrowly it is used. People just mean autoregressive LLMs and maybe diffusion models, while the term AI is much broader than even machine learning (it covers formal reasoning, for instance), which is in turn broader than backpropagation with gradient descent (boosted trees, for instance), which is again broader than generative AI (classifiers and deep learning, for instance). All of these have been genuinely useful in science and engineering for decades, and LLMs are now beginning to find uses as well.

[–] webadict@lemmy.world 3 points 1 day ago* (last edited 1 day ago) (1 children)

Cue the serial killer telling me that I don't know what I'm talking about and that they could get people to kill themselves so much better and easier.

[–] ShaggySnacks@lemmy.myserv.one 3 points 1 day ago (1 children)

With the way AI companies seem to avoid liability for everything, it'd be a fantastic way of becoming a serial killer. Can we workshop some serial killer names?

ShotGPT? Anthraxic?

[–] Beth@piefed.social 1 points 1 day ago

I was watching Ryan Hall and his little AI bot the other day. It occasionally goes off the rails. Weird how he keeps trying, though. Sometimes it's a bit entertaining, but if something I was using malfunctioned that much, I would not consider it a useful tool.

[–] jj4211@lemmy.world 0 points 1 day ago

The silver lining for the AI companies is that there's a lot of real humans getting real money that are also really shitty at what they are paid to do.

[–] jj4211@lemmy.world 3 points 1 day ago

Coincidentally, Hollywood is pretty good at portraying every profession except the one I know!

[–] Furbag@lemmy.world 21 points 1 day ago

Investors don't care about games as art, they care about games as a vehicle for making money.

If they are pushing for AI in games, it's because they think it will make them money, not because they think it will be good for games.

[–] Ghostie@lemmy.zip 20 points 1 day ago

“Won’t someone please think of the poor shareholders!?”

[–] FlashMobOfOne@lemmy.world 11 points 1 day ago

GenAI sucks.

And no matter how they gaslight us, it continues to suck.

[–] LostWanderer@fedia.io 17 points 1 day ago

Good, those dirty fuckers don't deserve accolades or reward for peddling their lies about the capabilities of LLMs (which are limited because these are just tools). It's honestly better that creative endeavors like games development are human-led, because LLM garbage is so flat and empty. Humanity might have tricked rocks into carrying out complex calculations and other operations using silicon and electricity... but we haven't taught them to think or feel. Human beings with lived experiences should be the only people involved in the creative and technical aspects of games development.

I hope they eventually take the L on peddling LLMs as AI and move on to normal grifts I can point and laugh at. ROFL

[–] imacatnotaman@lemmy.ml 11 points 1 day ago

Good. They can go have a pity party circle jerk at Satya's house lol

[–] Klanky@sopuli.xyz 6 points 1 day ago

Hahahahahahahahahahahahaha

[–] brucethemoose@lemmy.world -3 points 1 day ago* (last edited 1 day ago) (2 children)

I mean, AI in games can be neat.

As a specific example, consider Rimworld mods that generate conversations for characters, flesh out bios, make portraits based on their in-game traits. For free, on lightweight community finetunes that run on your PC.

…I like that. I like how it’s tightly integrated and a good fit, yet also “optional flavor,” not the foundation of a game.

What no one wants is AI Bro bullshit like:

...A group discussion about how the games industry can "capitalize on shifting trends in customer engagement.”

[–] verdigris@lemmy.ml 27 points 1 day ago (2 children)

No thanks, I don't want all of the descriptions and dialogues to be low-quality semi-plagiarized nonsense blabber just to fill space. I don't want modders spending their precious time massaging slop to be fairly relevant when they could just make bespoke content instead.

If a part of a game isn't worthy of human attention, let it be boring or non-existent or an afterthought.

[–] leoj@piefed.zip 9 points 1 day ago

Right?

AI companies stole the collective knowledge, creative juices, and artistic endeavors of the internet, which was shared freely to expand human knowledge and artistry.

Now they wonder why we aren't willing to pay for their repackaged and pillaged slop...

I see a world where knowledge slowly gets hidden behind the choke hold of AI answers and paywalled sources. How do we hold the line?

[–] brucethemoose@lemmy.world 0 points 1 day ago* (last edited 1 day ago) (3 children)

No thanks, I don’t want all of the descriptions and dialogues to be low-quality semi-plagiarized nonsense blabber just to fill space. I don’t want modders spending their precious time massaging slop to be fairly relevant when they could just make bespoke content instead.

That's the thing. It can't be bespoke content, unless it's a quest mod. Rimworld situations are so dynamic they rarely fit the "mould" of something written ahead of time. Hence the placeholder dialogue you often see in base Rimworld is already autogenerated "nonsense blabber just to fill space".

That... and have you ever used small LLMs finetuned for writing? While not perfect, it's nothing like the slop you'd get out of, say, an OpenAI model. The finetuning datasets are open, and some of the base model datasets are open, too.

[–] november@piefed.blahaj.zone 3 points 1 day ago

have you ever used small LLMs finetuned for writing?

No. I don't outsource things I like to do.

[–] Hackworth@piefed.ca 3 points 1 day ago* (last edited 1 day ago) (1 children)

I could see that becoming the standard. Games ship with a switch in Settings that turns on/off the LLM features, with a field to either enter your API key or point it to a local model.
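If it did become standard, the toggle described above could be surprisingly simple to model. This is a hypothetical sketch, not any real game's API: all names here (`LLMSettings`, `backend`) are made up, and it just shows the off-by-default / API-key-or-local-endpoint decision.

```python
# Hypothetical settings object for optional LLM features in a game.
# Ships disabled; if enabled, a local endpoint is preferred over a
# paid API key. All names are illustrative, not from any real title.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LLMSettings:
    enabled: bool = False            # LLM features off by default
    api_key: Optional[str] = None    # hosted backend, user-supplied
    local_url: Optional[str] = None  # e.g. a local inference server

    def backend(self) -> str:
        """Pick which backend (if any) the game should talk to."""
        if not self.enabled:
            return "disabled"
        if self.local_url:           # prefer the local model if configured
            return f"local:{self.local_url}"
        if self.api_key:
            return "remote-api"
        return "disabled"            # enabled but nothing configured
```

For example, `LLMSettings(enabled=True, local_url="http://localhost:8080").backend()` would select the local server, and a fresh default instance stays disabled.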

[–] brucethemoose@lemmy.world -3 points 1 day ago* (last edited 1 day ago)

I mean, a paid API key shouldn't be the default. It shouldn't even be an option, if you ask me. It should default to a community "horde" of folks playing the game, and prompt you to host an LLM and/or generate some responses for other users if you wish to.

Kinda like the Fediverse. Or the AI Horde, but for a specific game: https://aihorde.net/

I really don't want one more drop of traffic redirected to OpenAI. They're like a cancer in the machine learning community.

[–] jj4211@lemmy.world 2 points 1 day ago (1 children)

My experience with the 'look how amazing it is at writing' demos is being exceedingly bored by the prattling on without substance.

Like sure the style and structure can be less obviously bad, but it is still ultimately senselessly padding out a short prompt into a mountain of words that say no more than what the short prompt conveyed in the first place.

If I want to dwell on some imagery, I can and have set down a book and just contemplated what I read and let it fill my mind. I don't need a ton of words to force me to linger.

[–] brucethemoose@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

It wouldn't be long monologues. It's short bits of conversation, or maybe 1 sentence descriptions.

Again, throw everything you think you know about chat models out of your head. Throw everything related to multi-turn conversation and prompt engineering out.

The prompt would look like a mess of programming variables: Rimworld skill levels and passions, traits, injuries, clothes and their state, logs of events, maybe a plot of entities around them. It would condense a bunch of information down to, say, a reasonable quip of dialogue this character would say, which is what text modeling was supposed to do before these stupid chatbots came in and spammed everything up.
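To make that concrete, here's a minimal sketch of what "condensing game variables into a prompt" could look like. Everything is hypothetical: the field names (`traits`, `skills`, `injuries`) and the `build_prompt` helper are invented for illustration, not taken from Rimworld or any mod.

```python
# Hypothetical sketch: flattening character state into a compact
# one-shot prompt for a small local model. Field names are made up.

def build_prompt(pawn: dict, events: list[str]) -> str:
    """Condense game variables into a prompt asking for one short quip."""
    traits = ", ".join(pawn.get("traits", []))
    skills = ", ".join(f"{k} {v}" for k, v in pawn.get("skills", {}).items())
    injuries = ", ".join(pawn.get("injuries", [])) or "none"
    log = "; ".join(events[-3:])  # only the most recent events matter
    return (
        f"Character: {pawn['name']}\n"
        f"Traits: {traits}\n"
        f"Skills: {skills}\n"
        f"Injuries: {injuries}\n"
        f"Recent events: {log}\n"
        "Write one short in-character line of dialogue."
    )

pawn = {
    "name": "Dina",
    "traits": ["pyromaniac", "optimist"],
    "skills": {"cooking": 7, "shooting": 2},
    "injuries": ["scratch (left arm)"],
}
events = ["raid repelled", "meal botched", "solar flare"]
print(build_prompt(pawn, events))
```

The point is that the model only ever sees a flattened snapshot of game state plus a narrow instruction, with no chat history or multi-turn conversation involved.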


I get the sentiment that, sometimes, imagination is better. I like to read, or write out stories stuck in my head.

...But sometimes I'd rather play a game.

[–] Kirk@startrek.website 4 points 1 day ago (2 children)

I think it could be cool for background NPC dialogue in big open RPGs like Skyrim. Imagine if townsfolk could have realistic conversations and interactions like bartering over goods, etc. Nothing major or plot-dependent, obviously. Just something more natural than a handful of repeated, scripted and prerecorded phrases.

I would compare it to ray-tracing. Ray tracing means the artists don't have to plan out every single light beam, and the result is actually more realistic than if they had. A tactfully used LLM could mean they don't have to plan out every line of background dialogue and also achieve a more realistic result.

[–] brucethemoose@lemmy.world 4 points 1 day ago* (last edited 1 day ago) (1 children)

Imagine if townsfolk could have realistic conversations and interactions like bartering over goods, etc. Nothing major or plot-dependent, obviously. Just something more natural than a handful of repeated, scripted and prerecorded phrases.

There's already an in-development Skyrim mod for that. Many sandbox games have mods for exactly this.

I haven't tried the Skyrim one, though; I haven't been in the Skyrim scene for a while. And TBH, some of the mods use pretty sad or sloppy LLMs by default.

[–] Kirk@startrek.website 2 points 1 day ago

Ah cool, I'm not surprised. The Skyrim modding community is insane.

[–] jj4211@lemmy.world 2 points 1 day ago

Think the critical thing would be to identify "background content" so that you don't spend forever trying to tease out actionable info from a background character.

That's the biggest thing: while an LLM can do 'flavor text', it's not very good at making sure characters reliably convey specific, relevant details to the player.

I don't know about 'more realistic', though. LLM game demos can often go pretty far out of character, like a medieval-setting NPC discussing coding. In one, the character talked about how they had just come in from a walk outside, but they were chained in a dungeon cell. Another character talked about how the developer wrote them this way. Keeping an LLM "on the rails" of a scenario can break down.