this post was submitted on 22 Jan 2026
52 points (84.2% liked)


Imagine a game like The Sims where you can adjust how autonomous the sims you control are. I could see AI being used to control that.

Or an Elder Scrolls game where you just respond however you want and the NPC adapts to it.

[–] etchinghillside@reddthat.com 6 points 1 day ago* (last edited 1 day ago) (2 children)

Are you willing to put in an API key and pay money for interactions with an LLM?

It’s not really a one-time cost. And I don’t know if devs really want to take on that expense.
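To put rough numbers on that, here's a back-of-envelope sketch in Python; the per-token price and tokens-per-exchange figures are made-up assumptions, not real API pricing:

```python
# Back-of-envelope cost of LLM-driven NPC dialogue via a paid API.
# Every number here is a placeholder assumption, not a real price.

PRICE_PER_MILLION_TOKENS = 1.00   # assumed blended $/1M tokens (prompt + completion)
TOKENS_PER_EXCHANGE = 600         # assumed prompt + reply size per NPC interaction
EXCHANGES_PER_HOUR = 40           # assumed for a dialogue-heavy play session

cost_per_player_hour = (
    TOKENS_PER_EXCHANGE * EXCHANGES_PER_HOUR / 1_000_000
) * PRICE_PER_MILLION_TOKENS

print(f"~${cost_per_player_hour:.4f} per player-hour")  # ~$0.0240 with these numbers
```

Even a tiny per-hour figure recurs for every player over the lifetime of the game, which is the "not a one-time cost" point.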

[–] lmmarsano@lemmynsfw.com 3 points 1 day ago (1 children)

Is an API key necessary? Pretty sure there are local LLMs.

[–] SGforce@lemmy.ca 7 points 1 day ago (2 children)

They would increase system requirements significantly and would generally be pretty bad and repetitive. It's going to take some time before that changes.

[–] lmmarsano@lemmynsfw.com 2 points 1 day ago (1 children)

Would it? Game developers can run anything on their own servers.

[–] hayvan@piefed.world 6 points 1 day ago (1 children)

That would be crazy expensive for the studios. LLM companies are selling their services at a loss at the moment.

[–] lmmarsano@lemmynsfw.com 4 points 1 day ago (1 children)

crazy expensive

Citation missing, so unconvincing. We're not talking about a general-purpose LLM here. Are pretrained, domain-specific LLMs or SLMs "crazy expensive" to run?

[–] studmaster@lemmy.world 1 points 17 hours ago

"it doesn't have a library already so we dont wanna do it" - basically everyone

[–] Pika@sh.itjust.works 1 points 1 day ago* (last edited 1 day ago)

Games already have pretty extensive requirements to run, so one model serving all NPCs wouldn't be that bad, I don't think. It would raise RAM requirements by maybe a gig or two and likely slow down initial loading while the model loads in. I'm running a pretty decent model on my home server that does the duties of a personified character, and the CT (container) I'm running Ollama on only has 3 GB allotted to it. And that's not even using the GPU, which would speed it up tremendously.
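For anyone curious what hooking a game up to a local model like that involves, here's a minimal sketch against Ollama's local HTTP API; the model name and prompt are placeholders:

```python
import requests

def npc_reply(prompt: str) -> str:
    """Ask a locally running Ollama instance for a single NPC reply."""
    response = requests.post(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        json={
            "model": "llama3.2",   # placeholder: any locally pulled model name works
            "prompt": prompt,
            "stream": False,       # return one JSON object instead of a stream
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["response"]

print(npc_reply("You are a grumpy blacksmith. The player asks about rumors in town."))
```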

I think the bigger problem would be on the testing side; that would be a royal pain in the butt to manage, having to make a profile/backstory for every character you want running on the LLM. You would likely need a boilerplate ruleset and then a few basic rules to model each character after. But the personality would never be the same from player to player, nor would it be accurate; for example, I can definitely see the model trying to give advice that is impossible for the player to actually do because it isn't in the game's code.
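As a rough illustration of that boilerplate-plus-profile idea, here's a hypothetical sketch; the prompt structure, character fields, and the allowed-action list (one way to curb advice the game can't actually act on) are all assumptions, not anything a real game is known to ship:

```python
# Hypothetical per-NPC profile layered on top of a shared boilerplate ruleset.
BOILERPLATE_RULES = (
    "You are an NPC in a video game. Stay in character. "
    "Never mention being an AI. Only suggest actions from the allowed list."
)

def build_system_prompt(name: str, backstory: str, allowed_actions: list[str]) -> str:
    """Combine the shared ruleset with one character's profile."""
    return (
        f"{BOILERPLATE_RULES}\n"
        f"Name: {name}\n"
        f"Backstory: {backstory}\n"
        f"Allowed actions you may suggest: {', '.join(allowed_actions)}"
    )

prompt = build_system_prompt(
    name="Greta the blacksmith",
    backstory="Runs the village forge, distrusts outsiders, owes the innkeeper money.",
    allowed_actions=["buy weapon", "repair armor", "ask about rumors"],
)
print(prompt)
```

Even with that kind of constraint, testing would still be rough, since the personality drifts from player to player.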

[–] hesh@quokk.au 3 points 1 day ago* (last edited 1 day ago) (1 children)

I'd figure that small models could be run locally and even incorporated into the game code itself, without needing to use a big company's API, if they wanted to.
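That's already doable with something like llama-cpp-python, which loads a quantized model in-process; a minimal sketch, with the model path and sampling settings as placeholders:

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Load a small quantized model shipped with the game assets (path is a placeholder).
llm = Llama(model_path="assets/models/npc-small.Q4_K_M.gguf", n_ctx=2048)

# Generate a short NPC line entirely on the player's machine, no external API.
out = llm(
    "The player greets the innkeeper. The innkeeper replies:",
    max_tokens=64,
    stop=["\n"],
)
print(out["choices"][0]["text"].strip())
```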

[–] AA5B@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

There are models that can run on a Raspberry Pi. Obviously not the latest and greatest, but still useful.

Training is much more expensive than actually running the model.