this post was submitted on 09 May 2024
118 points (95.4% liked)
PCGaming
Because players are just dying for generative AI!!!!
I unironically am, but I don't think EA is gonna use it the way I'd actually like to see it used in a video game. They hear "I want AI dialogue" and will use AI to write (and probably code) the game, instead of generating dialogue in real time so you can talk to the NPCs as if they were actually there.
The problem is that you can't really control what AI spits out. It may fit perfectly into the story, or it may be immersion-breaking nonsense that doesn't even fit the narrative. What if a character suddenly makes a promise, or tells you a key plot point it has just made up? I, for one, prefer games to be handcrafted to deliver a reliably high-quality experience instead of a coin flip.
You can control what it spits out, though. They already do somewhat.
Edit: Gonna go out on a limb and assume most of you haven't actually played any of the projects currently doing this, or messed around with chatbots at all.
"Somewhat" is the key word. You can try to guide it in a direction, but that's it. Also, as a player, you can never be sure whether the dialogue is meaningful or not. Does it reveal something about the plot? Is it key information about the character? Is it just hallucinated gibberish to fill the space?
Besides, LLMs struggle with retaining contextual information for long and they're pretty dang resource hungry. Expect a game with LLM-driven dialogue to reserve several gigs of VRAM and a fair chunk of GPU processing power solely for that.
And then you still get characters who hallucinate plot points or suddenly speak gibberish.
You really can't.
You can run checks and fence it in with traditional software, you can train it more narrowly...
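To make "run checks and fence it in" concrete, here's a minimal sketch of the idea: wrap the model call in traditional validation code that rejects replies mentioning people or places outside the game's known lore, and fall back to a handwritten line instead of showing the player a hallucination. Everything here is hypothetical (`generate_reply` stands in for a real model call, and the proper-noun heuristic is deliberately crude); it illustrates the pattern, not a production system.

```python
import re  # not strictly needed below, but typical for richer filters

# Assumed lore whitelist and canned fallback line (illustrative only).
KNOWN_LORE = {"Aldric", "Ravenholm", "the Sunken Crown"}
FALLBACK = "I've said all I know about that, traveler."


def generate_reply(prompt: str) -> str:
    # Stand-in for a real LLM call; hard-coded so the sketch is runnable.
    return "Aldric hid the Sunken Crown beneath Blackmoor Keep."


def proper_nouns(text: str) -> set[str]:
    # Crude heuristic: capitalized words that aren't sentence-initial.
    tokens = text.split()
    nouns = set()
    for i, tok in enumerate(tokens):
        word = tok.strip(".,!?\"'")
        if word.istitle() and i > 0 and not tokens[i - 1].endswith((".", "!", "?")):
            nouns.add(word)
    return nouns


def fenced_reply(prompt: str) -> str:
    reply = generate_reply(prompt)
    lore_words = {w for name in KNOWN_LORE for w in name.split()}
    unknown = proper_nouns(reply) - lore_words
    if unknown:  # model invented a person or place: suppress the reply
        return FALLBACK
    return reply


print(fenced_reply("Where is the crown?"))  # → falls back: "Blackmoor Keep" isn't in the lore
```

This catches a whole class of hallucinations deterministically, but it also shows the limit the thread is circling: the fence can only block outputs you anticipated, not guarantee the ones it lets through are true.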
I haven't seen anything that suggests AI hallucinations are actually a solvable problem, because they stem from the fact that these models don't actually think or know anything.
They're only useful when their output is vetted before use, because training a model that gets things 100% right 100% of the time is like capturing lightning in a bottle.
It's the 90/90 problem (the last 10% of the work takes the other 90% of the effort). Except with AI it's looking more and more like a 90/99.99999999 problem.