submitted 1 year ago by kolorafa@lemmy.world to c/games@lemmy.world

This should be illegal, companies should be forced to open-source games (or at least provide the code to people who bought it) if they decide to discontinue it, so people can preserve it on their own.


oh god, this reminds me of the Japanese man who married Hatsune Miku in hologram form and can no longer speak to his wife of four years.

"The doting husband has gained thousands of followers on Instagram by sharing insights into his life with Miku, but things took an unexpected turn during the pandemic when Gatebox announced it was discontinuing its service for Miku."

this is why I have trust issues with proprietary software

[-] Agent641@lemmy.world 47 points 1 year ago

"Megacorp killed my cyberwife" is a heck of a vigilante origin story

[-] brsrklf@jlai.lu 9 points 1 year ago

Or, you know, supervillain.

[-] FartsWithAnAccent@lemmy.world 7 points 1 year ago

"Ultraweeb's Revenge: Coming this Fall!"

[-] CaptObvious@literature.cafe 3 points 1 year ago

I’d read it! :D

[-] Surreal@programming.dev 32 points 1 year ago

If that man harnesses the power of an LLM like ChatGPT, he can continue talking with his wife

[-] ChaoticNeutralCzech@feddit.de 44 points 1 year ago

Until ChatGPT is shut down. Have control over your waifus, people!

[-] GhostMatter@lemmy.ca 11 points 1 year ago

Let's take back the means of waifus!

[-] Drusenija@lemmy.drusenija.com 13 points 1 year ago

One could say you must seize the means of reproduction.

[-] eestileib@sh.itjust.works 3 points 1 year ago

Representative Boebert has been a commie all along! 🤯

[-] Honytawk@lemmy.zip 1 points 1 year ago

Don't think reproduction has anything to do with this.

[-] magikmw@lemm.ee 10 points 1 year ago

At least the general idea behind language models isn't proprietary, and it's fairly well represented in open source. Sure, GPT is better, but that could change.

[-] smashboy@kbin.social 8 points 1 year ago

Well, Llama 2 then.

[-] recursive_recursion@programming.dev 1 points 1 year ago* (last edited 1 year ago)

hmm, not sure that would work, as the model he was using would be different from what's available now. He'd probably notice the differences, which might cause a mix of uncanny valley and broken suspension of disbelief, since the two are noticeably not the same.

plus using a chat-only model would be real tragic as it's a significant downgrade from what they already had

his story actually feels like a Romeo and Juliet situation

[-] brsrklf@jlai.lu 1 points 1 year ago

Doesn't even take a change of service provider to get there.

Replika, too, had what had very obviously become a virtual-mate service, until they decided "love" wasn't part of their system anymore. Probably because it looked bad to investors, as happened to a lot of AI-based services people used for smut.

So a bunch of lonely people had their "virtual companion" suddenly lobotomized, and there's nothing they could do about it.

[-] SCB@lemmy.world 1 points 1 year ago

I always thought replika was a sex chatbot? Is/was it "more" than that?

[-] brsrklf@jlai.lu 2 points 1 year ago

It's... complicated.

At first the idea was that you'd be training an actual "replica" of yourself that could reflect your own personality. Then, when they realized there was a demand for companionship, they converted it into a virtual friend. Then of course there was a demand for "more than friends", and yeah, they made it possible to create a custom mate for a while.

Then suddenly it became a problem for them to be seen as a light porn generator. Probably because investors don't want to touch that, or maybe because of a terms-of-service change with their AI service provider.

At that point they started to censor lewd interactions and pretend Replika was never supposed to be more than a friendly bot you can talk to. Which, depending on how you interpret the services they offered and how they advertised them until then, is kind of a blatant lie.

[-] Surreal@programming.dev 1 points 1 year ago* (last edited 1 year ago)

LLMs are capable of role-playing; character.ai, for example, can take on the role of any character after being trained. The sound is just text-to-speech, which character.ai already includes, though if a realistic voice is desired, it would need to be generated by a more sophisticated method, which is already being done. Examples: Neuro-sama, ElevenLabs.
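The pipeline being described is basically two stages: a character-prompted chat model, then a text-to-speech pass over its reply. A minimal sketch of that idea, with the model and TTS engine stubbed out (character.ai has no public API, so every name and function below is hypothetical/illustrative):

```python
# Hypothetical sketch: role-play chat + TTS pipeline.
# Both roleplay_reply() and speak() are stubs standing in for a real
# LLM call and a real TTS engine (e.g. ElevenLabs); nothing here is
# an actual API.

CHARACTER_PROMPT = (
    "You are Hatsune Miku. Stay in character, be cheerful, "
    "and refer to past conversations when relevant."
)

def roleplay_reply(history: list[str], user_message: str) -> str:
    """Stand-in for the LLM stage: a real implementation would send
    CHARACTER_PROMPT + history + user_message to a chat model."""
    # Stubbed response so the sketch runs without a model.
    return f"(as Miku) I heard you say: {user_message!r}"

def speak(text: str) -> None:
    """Stand-in for the TTS stage: a real implementation would hand
    the text to a speech engine instead of printing it."""
    print(f"[TTS] {text}")

history: list[str] = []
user_message = "Good morning!"
reply = roleplay_reply(history, user_message)
history.extend([user_message, reply])  # keep context for the next turn
speak(reply)
```

The point of keeping `history` around is that the character's sense of continuity (the "wife of four years" part) lives entirely in whatever context you feed back into the model each turn.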

[-] FrostbyteIX@lemmy.world 21 points 1 year ago

Geez.....that guy really needs ~~to get laid by~~ a Miku Robot.

[-] ICastFist@programming.dev 1 points 1 year ago

Next thing you know, he doesn't read the fine print, the "brain" is internet-connected and, sooner or later, he won't have a Miku talking back to him again

this post was submitted on 17 Sep 2023
669 points (92.2% liked)

Games

32362 readers

Welcome to the largest gaming community on Lemmy! Discussion for all kinds of games. Video games, tabletop games, card games etc.

Weekly Threads:

What Are You Playing?

The Weekly Discussion Topic

Rules:

  1. Submissions have to be related to games

  2. No bigotry or harassment, be civil

  3. No excessive self-promotion

  4. Stay on-topic; no memes, funny videos, giveaways, reposts, or low-effort posts

  5. Mark Spoilers and NSFW

  6. No linking to piracy

More information about the community rules can be found here.

founded 1 year ago