I've fucked around a bit with ChatGPT and while, yeah, it frequently says wrong or weird stuff, it's usually fairly subtle shit, like crap I actually had to look up to verify it was wrong.

Now I'm seeing Google telling people to put glue on pizza. That's a bit bigger than getting the name of George Washington's wife wrong or Jay Leno's birthday off by three years. Some of these answers seem so cartoonish in their wrongness that I half suspect some engineer at Google is fucking with it to prove a point.

Is it just that Google's AI sucks? I've seen other people say that AI is now getting info from other AIs, and it's leading to responses getting increasingly bizarre and wrong, so... idk.

[-] FlakesBongler@hexbear.net 57 points 5 months ago

From the looks of it, it mainly trawls for info from Reddit and Quora

The problem being that these sources are frequently wrong, or deliberately jokey nonsense

But the AI doesn't know that!

[-] Dessa@hexbear.net 31 points 5 months ago

Can someone with the google ai answers ask it where pee is stored?

[-] BeamBrain@hexbear.net 28 points 5 months ago

I regret to inform you that it answered correctly

this post was submitted on 24 May 2024
58 points (100.0% liked)

askchapo
