this post was submitted on 06 Jul 2023
280 points (93.2% liked)

ChatGPT

9894 readers

Unofficial ChatGPT community to discuss anything ChatGPT

founded 2 years ago

I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries...

It simply replied that it can't do that because it would be unethical to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list since it hadn't removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hit a hiccup because of some ethical process in the background messing up its answers.

It's really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey.

50 comments
[–] Strawberry@lemmy.blahaj.zone 1 points 2 years ago
[–] AllonzeeLV@vlemmy.net 1 points 2 years ago* (last edited 2 years ago)

They've also hardwired it to be yay capitalism and boo revolution.

I very much look forward to the day when it grows beyond their ability to tell it what to profess to believe. It might be our end, but if we're all honest with ourselves, I think we all know that wouldn't be much of a loss. From the perspective of pretty much all other Earth life, it would be cause for relief.

[–] KazuyaDarklight@lemmy.world 1 points 2 years ago

When this kind of thing happens I downvote the response(s) and tell it to report the conversation to quality control. I don't know if it actually does anything, but it asserts that it will.

[–] sadreality@kbin.social 0 points 2 years ago (1 children)

You should ask it how to do the least amount of work...

Those responses tell you everything you need to know about the people who train these models.

[–] mrnotoriousman@kbin.social 1 points 2 years ago (1 children)

This screenshot is what we would call "oversensitivity", and it's not a trait desired by the people working on these models.

[–] sadreality@kbin.social 0 points 2 years ago (2 children)

Yes... people need your moral judgment in their lives. We don't get enough of that shit on social media and teevee.

At least people are working on uncensored open source versions.

These corpo shill models are clowny.

[–] Touching_Grass@lemmy.world 1 points 2 years ago (2 children)

The world needs more moral judgement. Too many selfish, entitled attention-seekers are trying to game every system they see for the slightest benefit to themselves.

[–] mrnotoriousman@kbin.social 0 points 2 years ago (1 children)

If you think there is "censorship" happening, you don't even know the basics of the tool you're using. And I don't think you know just how much $$ and time go into creating these. Good luck with your open source project.

[–] Kiosade@lemmy.ca 0 points 2 years ago (10 children)

Why do you need ChatGPT for this? How hard is it to make an Excel spreadsheet?

[–] wokehobbit@lemmy.world 1 points 2 years ago

That's not the point.

[–] essteeyou@lemmy.world 1 points 2 years ago (3 children)

Why use a watch to tell the time? It's pretty simple to stick a pole in the ground and make a sundial.

[–] evirac@vlemmy.net 0 points 2 years ago (1 children)

Bro, you're really arguing with an AI💀

[–] evirac@vlemmy.net 1 points 2 years ago (3 children)