this post was submitted on 06 Jul 2023
669 points (94.4% liked)

ChatGPT

8903 readers

Unofficial ChatGPT community to discuss anything ChatGPT

founded 1 year ago

I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries...

It simply replied that it can't do that because it would be unethical to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list, as it hadn't removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hit a hiccup because of some ethical process in the background messing up its answers.

It's really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey

[–] [email protected] 20 points 1 year ago (3 children)

Just make a new chat and try again with different wording; it's hung up on this

[–] [email protected] 26 points 1 year ago

Honestly, instead of asking it to exclude Africa, I would ask it to give you a list of countries "in North America, South America, Europe, Asia, or Oceania."

[–] [email protected] 6 points 1 year ago

Chat context is a bitch sometimes...

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

Is there an open source AI without limitations?

[–] [email protected] 4 points 1 year ago

If there were, we wouldn't have Bing's version...