I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries…

It simply replied that it can’t do that because it would be unethical to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list, since it hadn’t removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always experience a hiccup because of some ethical process in the background messing up its answers.

It’s really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey.

  • marmo7ade@lemmy.world · 2 years ago

    It’s not confusing at all. ChatGPT has been configured to operate within specific political bounds. Just like the political discourse of the people who made it, the facts don’t matter.

    • TheKingBee@lemmy.world · 2 years ago

      Or it’s been configured to operate within these bounds because it is far, far better for them to have a screenshot of it refusing to be racist, even in a situation that clearly isn’t, than to have it go even slightly racist.