That’s exactly what I’d expect. My guess is that the training data is skewed, and the prompt can’t compensate for it.
Either the model will need to understand what is actually being asked for, or the company will need to address this by letting people enable or disable forced diversity.
The first option may be impossible at this stage. The second can lead to inappropriate images.
I didn’t see that. I got Oops All Asians when generating gay furry Nazis on Bing.
To answer further questions: it was in response to a post about how Russians are said to view Ukrainians.