• MacN'Cheezus@lemmy.today · 9 months ago

    Yes, I saw some discussion and a screenshot somewhere showing that, at least in its current state, Gemini can (or could) be asked to output the prompt enhancements it applied along with the generated images.

    The screenshot showed someone asking for images of fruit, and the enhanced prompt included “racially diverse groups of people”. Now if they’re inserting something like that even for images containing no people at all, it stands to reason that this is just a default enhancement they ALWAYS apply, no matter the prompt, which would explain the racially diverse Nazis (and all the other brouhahas we’ve seen from them).
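    For illustration, here’s a minimal sketch of what an always-applied enhancement like that might look like server-side. The function name and the exact enhancement text are assumptions based on the screenshot, not Google’s actual code:

    ```python
    # Hypothetical sketch of an unconditionally applied prompt enhancement.
    # The name and enhancement text are assumptions for illustration;
    # this is not Gemini's actual pipeline.

    DEFAULT_ENHANCEMENT = "racially diverse groups of people"

    def enhance_prompt(user_prompt: str) -> str:
        # Applied no matter what -- there is no check for whether the
        # prompt involves people at all, which would produce enhanced
        # prompts like the fruit example above.
        return f"{user_prompt}, {DEFAULT_ENHANCEMENT}"

    print(enhance_prompt("a bowl of fruit on a table"))
    # -> "a bowl of fruit on a table, racially diverse groups of people"
    ```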

    • PullUpCircuit@iusearchlinux.fyi · 9 months ago

      That’s really what I’m expecting. My guess is that the training data is skewed, and injecting text into the prompt can’t fully compensate for that.

      Either the model will need to understand what’s actually being asked for, or the company will need to address this directly and let people enable or disable the diversity enhancement.

      The first option may be impossible to attain at this stage. The second can lead to inappropriate images.
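      The second option could in principle be as simple as gating the same enhancement behind a user setting (again, a purely hypothetical sketch):

      ```python
      # Hypothetical sketch: the same enhancement, gated by a user setting.

      def enhance_prompt(user_prompt: str, diversity_enabled: bool = True) -> str:
          if diversity_enabled:
              return f"{user_prompt}, racially diverse groups of people"
          # Disabled: the prompt passes through unchanged, which is exactly
          # where the risk of inappropriate images comes in.
          return user_prompt

      print(enhance_prompt("a crowd at a train station", diversity_enabled=False))
      # -> "a crowd at a train station"
      ```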