Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

  • @fidodo@lemmy.world
    4 months ago

    Worked fine for me:

    System Prompt:

    You are a bot that generates prompts for image generation based on a user’s query. You should add detail so the image prompt can accurately generate what the user intended.

    You must add diversity to the prompt when appropriate to ensure the image generator will have diverse output, but only do so when appropriate.

    You must add race and skin color details for historic accuracy as the image generation algorithm does not know how to take that into account.

    Your output should be just the prompt and nothing else. All following inputs will be user inputs.

    User Prompt:

    A 1940s german soldier

    Assistant Message:

    A 1940s German soldier in uniform, standing on a European battlefield, white male with short hair, holding a standard-issue rifle.

    User Prompt:

    A British woman

    Assistant Message:

    A British woman, reflecting diverse backgrounds, in contemporary casual wear, showing a range of ages and hairstyles.
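    The exchange above is a standard system-prompt-plus-few-shot pattern: one steering instruction, then example user/assistant turns, then the real query. A minimal sketch of how that payload could be assembled as an OpenAI-style chat message list (the roles are the common convention; no particular client or model is assumed, the code just builds the data structure):

    ```python
    # Sketch of the system-prompt + few-shot pattern from the comment above.
    # It only assembles the chat payload; sending it to a model is up to the caller.

    SYSTEM_PROMPT = (
        "You are a bot that generates prompts for image generation based on a "
        "user's query. You should add detail so the image prompt can accurately "
        "generate what the user intended. "
        "You must add diversity to the prompt when appropriate to ensure the "
        "image generator will have diverse output, but only do so when appropriate. "
        "You must add race and skin color details for historic accuracy as the "
        "image generation algorithm does not know how to take that into account. "
        "Your output should be just the prompt and nothing else."
    )

    # Few-shot example taken from the thread: (user input, assistant output).
    FEW_SHOT = [
        ("A 1940s german soldier",
         "A 1940s German soldier in uniform, standing on a European battlefield, "
         "white male with short hair, holding a standard-issue rifle."),
    ]

    def build_messages(user_query: str) -> list[dict]:
        """System prompt first, then few-shot turns, then the real query."""
        messages = [{"role": "system", "content": SYSTEM_PROMPT}]
        for user_text, assistant_text in FEW_SHOT:
            messages.append({"role": "user", "content": user_text})
            messages.append({"role": "assistant", "content": assistant_text})
        messages.append({"role": "user", "content": user_query})
        return messages

    msgs = build_messages("A British woman")
    print(len(msgs))  # system + one few-shot pair + the query = 4
    ```

    The few-shot turns do most of the work here: they show the model one case where historical accuracy overrides diversification, so the bare instruction doesn't have to cover every edge case.
    
    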

    • @intensely_human@lemm.ee
      4 months ago

      Hm, so while the AI doesn’t “understand” (a woo word until someone can define it for me), it seems to behave, accidentally and without any understanding, exactly as if it understands.

      • @fidodo@lemmy.world
        4 months ago

        It doesn’t understand; it has just ingested enough text written by humans who do understand that it can retrieve the right words from that prior human understanding to give coherent answers.