Google confirms its latest update can scan all your photos to “use actual images of you and your loved ones” in AI image generation. That means Gemini can see who you know and what you do. You likely have tens or hundreds of thousands of photos, and they’re all exposed if you update.

  • InFerNo@lemmy.ml · 12 hours ago

    Does it matter what we do? There’s definitely some family member, friend, or colleague who has taken a picture of you, and your likeness will be processed.

    It’s like people tagging you in Facebook pictures even if you don’t have an account, but worse, because that was an active step. This is fully automated.

    • Lumisal@lemmy.world · 6 hours ago

      There’s definitely some family member, friend, or colleague who has taken a picture of you, and your likeness will be processed.

      Hah, I’m good then.

      I’m a vampire.

      • Canaconda@lemmy.ca · 5 hours ago

        Actually, only DSLRs use mirrors. Every other camera can see you, Nightwalker.

            • Lumisal@lemmy.world · 4 hours ago

              Don’t know, but in mirrors it looks funky. And I only have a Minolta Maxxum 7000. They don’t build them like they used to.

              I imagine AI finds it even weirder.