More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism.

The moderators worked eight- to 10-hour days at a facility in Kenya for a company contracted by the social media firm and were diagnosed with PTSD, generalised anxiety disorder (GAD) and major depressive disorder (MDD) by Dr Ian Kanyanya, the head of mental health services at Kenyatta National hospital in Nairobi.

The mass diagnoses have been made as part of a lawsuit being brought against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.

The images and videos, including necrophilia, bestiality and self-harm, caused some moderators to faint, vomit, scream and run away from their desks, the filings allege.

  • YarHarSuperstar@lemmy.world · 3 days ago

    I have heard that folks from African countries who are hired to train those AI models are also reporting abuses. So imo that’s not really a solution either

    • thefartographer@lemm.ee · 3 days ago

      Right, riiiiiight… I forgot about that part. Make AIs train each other. What could go wrong?!