Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material, according to researchers from Stanford University. In just two days, the researchers found more than 100 instances of known CSAM in a sample of over 325,000 posts on Mastodon. They also found hundreds of posts with CSAM-related hashtags, as well as links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken offline for a time after CSAM was posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • 🦊 OneRedFox 🦊@beehaw.org · 31 points · 1 year ago

    Yeah, I recall that the Japanese instances have a big problem with that shit. As for the rest of us, Facebook actually open-sourced some efficient perceptual-hashing algorithms for detecting CSAM; Fediverse platforms could implement these, which would just leave the issue of getting an image-hash database to check against. All the big platforms could probably chip in to get access to one of those private databases and then release a public service for the rest of the ecosystem.
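
    For illustration, here’s roughly what that matching could look like. This sketch uses the generic `imagehash` library as a stand-in for Facebook’s open-sourced PDQ hasher, and the hash list and match threshold are made-up placeholders, since the real hash databases are access-controlled:

    ```python
    # Sketch: flag an upload whose perceptual hash matches a known-bad hash list.
    # `imagehash` stands in for Facebook's PDQ; the hashes and threshold below
    # are placeholders, not real data.
    import imagehash
    from PIL import Image

    # In practice this set would be synced from a shared, access-controlled database.
    KNOWN_BAD_HASHES = {
        imagehash.hex_to_hash("d1c3a5870f2d4b69"),  # placeholder entry
    }

    # Perceptual hashes survive re-encoding and resizing, so match within a small
    # Hamming distance rather than requiring exact equality.
    MATCH_THRESHOLD = 8

    def is_known_csam(image_path: str) -> bool:
        upload_hash = imagehash.phash(Image.open(image_path))
        return any(upload_hash - bad <= MATCH_THRESHOLD for bad in KNOWN_BAD_HASHES)

    if is_known_csam("incoming_upload.jpg"):
        # Reject the upload and open a moderation report instead of publishing it.
        ...
    ```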

    • zephyrvs@lemmy.ml · 5 points · 1 year ago

      That’d be useless though, because first, it’d probably be opt-in via configuration settings, and even if it wasn’t, people would just fork and modify the codebase or simply switch to another ActivityPub implementation.

      We’re not gonna fix society using tech unless we’re all hooked up to some all-knowing AI under government control.

      • 🦊 OneRedFox 🦊@beehaw.org · 7 points · 1 year ago

        > That’d be useless though, because first, it’d probably be opt-in via configuration settings, and even if it wasn’t, people would just fork and modify the codebase or simply switch to another ActivityPub implementation.

        No it wouldn’t, because it’d still be significantly easier for instances to deal with CSAM if this functionality were built into the platforms. And I highly doubt there’s going to be a mass migration away from any Fediverse platform that implements such a feature (though honestly I’d be down to defederate from any instance that takes serious issue with this).

        • zephyrvs@lemmy.ml · 1 point · 1 year ago

          And the instances that want to engage with that material would all opt for the fork and be done with it. That’s all I meant.

      • crystal@feddit.de · 1 point · 1 year ago

        That’s not the point. Yes, child porn sites can host child porn; other sites/instances can’t stop that. But what other instances can stop is redistributing said child porn, and for that purpose such technology would be useful.
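
        Concretely, the check could run at federation-ingest time, so a well-behaved instance never caches or re-serves a match even if the originating server keeps hosting it. A minimal sketch, reusing the hypothetical `is_known_csam()` check from the earlier comment; the hook and reporting helper are illustrative, not an actual Fediverse platform API:

        ```python
        # Sketch: drop matching media when it arrives over federation so this
        # instance never stores or redistributes it. `is_known_csam` is the
        # hash check from the earlier sketch; the hook itself is made up.

        def report_to_moderators(remote_url: str) -> None:
            # Placeholder: a real instance would open a moderation report here.
            print(f"flagged for review: {remote_url}")

        def on_federated_media(local_path: str, remote_url: str) -> bool:
            """Return True to accept the media, False to drop it."""
            if is_known_csam(local_path):
                report_to_moderators(remote_url)
                return False  # never cache or re-serve a match
            return True
        ```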

      • Paradoxvoid@aussie.zone · 1 point · 1 year ago (edited)

        As much as we can (and should) lambast Facebook/Meta’s C-Suite for terrible decisions, their engineers are generally pretty legit.