Mastodon, a decentralized social network and alternative to Twitter, has a serious problem with child sexual abuse material, according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon. They also found hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors, and one Mastodon server was even taken down for a time because CSAM was being posted to it. The researchers suggest that decentralized networks like Mastodon need to implement more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • Shiri Bailem@foggyminds.com · 1 year ago

    @mudeth @pglpm you really don’t do anything beyond our current tools and reporting to authorities.

    This is not a single monolithic platform; blaming Mastodon as a whole is like attributing the bad behavior of some websites to HTTP.

    Our existing moderation tools are already remarkably robust, and defederating is absolutely how this is approached. If a server shares content that’s illegal in your country (or otherwise just objectionable) and they have no interest in self-moderating, you stop federating with them.
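    For example, here is a minimal sketch of how an admin could script that defederation step. It assumes Mastodon 4.x and its documented POST /api/v1/admin/domain_blocks endpoint, plus an access token with the admin:write:domain_blocks scope; the instance URL, token, and domain names below are placeholders, not anyone’s real setup:

        # Sketch only: creates a domain block so this instance stops federating
        # with a remote server. Assumes Mastodon >= 4.0 and an admin token with
        # the admin:write:domain_blocks scope; all values are placeholders.
        import requests

        INSTANCE = "https://example.social"   # your own instance (placeholder)
        TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"     # placeholder admin token

        def defederate(domain: str, comment: str) -> dict:
            """Suspend federation with `domain` via the admin API."""
            resp = requests.post(
                f"{INSTANCE}/api/v1/admin/domain_blocks",
                headers={"Authorization": f"Bearer {TOKEN}"},
                json={
                    "domain": domain,
                    "severity": "suspend",       # full defederation, not just silencing
                    "reject_media": True,        # also drop cached media from that server
                    "reject_reports": True,
                    "private_comment": comment,  # note visible only to your mod team
                },
                timeout=30,
            )
            resp.raise_for_status()
            return resp.json()

        defederate("badserver.example", "refuses to moderate illegal content")

    Whether you silence or fully suspend is a per-instance policy call; "suspend" is the full defederation described above, while "silence" only hides the server from public timelines.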

    Moderation is not about stamping out the existence of these things; it’s about protecting your users from them.

    If they’re not willing to take action against this material on their servers, then the only further step is reporting it to the authorities or to the court of public opinion.