Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across over 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because CSAM had been posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • Bendavisunlv6@lemmynsfw.com · 1 year ago

This is one of the things I don’t like about the whole Twitter format: there’s no moderator layer. Every Lemmy community must be created by a moderator, and that mod can be held accountable.

There isn’t even a concept of communities on Twitter / Mastodon. Hashtags? Nobody is responsible for monitoring them, and anyone can improvise new ones at will. It really is just the instance and its zillion users with nothing in between. Imagine a Lemmy instance admin being responsible for all of the moderation by themselves… it would never work.
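
    A minimal TypeScript sketch of the structural difference being described; the interfaces and the routing helper below are illustrative assumptions, not the actual Lemmy or Mastodon schemas:

    ```typescript
    interface Post {
      author: string;
      body: string;
    }

    // Lemmy-style: an explicit community layer sits between the instance
    // and its posts, and every community has named, accountable moderators.
    interface LemmyCommunity {
      name: string;
      moderators: string[]; // someone is always answerable for this space
      posts: Post[];
    }

    interface LemmyInstance {
      admins: string[];
      communities: LemmyCommunity[];
    }

    // Mastodon/Twitter-style: posts hang directly off the instance and are
    // grouped only by free-form hashtags, which nobody owns or monitors.
    interface MastodonInstance {
      admins: string[]; // the only moderation layer for every post
      posts: (Post & { hashtags: string[] })[];
    }

    // Hypothetical report routing: on a Lemmy-style instance a report can
    // go to the community's moderators first; on a Mastodon-style instance
    // it can only go straight to the instance admins.
    function routeReport(
      instanceAdmins: string[],
      community?: LemmyCommunity,
    ): string[] {
      return community?.moderators ?? instanceAdmins;
    }
    ```

    The point the types make concrete: in the flat model, every report and every post funnels to the same small set of admins, while the community layer distributes that load across people with a stake in each space.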