corb3t@lemmy.world to Technology@lemmy.ml · edited 2 years ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
cross-posted to: [email protected], [email protected], [email protected], [email protected]
redcalcium@lemmy.institute · 2 years ago
If you run your instance behind Cloudflare, you can enable the CSAM scanning tool, which can automatically block known CSAM and report it to the authorities if it's uploaded to your server. This should reduce your risk as the instance operator.
https://developers.cloudflare.com/cache/reference/csam-scanning/
Arotrios@kbin.social · 2 years ago
Sweet - thanks - that's a brilliant tool. Bookmarked.