corb3t@lemmy.world to Technology@lemmy.ml · edited · 1 year ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
160 comments · cross-posted to: [email protected], [email protected], [email protected], [email protected]
redcalcium@lemmy.institute · 1 year ago
If you run your instance behind Cloudflare, you can enable the CSAM scanning tool, which automatically blocks known CSAM and reports it to the authorities if it's uploaded to your server. This should reduce your risk as the instance operator.
https://developers.cloudflare.com/cache/reference/csam-scanning/
Arotrios@kbin.social · 1 year ago
Sweet - thanks - that's a brilliant tool. Bookmarked.