This is entirely the fault of the IWF and Microsoft, who create “exclusive” proprietary CSAM prevention software and then only license big tech companies to use it.
https://gleasonator.com/objects/bf56ad41-7168-4db9-be17-23b7e5e08991
It totally looks to me like Big Tech is gonna try to leverage CSAM prevention against the Fediverse. “Oh you want to prevent sex crimes against CHILDREN? Sure, but only on our proprietary services because we’re certainly not gonna fight CP for FREE!”
Putting the blame on Microsoft or the IWF misses the point.
People were responsible for moderating what showed up on their forums or servers for years before these tools existed, and they have kept doing so since. Neither the tool nor its absence is responsible for child porn getting posted to Fediverse instances. If those shards won’t take action against CSAM now, what good will the tool do? We can’t run it here and have it go delete content from someone else’s box.
While those tools would make some enforcement significantly easier, the fact that enforcement isn’t meaningfully happening on every instance is not something we can pin on Microsoft.
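To make the “we can’t run it here and have it act on someone else’s box” point concrete, here is a minimal sketch of what such a tool amounts to in practice: a check of locally stored media against a hash list the admin holds. Everything in it is an assumption for illustration; real systems like PhotoDNA use perceptual hashing and licensed hash lists, not plain SHA-256, and the names below are hypothetical.

```python
# Sketch only: checking uploaded media on *this* instance against a locally
# held set of known-bad hashes. SHA-256 is used purely to keep the example
# self-contained; production tools rely on perceptual hashes instead.
import hashlib
from pathlib import Path

# Hypothetical hash list an instance admin has been licensed to hold.
KNOWN_BAD_SHA256: set[str] = {
    # entries would come from the licensed list
}

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def check_upload(path: Path) -> bool:
    """Return True if the file matches the local hash list.

    The check can only run against media stored on this instance;
    it has no reach into content hosted on someone else's server.
    """
    return sha256_of(path) in KNOWN_BAD_SHA256
```

However the matching works, the scan only ever sees files the instance itself stores, which is why an instance that won’t act on reports gains nothing from having the tool available.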
The same people who are mad at Meta for scraping already-public information are now mad at Microsoft for not forcing itself into the fedi to scan all private and public content? Consistent viewpoints are hard!