The policy change follows years of politically convenient pivoting by Meta and its chief executive, Mark Zuckerberg, toward President Donald Trump and his base. Following Trump’s second electoral victory, Meta quickly changed its speech rules to allow anti-transgender slurs and dehumanization of immigrants, The Intercept previously reported, aligning the company with longtime MAGA culture war grievances.

Asked about the new restrictions on the word “antifa,” Meta spokesperson Erica Sackin pointed to a March transparency report that noted the company would “remove QAnon and Antifa content when combined with content-level threat signals.” The report does not explain what those signals are. Meta did not respond when asked if the company had discussed its antifa speech rules with the Trump administration.

Meta largely outsources the enforcement of its Community Standards rules to low-paid contractors whose interpretation and application of the policies can vary. The company’s automated, algorithmic content moderation systems are also famously glitchy. This combination can result in erratic censorship, particularly when political ideology is classified as violent or terroristic.

  • XLE@piefed.social
    6 days ago

    tl;dr this new policy is dumb and bad. References to specific things like Antifa make no logical sense as grounds for censorship, because there were already reasonable rules in place to handle actual problems.

    In some cases, the content that Meta considers a threat signal is commonsensical. If, for instance, a user mentions bringing a weapon to an event, the company flags it as a threat signal. But in other cases, Meta’s process for identifying threat signals is more vague. Under the new rules, Meta might trigger a threat signal when a user posts a “visual depiction of a weapon,” a “reference to arson, theft, or vandalism,” or “military language,” if accompanied by the word “antifa.”

    If “antifa” is mentioned in the context of “references to historical or recent incidents of violence” — a category so sprawling that it includes “historic wars” and “battles” — that post will also be penalized. Should Meta apply this rule as written, the company could, for instance, restrict posts comparing the antifascist nature of World War II to the contemporary antifa movement.

    It’s difficult to believe any intellectual discussion would happen on Facebook, but this rule further cements the suppression of it.

    • waddle_dee@lemmy.world
      6 days ago

      tbf, there’s not a lot of intelligent discussion on most, if not all, large social media platforms. tbh, the internet is kind of a cesspool. it’s kind of funny to grow up with the internet and slowly realize that forums are still where it’s at. i miss my Tapatalk days. flame wars were rare, but funny at times. most people were kind, because everybody kept discussions to only the topic at hand. even on an aggregator like Lemmy, or a microblog like Mastodon, you have way too many people with way too many opinions to have any hope of a rational dialogue…most of the time. although I’ve had more luck with dialogue on Lemmy, there’s still a bunch of grossness in the all. but that’s the nature of it all, you have to curate it. i don’t think humans are meant to see everything, everywhere, all at once. it’s very overloading.