Meta is adding Threads to the purview of the Oversight Board, the independent body that reviews content moderation decisions on Facebook, Instagram, and now Threads.
Instagram Threads, the company’s newest platform, is now covered by Meta’s Oversight Board. Created to serve as an independent appeals body, the Board has ruled on matters including Facebook’s suspension of Donald Trump, COVID-19 misinformation, the removal of breast cancer awareness images, and more.
The Board has now begun considering cases from Threads, Meta’s rival to Twitter/X.
This is a key distinction between Threads and competitors such as X, where owner Elon Musk relies largely on Community Notes’ crowdsourced fact-checks to supplement otherwise lax moderation. It also differs significantly from how decentralized alternatives like Bluesky and Mastodon handle moderation on their networks. Thanks to decentralization, community members can run their own servers with their own moderation policies, and they can choose to defederate from other servers whose content violates their standards.
Bluesky is also investing in stackable moderation, which lets community members create and run their own moderation services; these can be combined with others to deliver a personalized user experience.
Meta’s Oversight Board was meant to address the company’s centralized power over content moderation by shifting decision-making authority away from the firm and its CEO, Mark Zuckerberg, to an independent body. But as these companies demonstrate, other approaches exist that give users more control over what they see without infringing on others’ right to do the same.
In any case, the Oversight Board announced on Thursday that it has taken up its first case involving Threads.
In this case, a user replied to a post containing a screenshot of a recent article in which Japanese Prime Minister Fumio Kishida addressed his party’s alleged underreporting of fundraising income. The post also featured a disparaging caption accusing him of tax evasion and telling him to “drop dead,” and it mocked people who wear glasses. Though the post reads much like a typical X post these days, a human reviewer at Meta determined it violated the company’s Violence and Incitement policy because of the “drop dead” language and hashtags calling for death. After their appeal was denied twice, the user brought the case to the Board.
The Board says it chose this case to examine Meta’s content moderation and enforcement practices around political content on Threads. Given that it’s election season and Meta has said it won’t be proactively recommending political content on Instagram or Threads, that is a sensible step.
Though it won’t be the last, this is the Board’s first case involving Threads. Tomorrow, the group is set to announce a new batch of cases centered on criminal allegations based on nationality. The Board received those cases as referrals from Meta, but it will also consider appeals from Threads users, as it did in the case involving Prime Minister Kishida.
The Board’s decisions will shape Threads’ policies around users’ freedom of expression and whether Threads moderates content more strictly than Twitter/X. That, in turn, will help shape public perception of the platforms and sway users’ decisions to choose one over the other, or over a rival experimenting with more personalized approaches to content moderation.