Recently, Meta, the parent company of Facebook, has cracked down on posts, accounts, and pages that violate the Facebook Community Standards. This is especially critical in the Philippines as the national elections approach. With heated exchanges flooding the platform, many users have received warnings or had their accounts suspended. Even high-profile personalities aren’t exempt. The latest is lawyer Vic Rodriguez, the spokesperson and chief of staff of presidential candidate Ferdinand ‘Bongbong’ Marcos Jr.
Vic Rodriguez’s Facebook account gets suspended
LOOK: Facebook has suspended the account of Bongbong Marcos’ chief of staff and spokesman, Atty. Vic Rodriguez. | via @InaReformina pic.twitter.com/AXd7kdzOYA
— ABS-CBN News (@ABSCBNNews) April 26, 2022
On April 26, Rodriguez shared screenshots of his personal Facebook account getting suspended by Facebook. According to the screenshots, Facebook said that Rodriguez’s “account, or activity on it, doesn’t follow our Community Standards.”
He was given 30 days to submit an appeal and disagree with the decision.
Is this censorship?
Statement of Atty. Vic Rodriguez: pic.twitter.com/xBsUmq0wux
— sandra aguinaldo (@sandraguinaldo) April 26, 2022
In a statement, Rodriguez claimed, “FB/Meta suspended my account because I am for Bongbong Marcos.” He said he would not be filing an appeal and would instead continue to communicate on other platforms.
Rodriguez further claimed that this suspension of his account “is censorship of the highest degree and interference on a sovereign act, digital terrorism no less.”
On the evening of the same day, his account was restored. According to ABS-CBN, a Meta representative said that Rodriguez’s account was “mistakenly restricted for reasons unrelated to any posted content.”
So how can you recover your account?
If you ever find yourself in a situation where your Facebook account is suspended for a violation, there are a few ways you can recover your account:
- Verify your identity and submit a valid ID. Facebook won’t display your ID anywhere on the platform and will only use it for verification purposes.
- Submit an appeal. This is usually prompted upon notification of suspension, but you can also head to the Facebook Help Center.
But of course, getting the account unsuspended still largely depends on what violation was committed and how Facebook responds to your request.
What should you do after recovering your account?
Once you regain access to your account, thoroughly review Facebook’s Community Standards and familiarize yourself with them. Follow them, avoid posting any content that would violate them, and use your real name on your account. If you were able to pinpoint the specific reason behind the suspension, stop doing whatever triggered it.
What are the possible reasons for an account’s suspension?
Facebook has an extensive list of rules and regulations in its Community Standards. “The goal of our Community Standards is to create a place for expression and give people a voice,” reads the policy page.
From posting copyright-protected or inappropriate content to impersonating someone else and harassing other users, there are many reasons a Facebook account could get suspended. The bottom line is that accounts can be suspended for violating any part of Facebook’s policies. The policies are divided into five main categories, each clearly enumerating content that may be issued a warning, content that may be restricted, and content that is not allowed. See them below:
- Violence and criminal behavior — Violence and Incitement, Dangerous Individuals and Organizations, Coordinating Harm and Promoting Crime, Restricted Goods and Services, Fraud and Deception
- Safety — Suicide and Self-Injury, Child Sexual Exploitation, Abuse and Nudity, Adult Sexual Exploitation, Bullying and Harassment, Human Exploitation, Privacy Violations
- Objectionable content — Hate Speech, Violent and Graphic Content, Adult Nudity and Sexual Activity, Sexual Solicitation
- Integrity and inauthentic behavior — Account Integrity and Authentic Identity, Spam, Cybersecurity, Inauthentic Behavior, Misinformation, Memorialization
- Respecting intellectual property
This information is readily available at Meta’s Transparency Center.
What about misinformation?
According to Meta’s policy page, the company handles misinformation differently from its other policies. Why? A single blanket rule would simply be too difficult to enforce, as the platform “[doesn’t] have perfect access to information”.
Instead, Meta categorizes the kinds of misinformation and tackles each differently.
Meta immediately removes misinformation that is likely to directly contribute to the risk of imminent physical harm or interference with the functioning of political processes, as well as certain highly deceptive manipulated media. Meta partners with third-party experts, such as local organizations with a presence on the ground, to assess the truth of the content and determine whether it falls under the aforementioned categories.
For grey areas such as humor or satire, Meta focuses on minimizing the prevalence of such content, working with third-party fact-checkers to review and verify it.
As the election approaches, Meta has cracked down further on accounts and content that violate its Community Standards in an effort to prevent interference, combat misinformation, and increase transparency. In the Philippines, it works with the Commission on Elections, election watchdogs, independent fact-checkers, and civil society organizations to do so.
Its three-part strategy to Remove, Reduce, and Inform includes working with third-party fact-checking partners certified by the non-partisan International Fact-Checking Network to identify content as true or false.
How do Meta and Facebook detect these violations?
Meta uses artificial intelligence (AI) to detect violations, often before content is even posted and seen by others on the platform. These AI systems are trained to perform tasks such as recognizing what’s in a photo or understanding text, in order to flag potential violations. This is backed by human decisions: review teams make the final call until an AI can reliably recognize a given type of violation through repeated exposure. These teams comprise thousands of reviewers around the world, who look at both content and context to determine whether a violation has occurred.
You can read further about how Meta detects violations and enforces its Community Standards in Meta’s Transparency Center.
Do you agree with Facebook’s Community Standards?