Sometime in October, Facebook decided I was promoting extremist content.
It was a Times of Malta editorial that did it - a piece about the Israel-Palestine conflict and the killing of Hezbollah’s leader, Hassan Nasrallah.
Facebook’s algorithm said the post “shared or sent symbols, praise or support of people and organizations we define as dangerous”. It did not elaborate.
A few days later, Facebook’s terrorism radar pinged once again.
This time, the culprit was an article provided by international news wire service AFP. It, too, was about Palestine and Hezbollah. It, too, was flagged as dangerous.
Like a shark smelling blood, the algorithm began circling my digital heels.
A third article about the Middle Eastern conflict – also provided by AFP – earned me another Facebook strike.
And when I shared an interview with an HHC addict warning people about the dangers of the drug, Facebook told me I was trying to “buy, sell, exchange or promote cannabis”.
By early November, I had crossed Facebook's Rubicon. A pop-up informed me, in white lettering on a bright red background, that my account was now “restricted”.
Now you might be asking yourself: who is this writer, and why should I care if he’s having trouble with Facebook?
You would be right to wonder, were it not for one detail.
The articles flagged by Facebook’s algorithm were all ones I shared on behalf of Times of Malta. And when my Facebook account was restricted, Times of Malta’s was, too.
Facebook told us that it would be displaying Times of Malta articles to fewer people. It threatened to “unpublish” our page. Our commercial team was blocked from running paid partnership adverts.
Things got so bad that we had to self-censor. We stopped posting articles about Israel, Palestine or Lebanon on Facebook.
Over the past month, we have tried to contact customer support at Facebook’s parent company Meta by phone, email and Messenger. Most requests have gone ignored. The best answer we can get is that we need to “wait”. It’s been six weeks and we’re still waiting.
We are by no means an isolated case: Meta has a well-documented history of censoring posts about Palestine. Last year, a Human Rights Watch report found evidence that the company was doing so systematically, and at a global level.
Even humour isn’t safe. Local political satire page Bis-Serjeta’ has had multiple posts removed because the algorithm decided the page “tried to get like, follows, shares or video views in a misleading way”, whatever that means.
Perhaps satire is still a bridge too far for the algorithms. Maybe the censorship flagged by Human Rights Watch was of new, anonymous accounts posting unverified information.
But content written by AFP – one of the world’s most respected news wire services – and published by Malta’s largest news company falls into neither of those categories. So, on what grounds is it getting deleted?
Of course, algorithms will make mistakes. What is inexcusable is that the humans in charge of those machines do nothing to fix their errors. It’s not that the company is incapable of doing so: when Facebook noticed a problem with one of our payment methods, its customer support team fixed the issue in one day.
When Elon Musk bought X, one of the first things he did was to shut down its media enquiries arm and automatically reply to media questions with a poop emoji. Mark Zuckerberg and his behemoth of a company (Meta’s Q3 2024 revenues: $40.6 billion) are more subtle, but the substance is not worlds apart.
Social media platforms have already poached most of the advertising that media companies relied on to survive. Not content with stealing our supper, now they seem intent on kicking us out of the house.
Legislators are trying to patch this imbalance: the European Media Freedom Act will require social media giants to promptly handle complaints from media companies and to publish information about how much content they are suppressing, and why.
The problem is that companies and governments operate on wildly different timeframes. The Media Freedom Act took years to draft and will only come into effect next August. Every comma was negotiated.
Companies like Meta and X move much quicker and with less accountability. We don’t know how their algorithms work, when they change or what they are programmed to favour. And if users don’t like it, they don’t really have anywhere else to go.
It is too much power, concentrated in too small a set of hands with too little accountability.
We are still waiting for Meta’s customer service team to reply to our requests to review the restrictions placed on our Facebook page. In the meantime, we continue to self-censor while posting on Facebook.
These platforms control what we see and hear, and they do it with a mandate to focus on their own bottom lines. At what point should the public interest trump that of their shareholders?