Facebook is too big to tame or kill


Facebook is having a difficult time of it of late. It has weathered storms before — most notably that whole undermining democracy snafu — but this time it’s clearly pretty shaken, and doesn’t know exactly how it should react. 

It tried distraction by changing its name and dazzling the world with its vision for a dystopian future where we all exist in virtual reality. It tried backtracking on the super creepy facial recognition technology it’s working on. It tried discrediting the whistleblowers in the hope that they might go away. 

It’s even resorted to rolling out Nick Clegg, the company’s head of global affairs, to bat on its behalf — baffling for those of us in the UK who watched him torch his reputation as honest and straight-talking as soon as he got into government.

While whistleblower Frances Haugen was calling for Mark Zuckerberg to quit so the business could change, Clegg was insisting that most of what appears on the social network is user-generated “barbecues and bar mitzvahs.” 

He went on to reiterate what the company had recently put out elsewhere in a punchily worded press statement. “The argument that we deliberately push content that makes people angry for profit is deeply illogical,” the statement read. “We are on track to spend more than $5bn on safety and security in 2021 – more than any other tech company – and have 40,000 people to do one job: keep people safe on our apps.”

Safety in numbers

Forty thousand people employed for safety sounds like a big number, because it is. Facebook is far bigger than any of its rivals, which is why it has to spend more. But you shouldn’t be placated by that, because it’s another example of Facebook using numbers to bamboozle and obfuscate.

Nearly three billion people use Facebook — over a third of the world’s population. That means that there’s roughly one moderator for every 75,000 users (that’s actually being generous and assuming all 40,000 are stationed on Facebook, rather than Instagram, but you get the idea). 

To put that figure into perspective, according to the most recent census data, there are 9,831 police officers in New Zealand, meaning that there’s one cop for every 509 people. Or, to put it as reductively as Facebook’s PR might do if the stats were reversed, New Zealand cares 14,635% more about the safety of its population (or “userbase”) than Meta does. 
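
For anyone who wants to check the working, here’s the back-of-envelope arithmetic as a short Python sketch, using only the figures quoted above:

# Back-of-envelope check on the moderation maths above.
# All figures are the ones quoted in the article.

facebook_users = 3_000_000_000      # "nearly three billion people use Facebook"
safety_staff = 40_000               # Meta's quoted safety and security headcount
users_per_moderator = facebook_users / safety_staff   # 75,000

people_per_officer = 509            # one New Zealand police officer per 509 people

# The reductive "X% more" framing: how much better NZ's ratio is than Meta's
difference_pct = (users_per_moderator / people_per_officer - 1) * 100

print(f"One moderator per {users_per_moderator:,.0f} users")            # 75,000
print(f"Roughly {difference_pct:,.0f}% more coverage in New Zealand")   # ~14,635%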

And that’s before you look into the dubious consistency and effectiveness of Facebook’s moderation guidelines. “Don’t overthink” is apparently part of the moderation handbook, and each moderator has to make a snap decision every eight to ten seconds over gruelling multi-hour shifts. Many of these moderators are based overseas, where English isn’t a first language and subtleties can be overlooked. The best (or “most effective”) trolls, after all, know how to stay just on the right side of the rules, while still whipping up mayhem.

Pull too hard on this loose logical thread, and the whole comforting sweater falls apart: you can see why Facebook is keener to shout reassuringly large-sounding numbers than to dwell on the ratios behind them. 

A problem that’s impossible to scale


On some level, I suspect the top brass at Facebook realise that this simply isn’t a problem that can be solved, and that it only gets harder the larger the social network grows. Say 0.3% of your social network are dangerous psychopaths. That’s not really a problem if your site is home to 100,000 people, as it means you’ve got 300 wrong’uns to deal with. Scale it up to three billion, though, and 0.3% is suddenly nine million people.

Would doubling the number of moderators make a difference? Unlikely. Would quadrupling it? By that point you’ve moved up from “derisory” to “feeble” on the ratio of moderators to users: one for every 18,750. But that change would come at a cost of roughly $20bn a year, assuming spending scales with headcount — and then you’re dangerously close to wiping out the $29.14bn the company made in net income for 2020. You can see why the company would want to wave the magic wand labelled “AI” in the hope that it’ll all just go away, but that’s at best hopelessly optimistic.
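
To make the scaling problem concrete, here’s the same back-of-envelope arithmetic as a quick Python sketch. The only assumption beyond the figures quoted above is that safety spending scales roughly in line with moderator headcount, which is the premise behind the $20bn figure:

# Why the problem scales badly: the same rate and spend, run at Facebook's size.
# Assumes safety spend grows roughly linearly with moderator headcount.

bad_actor_rate = 0.003                       # an illustrative 0.3%
small_site = 100_000
facebook_scale = 3_000_000_000

print(f"{small_site * bad_actor_rate:,.0f} bad actors on a small site")   # 300
print(f"{facebook_scale * bad_actor_rate:,.0f} at Facebook's scale")      # 9,000,000

safety_spend = 5_000_000_000                 # the quoted $5bn a year on safety
quadrupled_spend = safety_spend * 4          # roughly $20bn a year
users_per_moderator = facebook_scale / (40_000 * 4)                       # 18,750

net_income_2020 = 29_140_000_000             # Facebook's 2020 net income
print(f"${(net_income_2020 - quadrupled_spend) / 1e9:.2f}bn of profit left")  # $9.14bn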

The truth, I suspect, is that Facebook will weather this storm in the same way it has every other one: delivering just enough platitudes about listening and change that the majority of users don’t leap from ambivalence to hostility. In the long run, demographic trends don’t look good for Facebook, but in the short term it’s too big to tame or kill, leaving it untouched as the fake-news-spewing, data-leaking echo chamber we all know and love today. If only Yahoo had killed it when it had the chance.