Viewpoint: Is Facebook evil, or just out of control?

Anyone who has started a business will know that as it grows, it becomes harder to manage. My second job after leaving uni – don’t get me started on the first – was with a small B2B marketing consultancy, first as a copywriter and then as an account manager. I was baffled when, after a year with the firm, the owner of the company said he wanted to put in a layer of middle management, with me as part of it, to protect the business as it grew. At that point, the company employed a grand total of six people, including the owner and his wife.

But he was right. The business eventually grew to around 20 people, I had a small team under me, and I could deal with some of their day-to-day questions, leaving the big man to get on with his speciality, which was getting the business in and macro-managing things.

Now I’m sure that when Mark Zuckerberg started Facebook, he hoped it would be a success and that he would make money from it. I’m equally sure that if you had told him it would become the behemoth it is today, with just shy of a third of the world’s population on the platform every month, he would have laughed you out of the room.

A year or so ago, I read a book called Chaos Monkeys, written by an ex-Facebook employee, Antonio Garcia Martinez. It purported to be an insider’s account of life ‘inside the Silicon Valley money machine’. In fact, a large part of it was about the author’s experiences working at Facebook. He wrote disparagingly about the chaotic way the company was managed. But it was all a bit rock & roll: the money was coming in, so who cared if the management of the firm wasn’t all it could be?

Fast forward to the present day, however, and things look a lot more serious. In this week’s Channel 4 Dispatches documentary, one of the programme’s reporters went undercover at CPL Resources, an Irish firm subcontracted to moderate the content that can and cannot appear on Facebook.

The documentary was shocking on a couple of fronts. The first was some of the content, including footage of children being beaten, that Facebook allows to remain on the platform. The second was the open admission from one of the moderators that “it’s all about the money”. A subset of this was the reference to certain accounts belonging to extreme organisations that were, in Facebook’s terminology, “shielded”, because they have a lot of followers, and followers equals engagement equals ad revenues. This was an accusation that Richard Allan, Facebook’s vice president for global policy solutions, denied in an interview on the programme.

But perhaps the most revealing part of the documentary was the lid it lifted on what Facebook’s content moderation system actually looks like. One, it’s outsourced, at least in part, to companies like the one in the documentary. There was no reference in the programme to the skills required to land a job as a Facebook content moderator, so that remains an open question.

Two, there has been a lot of talk from the tech giants about the role that AI and machine learning have to play in identifying unsuitable content and removing it from their platforms, given the huge amounts of data involved. There wasn’t much evidence of this in the documentary. It looked more like regular human beings looking at a screen and making a call on each piece of content. About as convincing as the VAR (Video Assistant Referee) system used at the World Cup over the last few weeks.

And three, when defending the company against the accusations made in the documentary earlier this week, Facebook’s vice president of global policy management, Monika Bickert, made the point that the company is “doubling the number of people working on our safety and security teams this year to 20,000. This includes over 7,500 content reviewers.”

As the Channel 4 documentary made clear in highlighting the backlog of cases the content reviewers have to deal with, 7,500 is not enough. If nothing else, Facebook should now realise that the number might need to be closer to 15,000, or 30,000, or even 75,000. For pity’s sake, the company generated $40bn of revenue and $15bn of profit last year; it’s not as if it can’t afford to invest more in content moderation.

This, I believe, is the crux of the matter. Every time Facebook is called out, it says it knows it needs to do more and accepts what its responsibilities are. I have no problem with the company making shedloads of money, but when its platform has become such an integral part of so many people’s everyday lives, it has a duty to police it properly. Whatever it costs.